Wednesday, December 26, 2012

The Beginning of the World

Last Friday was, as I’m sure most of my readers noticed, an ordinary day. Here in the north central Appalachians, it was chilly but not unseasonably so, with high gray clouds overhead and a lively wind setting the dead leaves aswirl; wrens and sparrows hopped here and there in my garden, poking among the recently turned soil of the beds. No cataclysmic earth changes, alien landings, returning messiahs, or vast leaps of consciousness disturbed their foraging. They neither knew nor cared that one of the great apocalyptic delusions of modern times was reaching its inevitable end around them.

The inimitable Dr. Rita Louise, on whose radio talk show I spent a couple of hours on Friday, may have summed it up best when she wished her listeners a happy Mayan Fools Day.  Not that the ancient Mayans themselves were fools, far from it, but then they had precisely nothing to do with the competing fantasies of doom and universal enlightenment that spent the last decade and more buzzing like flies around last Friday’s date.

It’s worth taking a look back over the genesis of the 2012 hysteria, if only because we’re certain to see plenty of reruns in the years ahead. In the first half of the 20th century, as archeologists learned to read dates in the Mayan Long Count calendar, it became clear that one of the major cycles of the old Mayan timekeeping system would roll over on that day.  By the 1970s, that detail found its way into alternative culture in the United States, setting off the first tentative speculations about a 2012 apocalypse, notably drug guru Terence McKenna’s quirky "Timewave Zero" theory.

It was the late New Age promoter Jose Arguelles, though, who launched the 2012 fad on its way with his 1984 book The Mayan Factor and a series of sequels, proclaiming that the rollover of the Mayan calendar in 2012 marked the imminent transformation of human consciousness that the New Age movement was predicting so enthusiastically back then.  The exactness of the date made an intriguing contrast with the vagueness of Arguelles’ predictions about it, and this contrast left ample room for other authors in the same field to jump on the bandwagon and redefine the prophecy to fit whatever their own eschatological preferences happened to be.  This they promptly did.

Early on, 2012 faced plenty of competition from alternative dates for the great transformation.  The year 2000 had been a great favorite for a century, and became 2012’s most important rival, but it came and went without bringing anything more interesting than another round of sordid business as usual.  Thereafter, 2012 reigned supreme, and became the center of a frenzy of anticipation that was at least as much about marketing as anything else.  I can testify from my own experience that for a while there, late in the last decade, if you wanted to write a book about anything even vaguely tangential to New Age subjects and couldn’t give it a 2012 spin, many publishers simply weren’t interested.

So the predictions piled up.  The fact that no two of them predicted the same thing did nothing to weaken the mass appeal of the date.  Neither did the fact, which became increasingly clear as the last months of 2012 approached, that a great many people who talked endlessly about the wonderful or terrible things that were about to happen weren’t acting as though they believed a word of it.  That was by and large as true of the New Age writers and pundits who fed the hysteria as it was of their readers and audiences; I long ago lost track of the number of 2012 prophets who, aside from scheduling a holiday trip to the Yucatan or some other fashionable spot for the big day, acted in all respects as though they expected the world to keep going in its current manner straight into 2013 and beyond. 

That came as a surprise to me.  Regular readers may recall my earlier speculation that 2012 would see scenes reminiscent of the "Great Disappointment" of 1844, with crowds of true believers standing on hilltops waiting for their first glimpse of alien spacecraft descending from heaven or what have you. Instead, in the last months of this year, some of the writers and pundits most deeply involved in the 2012 hysteria started claiming that, well, actually, December 21st wasn’t going to be the day everything changed; it would, ahem, usher in a period of transition of undefined length during which everything would sooner or later get around to changing.  The closer last Friday came, the more evasive the predictions became, and Mayan Fools Day and its aftermath were notable for the near-total silence that spread across the apocalyptic end of the blogosphere. Say what you will about Harold Camping, at least he had the courage to go on the air after his May prophecy flopped and admit that he must have gotten his math wrong somewhere.

Now of course Camping went on at once to propose a new date for the Rapture, which flopped with equal inevitability a few months later.  It’s a foregone conclusion that some of the 2012 prophets will do the same thing shortly, if only to kick the apocalypse marketing machine back into gear.  It’s entirely possible that they’ll succeed in setting off a new frenzy for some other date, because the social forces that make apocalyptic fantasies so tempting to believe just now have not lost any of their potency.

The most important of those forces, as I’ve argued in previous posts, is the widening mismatch between the fantasy of entitlement that has metastasized through contemporary American society, on the one hand, and the ending of an age of fossil-fueled imperial extravagance on the other. As the United States goes bankrupt trying to maintain its global empire, and industrial civilization as a whole slides down the far side of a dizzying range of depletion curves, it’s becoming harder by the day for Americans to make believe that the old saws of upward mobility and an ever brighter future have any relevance to their own lives—and yet those beliefs are central to the psychology, the self-image, and the worldview of most Americans.  The resulting cognitive dissonance is hard to bear, and apocalyptic fantasies offer a convenient way out.  They promise that the world will change, so that the believers don’t have to.

That same frantic desire to ignore the arrival of inescapable change pervades today’s cultural scene, even in those subcultures that insist most loudly that change is what they want.  In recent months, to cite only one example, nearly every person who’s mentioned to me the claim that climate change could make the Earth uninhabitable has gone on to ask, often in so many words, "So why should I consume less now?"  The overt logic here is usually that individual action can’t possibly be enough.  Whether or not that’s true is anyone’s guess, but cutting your own carbon footprint actually does something, which is more than can be said for sitting around enjoying a standard industrial world lifestyle while waiting for that imaginary Kum Ba Ya moment when everyone else in the world will embrace limits not even the most ardent climate change activists are willing to accept themselves.

Another example? Consider the rhetoric of elite privilege that clusters around the otherwise inoffensive label "1%."  That rhetoric plays plenty of roles in today’s society, but one of them pops up reliably any time I talk about using less.  Why, people ask me in angry tones, should they give up their cars when the absurdly rich are enjoying gigantic luxury yachts?  Now of course we could have a conversation about the total contribution to global warming of cars owned by people who aren’t rich, compared to that of the fairly small number of top-end luxury yachts that usually figure in such arguments, but there’s another point that needs to be raised. None of the people who make this argument to me have any control over whether rich people have luxury yachts. All of them have a great deal of control over whether and how often they themselves use cars. Blaming the global ecological crisis on the very rich thus functions, in practice, as one more way to evade the necessity of unwelcome change.

Along these same lines, dear reader, as you surf the peak oil and climate change blogosphere and read the various opinions on display there, I’d encourage you to ask yourself what those opinions amount to in actual practice.  A remarkably large fraction of them, straight across the political landscape from furthest left to furthest right and including all stops in between, add up to demands that somebody else, somewhere else, do something. Since the people making such demands rarely do anything to pressure, or even to encourage, those other people elsewhere to do whatever it is they’re supposed to do, it’s not exactly hard to do the math and recognize that here again, these opinions amount to so many ways of insisting that the people holding them don’t have to give up the extravagant and unsustainable lifestyles most people in the industrial world think of as normal and justifiable. 

There’s another way to make the same point, which is that most of what you’ll see being proposed in the peak oil and climate change blogosphere has been proposed over and over and over again already, without the least impact on our predicament. From the protest marches and the petitions, through the latest round of grand plans for energy futures destined to sit on the shelves cheek by jowl with the last round, right up to this week’s flurry of buoyantly optimistic blog posts lauding any technofix you care to name from cold fusion and algal biodiesel to shale gas and drill-baby-drill:  been there, done that, used the T-shirt to wipe another dozen endangered species off the face of the planet, and we’re still stuck in the same place.  The one thing next to nobody wants to talk about is the one thing that distinguished the largely successful environmental movement of the 1960s and 1970s from the largely futile environmental movement since that time, which is that activists in the earlier movement were willing to start the ball rolling by making the necessary changes in their own lives first.

The difficulty, of course, is that making these changes is precisely what many of today’s green activists are desperately trying to avoid. That’s understandable, since transitioning to a lifestyle that’s actually sustainable involves giving up many of the comforts, perks, and privileges central to the psychology and identity of people in modern industrial societies.  In today’s world of accelerating downward mobility, especially, the thought of taking any action that might result in being mistaken for the poor is something most Americans in particular can’t bear to contemplate—even when those same Americans recognize on some level that sooner or later, like it or not, they’re going to end up poor anyway.

Those of my readers who would like to see this last bit of irony focused to incandescence need only get some comfortably middle-class eco-liberal to start waxing lyrical about life in the sustainable world of the future, when we’ll all have to get by on a small fraction of our current resource base.  This is rarely difficult; I field such comments quite often, in which the speaker sketches out a rose-colored contrast between today’s comfortable but unsatisfying lifestyles and the more meaningful and fulfilling existence that will be ours in a future of honest hard work in harmony with nature.  Wait until your target is in full spate, and then point out that he could embrace that more meaningful and fulfilling lifestyle right now by the simple expedient of discarding the comforts and privileges that stand in the way.  You’ll get to watch backpedaling on a heroic scale, accompanied by a flurry of excuses meant to justify your target’s continued dependence on the very comforts and privileges he was belittling a few moments before.

What makes the irony perfect is that, by and large, the people whom you’ll hear criticizing the modern lifestyles they themselves aren’t willing to renounce aren’t just mouthing verbal noises. They realize, many of them, that the lifestyles that industrial societies provide even to their more privileged inmates are barren of meaning and value, that the pursuit and consumption of an endless series of increasingly shoddy manufactured products is a very poor substitute for a life well lived, and that stepping outside the narrowing walls of a world defined by the perks of the consumer economy is the first step toward a more meaningful existence.  They know this; what they lack, by and large, is the courage to act on that knowledge, and so they wander the beach like J. Alfred Prufrock in Eliot’s poem, letting the very last inch or so of the waves splash over their feet—the bottoms of their trousers rolled up carefully, to be sure, to keep them from getting wet—when they know that a running leap into the green and foaming water is the one thing that can save them. Thus it’s not surprising that their daydreams cluster around imaginary tidal waves that will come rolling in from the deep ocean to sweep them away and make the whole question moot.

This is why it’s as certain as anything can be that within a year or so at most, a good many of the people who spent the last decade or so talking endlessly about last Friday will have some other date lined up for the end of the world, and will talk about it just as incessantly.  It’s that or face up to the fact that the only way to live up to the ideals they think they espouse is to walk straight toward the thing they most fear, which is the loss of the perks and privileges and comforts that define their identity—an identity many of them hate, but still can’t imagine doing without.

Meanwhile, of course, the economy, the infrastructure, and the resource flows that make those perks and privileges and comforts possible are coming apart around them.  There’s a great deal of wry amusement to be gained from watching one imaginary cataclysm after another seize the imagination of the peak oil scene or society as a whole, while the thing people think they’re talking about—the collapse of industrial civilization—has been unfolding all around them for several years now, in exactly the way that real collapses of real civilizations happen in the real world.

Look around you, dear reader, as the economy stumbles through another round of contraction papered over with increasingly desperate fiscal gimmicks, the political system of your country moves ever deeper into dysfunction, jobs and livelihoods go away forever, whatever social safety net you’re used to having comes apart, towns and neighborhoods devastated by natural disasters are abandoned rather than being rebuilt, and the basic services that once defined a modern society stop being available to a larger and larger fraction of the people of the industrial world.  This is what collapse looks like.  This is what people in the crumbling Roman Empire and all those other extinct civilizations saw when they looked out the window.  To those in the middle of the process, as I’ve discussed in previous posts, it seems slow, but future generations with the benefit of hindsight will shake their heads in wonder at how fast industrial civilization went to pieces.

I commented in a post at the start of this year that the then-current round of fast-collapse predictions—the same predictions, mind you, that had been retailed at the start of the year before, the year before that, and so on—were not only wrong, as of course they turned out to be, but missed the collapse that was already under way. The same point holds good for the identical predictions that will no doubt be retailed over the next few weeks, insisting that this is the year when the stock market will plunge to zero, the dollar and/or the Euro will lose all their value, the economy will seize up completely and leave the grocery shelves bare, and so on endlessly; or, for that matter, that this is the year when cold fusion or algal biodiesel or some other vaporware technology will save us, or the climate change Kum Ba Ya moment I mentioned earlier will get around to happening, or what have you. 

It’s as safe as a bet can be that none of these things will happen in 2013, either.  Here again, though, the prophecies in question are not so much wrong as irrelevant.  If you’re on a sinking ocean liner and the water’s rising fast belowdecks, it’s not exactly useful to get into heated debates with your fellow passengers about whether the ship is most likely to be vaporized by aliens or eaten by Godzilla.  In the same way, it’s a bit late to speculate about how industrial civilization will collapse, or how to prevent it from collapsing, when the collapse is already well under way.  What matters at that stage in the game is getting some sense of how the process will unfold, not in some abstract sense but in the uncomfortably specific sense of where you are, with what you have, in the days and weeks and months and years immediately ahead of you; that, and then deciding what you are going to do about it. 

With that in mind, dear reader, I’d like to ask you to do something right now, before going on to the paragraph after this one.  If you’re in the temperate or subarctic regions of the northern hemisphere, and you’re someplace where you can adjust the temperature, get up and go turn the thermostat down three degrees; if that makes the place too chilly for your tastes, take another moment or two to put on a sweater.  If you’re in a different place or a different situation, do something else simple to decrease the amount of energy you’re using at this moment.  Go ahead, do it now; I’ll wait for you here.

Have you done it?  If so, you’ve just accomplished something that all the apocalyptic fantasies, internet debates, and protest marches of the last two decades haven’t:  you’ve decreased, by however little, the amount of carbon dioxide going into the atmosphere. That sweater, or rather the act of putting it on instead of turning up the heat, has also made you just a little less dependent on fossil fuels. In both cases, to be sure, the change you’ve made is very small, but a small change is better than no change at all—and a small change that can be repeated, expanded, and turned into a stepping stone on the way to bigger changes is infinitely better than any amount of grand plans and words and handwaving that never quite manage to accomplish anything in the real world.

Turning down your thermostat, it’s been said repeatedly, isn’t going to save the world.  That’s quite true, though it’s equally true that the actions that have been pursued by climate change and peak oil activists to date don’t look particularly likely to save the world, either, and let’s not even talk about what wasn’t accomplished by all the wasted breath over last Friday’s nonevent.  That being the case, taking even the smallest practical steps in your own life and then proceeding from there will take you a good deal further than waiting for the mass movements that never happen, the new technologies that never pan out, or for that matter the next deus ex machina some canny marketer happens to pin onto another arbitrary date in the future, as a launching pad for the next round of apocalyptic hysteria.

Meanwhile, a world is ending.  The promoters of the 2012 industry got that right, though they missed just about everything else; the process has been under way for some years now, and it won’t reach its conclusion in our lifetimes, but what we may as well call the modern world is coming to an end around us.  The ancient Mayans knew, however, that the end of one world is always the beginning of another, and it’s an interesting detail of all the old Mesoamerican cosmological myths that the replacement for the old world doesn’t just pop into being.  Somebody has to take action to make the world begin. 

It’s a valid point, and one that can be applied to our present situation, when so many people are sitting around waiting for the end and so few seem to be willing to kickstart the beginning in the only way that matters—that is, by making actual changes in their own lives.  The deindustrial world of the future is poised to begin, but someone has to begin it.  Shall we?

Wednesday, December 19, 2012

Enacting Democracy

The recovery of reason, the theme of last week’s post here on The Archdruid Report, has implications that go well past the obvious. One of the examples that comes first to mind is also directly relevant to the theme of this series of posts, and it unfolds from an experience that many people have mentioned to me in recent years:  the inability of Americans with different beliefs to sit down and have a constructive conversation about their disagreements.

Those of my readers who have tried to do this any time recently, unless they were very lucky, will have found stalemate the all but inevitable outcome. Each side trots out its favorite talking points, most of them sound bites culled from popular media of one kind or another. When these fail to have the expected effect on the other side, both sides try again, with similar results, until finally one or both sides withdraw into frustration and hostility.

Though it’s unpopular these days to point this out, both sides in the current American culture wars follow this same wearily predictable pattern. Yes, I’m familiar with the recent flurry of liberal psychologists who insist that conservatives are just too irrational to accept what liberals see as self-evident truths; I don’t buy their claims, not least because I’ve watched liberals behave with exactly the same degree of illogic in parallel situations. The problem on both sides, as I see it, is the debasement of thinking discussed in last week’s post:  the malign transformation of our inner discourse into a set of arbitrary linkages between verbal noises and simple emotional reactions.  If a verbal noise produces warm fuzzy emotions in one person and cold prickly emotions in another, they are not going to be able to communicate unless both are able to get past that unthinking reaction—and getting past that unthinking reaction is something that very few Americans these days are able to do.

There’s another useful way to speak of the confusion of language in today’s America, and that’s to point out that nearly all our collective discourse has been reduced to phatic communication.  That seemingly exotic phrase describes a very familiar process:  the use of verbal noises to signal belonging and readiness for social interaction.  When two men sit down in a bar here in Cumberland, and one says to the other, “So, how about them Ravens?”—we’re halfway between Baltimore and Pittsburgh, so in football season it’s either that or “How about them Steelers?”—the question needn’t indicate any interest in the team in question.  Rather, it’s a formal way to acknowledge the other person’s presence and claim membership in a community.  In a different context, the question might be “Nice weather, isn’t it?” or some other equally vacant utterance. The form varies but the content—or more precisely the lack of content—remains identical.

Much of today’s political discourse serves exactly the same purpose: it signals readiness for social interaction and claims membership in a specific political subculture, and that’s basically all it does. The verbal noises that get used for phatic communication in that context vary even with fairly small shifts across the political landscape, but if you sit in on a discussion among people who more or less agree with each other’s politics, you can usually figure out pretty quickly what the relevant warm-fuzzy and cold-prickly phrases are, and once you’ve done that you can identify yourself either as a member of the community or as an outsider with a very few words.  It’s an experiment I recommend, partly for the entertainment value, and partly because there are few better ways to learn just how much of what passes for political thought these days is a set of essentially content-free signals meant to define the boundaries of a group.

It’s really quite remarkable to watch the range of things that get turned into phatic labels for political subcultures these days. Not long ago, for example, "Merry Christmas" and "Happy Holidays" were equally content-free phatic utterances used from the middle of November to the end of the year across most of American society. These days, "Merry Christmas" has been turned into a phatic badge on the rightward end of the contemporary culture wars, and "Happy Holidays" is well on its way to becoming a phatic badge of equal force on the left.  Myself, I have no problem wishing my Christian neighbors a merry Christmas—that is what they’re celebrating, after all—and wishing a happy Hanukkah, a blessed solstice, or even a merry Krampustide to those who celebrate these other festivities; one of the benefits of being able to use language for purposes other than phatic communication is that, when a phatic noise is the right thing to use, you can choose your signals deliberately to get the results you want.

It thus probably needs to be said that there’s nothing wrong with phatic communication. Human beings are social primates, with the normal set of social primate instincts and reactions, and casual comments about football teams and the weather are no more objectionable in themselves than the grunts and postures baboons use to accomplish the same ends.  The problem here is simply a function of the fact that human language has functions other than phatic communication, and when those other functions are of crucial importance, staying stuck in phatic communication doesn’t help much.

There’s an old word, dialectic, that may be worth introducing here. No, it doesn’t have anything to do with Marxism; long before Hegel’s time, it was used for exactly the kind of communication that’s most lacking in American society these days, the kind in which two or more people sit down and say, in effect, “let us reason together.”  The ancient philosopher Plotinus described dialectic as the most precious part of philosophy, and the point’s a valid one; the ability to sit down with someone who disagrees with you about some important issue, discuss the matter, determine what common ground exists and where the differences of opinion lie, and either resolve the disagreement or sort out the questions of fact and value that have to be settled in order to resolve it, represents a high level of the practical wisdom that philosophy once upon a time was meant to cultivate.

Dialectic is a learned skill, and not a particularly difficult one, either.  Anyone who can tell the difference between a fact and an opinion, recognize a dozen or so of the standard logical fallacies, follow an argument step by step from its premises to its conclusion, and forbear from dragging the discussion down to the level of personal slurs, can pick it up promptly given a competent teacher and a little practice.  In the ancient world, dialectic was the way that philosophy was taught: a teacher would start a conversation with a couple of senior students on some specific theme, and go from there. If the dialogue that followed was any good, it wouldn’t simply rehash existing knowledge, but turn into an adventure of the mind that broke new ground; those of my readers who are familiar with the dialogues of Plato, which were meant to imitate dialectic at work, will have some sense of how this worked.

Pass beyond the circle of students around a teacher, and dialectic merges into rhetoric. That’s a word that gets plenty of use these days, nearly always with a heavy cargo of cold pricklies attached to it. Until quite recently, though, rhetoric was well understood as one of the essential skills of citizenship: the ability to stand up and explain, in clear, concise, and compelling language, what you think about a given issue. Of all the skills of democracy, it’s hard to think of one more thoroughly misplaced than this one.  How many times, dear reader, have you heard people bemoaning the fact that people in America aren’t willing to listen to one another?  There’s a reason for that, though it’s not one you’re likely to hear; it’s that next to nobody in this country seems to be able to make a cogent, sensible comment on an issue—on any issue—and then sit down, shut up, and let somebody else take the floor.  It seems to have been completely forgotten nowadays that competent rhetoric makes the listener want to keep listening.

Rhetoric is another learned skill.  There are plenty of good textbooks on the subject, ranging from ancient Greek texts to online tutorials packed with the latest buzzwords, and there’s also a voluntary organization—Toastmasters International—that teaches rhetorical skills via a network of local clubs.  It’s not particularly difficult to learn, either. The great obstacle here is the terror of public speaking that’s efficiently instilled in American schoolchildren by the culture of bullying that pervades our public schools; that terror can be outgrown, though. I had a world-class case of it myself not all that many years ago. The benefits of learning rhetoric are not small, and are far from limited to its crucial role in fostering democracy, but we’ll stay focused on the latter for now.

When citizens can stand up in a meeting and present their points of view in concise, thoughtful, and convincing words, democracy becomes possible. When they can’t—when the only thing that takes place in a meeting is a collection of verbal noises denoting "warm fuzzy!" and "cold prickly!" to those others present who happen to link noises and emotions in the same way the speaker does—democracy is not an option, because it’s impossible to establish any shared basis for communication between those with different emotional reactions to any given set of verbal noises.  Transform those noises into words with mutually agreed meanings and you can get past that barrier, but transforming verbal noises into words with mutually agreed meanings is a skill very few Americans know any more.

The ability to converse in a reasoned and reasonable fashion, and the ability to present a viewpoint in a clear, cogent, and convincing manner, are thus among the core skills of democratic process that have been lost by contemporary American society and need to be recovered in a hurry.  Add these to the basic capacity to reason discussed in last week’s post, and you’ve got all the foundations for democratic process. You don’t yet have anything built on those foundations, but that’s the next step.  Democratic process itself comprises one more set of skills—the skills that allow a group of people to meet together, discuss controversial issues, and agree on a collective response to them.

Those skills are not to be found in the so-called consensus methods that have kept activists on the Left spinning their wheels uselessly for three decades now.  I trust my readers remember the flood of self-congratulatory verbiage put forth by the Occupy movement in 2011; that movement vanished with scarcely a trace once the weather turned cold last year, and despite loud claims that it would pop back up again in the spring, it did no such thing.  There were a good many factors behind its failure, but among them was the manipulative behavior of activists who seized control of the movement using a supposedly egalitarian consensus system that placed all effective power, and a great deal of donated money, in their unelected and unsupervised hands.

After months of circular debate that never quite managed to result in meaningful action, the vast majority of the protesters were convinced that their concerns would not be addressed and their efforts were wasted, and simply went home. This would be significant enough if it were new; in point of fact, it’s been the outcome of nearly every attempt at organized protest since the early 1980s, when the current suite of manipulative pseudoconsensus methods was adopted across most of the activist Left. If you want to know why the Left accomplished next to nothing for thirty years, while activists on the right were getting candidates into office and laws on the books, that’s an important part of the reason.

This is all the more embarrassing in that the toolkit of democratic process has been sitting on the shelf the whole time, waiting for somebody to notice that liberal and radical groups in the past used to use methods of organization that, however unfashionable they have become, actually work.  There are a lot of details, and entire books in fine print have been written on the minutiae, but the core elements of democratic process can be described in a paragraph. 

This is how it works.  Everyone has an equal voice and an equal vote, but the right to participate depends on willingness to follow the rules, and members can be ejected for abusive behavior; the chairperson of the meeting, and the handful of other people needed to make it work, are elected to be impartial referees of the process, and can be overruled or removed by vote if they abuse their positions; one person speaks at a time, and the chairperson determines who speaks next; an overly longwinded speaker can be told to shut up by the chairperson, or by vote of the members; once a vote takes place on any issue, the issue can’t be brought back up for debate again without a 2/3 majority, to keep a minority with an agenda from holding the meeting hostage; and the goal of the meeting, and of every part of the process, is to come to a decision, act on it, and get home at a reasonable hour. 

That’s democratic process.  It evolved organically over many centuries from its origins in the rough but functional practices of Anglo-Saxon tribal assemblies, and like other organic systems, it looks much sloppier but works much better than the idealized abstractions cooked up by radicals on either end of the spectrum. It’s easy to compare it unfavorably to one or another of those idealized abstractions, but the proof of the pudding is in the eating; those who want to demonstrate that some other system is as effective as democratic process are welcome to use that other system on smaller scales, with voluntary organizations and local communities, and prove that it works. That was, after all, how democratic process emerged as the default option in the Western world: in actual practice, in an assortment of voluntary organizations, local communities, political parties and protest groups, it proved to be more effective than the alternatives.

I should say, finally, that even the most lively revival of the core skills of democracy isn’t likely to affect the political sphere much for a couple of decades at least; if nothing else, the sheer inertia of a political dialogue as debased as ours will take at least a generation to overcome. The point in reviving these things now is to lay foundations for the future.  Right now, in the fading years of the Age of Abundance, it’s fairly easy to learn the things I’ve discussed in last week’s and this week’s posts; the intellectual resources needed for such a project can be found readily in libraries and on the internet, and a great many people have enough spare time to invest in such a project that much could be done.  The further we proceed into resource depletion, infrastructure breakdown, environmental instability, and the rest of the bubbling witch’s brew we’ve cooked up for ourselves in the cauldron of the near future, the less true that is likely to be. Thus any effort to make democratic process and the skills that make it possible available to the far future has to begin now, or soon.

It’s a good season to keep such logic in mind.  Those of my readers who have gardens, or are planning to plant one in the new year, will already be glancing through seed catalogs and roughing out, at least in the mind’s eye, what they will need for the spring planting.  In the same sense, though on a larger and more daunting scale, those of us who are watching the stormclouds of a greater winter gather on the horizon should be thinking about what seeds they intend to store for a more distant springtime. To my mind, at least, there is no greater challenge and no more important work to be done.

In the meantime, I wish each of you a blessed solstice, or whatever other festival your own faith or traditions assign to this time of year.  Next week, when winter is here and the partying is done, we’ll have a lot more to talk about.

****************
End of the World of the Week #53

The last months of 1999, the subject of last week’s End of the World of the Week, were in many ways just a running start for one of the most wildly popular apocalyptic dates in history, the year 2000. An astonishing number of predictions of all kinds clustered around that impressively round number. For decades beforehand, in fact, the odds were pretty good that any projection of trends in the fairly near future would begin, “In the year 2000...”

Apocalyptic prophecies were among the many notions that clustered around that year, with the widely ballyhooed Y2K bug only the most heavily publicized among them. As far back as the 13th century, the Catholic theologian Peter Olivi predicted that the Second Coming would take place that year.  Isaac Newton made the same prediction in an early prophetic work, before settling on 2060 in his later writings.  Puritan pastor Jonathan Edwards, author of the famous sermon Sinners in the Hands of an Angry God, tapped 2000 as the beginning of Christ’s millennial reign, as did Edgar Cayce and Sun Myung Moon.

Plenty of more exotic apocalypses were pinned on the same date. Popular psychic Ruth Montgomery proclaimed that the Earth would be knocked off its axis. Melody Mehta, a less widely known figure in the same business, insisted that a comet would knock Mars’ moon Phobos out of its orbit and send it careening into the Earth. On a less cosmic scale, financial writers Peter Jay and Michael Stewart predicted economic collapse and the rise of dictatorships in Europe and the US in a book somewhat unoriginally titled Apocalypse 2000. No matter what kind of apocalypse you were in the market for, 2000 had something on offer.

Except, of course, that none of them happened. In fact, the vast majority of all the predictions made for the year 2000, from the most exotic to the most prosaic, landed with a thud.  The fact that so many predictions clustered around that date, it turned out, showed only that when people try to predict the future, some dates are more popular than others.

—for more failed end time prophecies, see my book Apocalypse Not

Wednesday, December 12, 2012

Producing Democracy

Last week's post here on The Archdruid Report attempted to raise a question that, as I see it, deserves much more discussion than it gets these days.  Most currently popular ways of trying to put pressure on the American political system presuppose that the politicians will pay attention if the people, or the activists who claim to speak in their name, simply make enough noise. The difficulty is that the activists, or for that matter the people, aren’t actually giving the politicians any reason to pay attention; they’re simply making noise, and the politicians have gotten increasingly confident that the noise can be ignored with impunity.

That’s what’s implied by saying that protest marches, petitions, letter-writing campaigns, and the like consume democracy.  These standard modes of activism only work if the officials who make decisions have some reason to think that the activists can follow through with some more effective action, such as a serious challenge in the next election, if they’re ignored.  It’s those other actions that produce democracy or, in less metaphoric terms, convince elected officials that ignoring the activists could put their careers at risk.

What sets democracy apart from most other systems of government is that it gives citizens a peaceful way to make good on such threats.  This is a huge advantage, for the most pragmatic of reasons.  In autocratic societies, where the populace has no way to get rid of inept officials short of revolution, a vast amount of administrative idiocy and incompetence goes unpunished. The notion that autocracies are by definition more competent than democracies is quite simply untrue; the annals of autocratic states such as ancien régime France and Communist Russia are packed with examples of the most egregious incompetence—and no, despite the slogans, Mussolini didn’t make the trains run on time, either.  It’s simply easier to cover up governmental stupidity in an autocracy, since there aren’t media independent of government control to spread the word and embarrass the powers that be.

Yet that advantage, again, depends on the ability of citizens to vote the rascals out when they deserve it. In today’s America, that ability is little more than theoretical.  I’ve discussed in a number of posts already how what was once one of the world’s liveliest and most robust democratic systems has lapsed into a sham democracy uncomfortably reminiscent of the old Eastern Bloc states, where everyone had the right to vote for a preselected list of candidates who all support the same things. The reasons for that decay are complex, and again, I’ve talked about them in detail already. What I want to address this week is what might be done about it—and that requires a second look at the countervailing forces that were once hardwired into the grassroots level of the American system.

A thought experiment might help clarify the issues here.  Imagine, dear reader, that early next year you hear that a couple of legislators and popular media figures in your state are talking about forming a new political party that will offer a meaningful alternative to the stalemate in Washington DC.  The two major parties ignore them, but by early 2014 the new party is more or less in existence, and candidates under its banner are running for Congress and a range of state offices.  The morning after the 2014 election, Republicans and Democrats across the nation wake up to discover that they are going to have to deal with a significant third-party presence in Congress and a handful of state governments controlled lock, stock and barrel by the new party.

The two years leading up to the 2016 election pass by in a flurry of political activity as the old parties struggle to regain their joint monopoly over the political process and the new party scrambles to build a national presence.  In 2016, the new party nominates its first presidential candidate, a longtime activist and public figure.  The campaign faces an uphill fight, and loses heavily; some of the new party’s people in Congress and state governments are ousted as well. Pundits insist that it was all a flash in the pan, but they’re discomfited in the midterm elections in 2018 when the new party scores a major upset, winning a majority in the House of Representatives and nearly doubling its Senate presence.

The new party’s gains strain the existing structure of American partisan politics to the breaking point. As the 2020 election nears, the Democratic Party, outflanked and marginalized by the rising new party, disintegrates in internecine feuding and fails to field a presidential candidate at all.  The Republican Party breaks in two, with Tea Party and country-club Republicans holding competing conventions and nominating separate presidential tickets.  Yet another new party springs up, composed mostly of old guard politicians from what used to be the middle of the road, and nominates its own candidate. Under the winner-take-all rules nearly every state uses, whatever party gets the most votes in a state wins that state’s votes in the electoral college—and so the new party, by winning a plurality of the popular vote in just enough states to matter, sees its candidate raising his hand on January 20, 2021 to take the oath of office as the nation’s next president.

Suggest a scenario of that kind to most Americans today and they’ll dismiss it as impossible. That’s all the more curious, in that every detail of the thought experiment I’ve just sketched out is drawn from an earlier period in American history.  The years in question ran from 1854 to 1860, and the new party was the Republican Party; the Whigs were the party that imploded, the Democrats the party that split in two, the short-lived fourth party was the Constitutional Union Party and, of course, the tall figure standing up there taking the oath of office in 1861 was Abraham Lincoln.

Yet it’s true that an upset of the same kind would be much more difficult to pull off today.  Several different factors combine to make that the case, but to my mind, the most important of them is the simple and awkward fact that the skills that would be needed to make it happen are no longer to be found among activists or, for that matter, American citizens in general.  Organizing a new political party, building up a constituency on a national level, and making the machinery of democracy turn over in response require the pragmatic application of certain learned and learnable skill sets, which most people in America today do not know and are by and large uninterested in learning.  There are, broadly speaking, three such skill sets, and we’ll take them one at a time.

The first can’t be discussed without opening an industrial-sized can of worms, one that will take the rest of this post and still leave ends wriggling in all directions, but that can’t be helped. I’d like to start the can opener going with one of the odder conversations that spun off from last week’s post.  My regular readers will remember that one of the core themes of that post was the suggestion that, though democratic systems are routinely corrupt and suffer from a galaxy of other pervasive problems, they generally provide more civil rights and more consistent access to due process to their citizens than do autocratic systems, and that this is true even in a nonindustrial setting.

One of my readers took heated exception to this claim, and insisted that preindustrial America was no better than any other country of the time. It’s the debate that followed, though, that brought out the detail I want to emphasize.  To defend his counterclaim, my reader talked about the current US prison system, the evils of intellectual property rights, and a flurry of other issues irrelevant to the point at hand, ending up with a claim that since Adolf Hitler was elected Chancellor of Germany in 1933, Nazi Germany was a democracy and democracy was therefore bad.  None of his arguments had any bearing on whether citizens of the United States in its preindustrial days—not, please note, all its residents, much less people outside its borders; democracies abuse noncitizens more or less as often as other forms of government do, which is why I specified citizens in my distinctly lukewarm words of praise—had more civil rights and more reliable access to due process than citizens of autocratic countries during the same period.

It’s a matter of historical record that in 1800, say, when the United States was still almost wholly agrarian, an American citizen could stand up in a public meeting anywhere in the country, say that President Adams was a liar, a thief, and a scoundrel who should be hounded out of office at the first opportunity, and suffer no civil or criminal penalty whatever for that utterance—and could go on with equal impunity to make good on his words by doing his best to hound Adams out of office and put Tom Jefferson in his place.  It’s equally a matter of historical record that making a similar comment in the same year about First Consul Napoleon Bonaparte, Tsar Pavel I, Sultan Selim III, the Jiaqing Emperor, or Shogun Tokugawa Ienari in the respective domains of these autocrats would have resulted in a painful death, or at best a long stay in prison under ghastly conditions, and let’s not even talk about what happened to people who showed any sign of trying to replace these heads of state with some other candidate. That’s a significant difference in civil rights, and it’s what I was talking about, but my attempts to suggest to my reader that he was not addressing my point got answered by increasingly irritable comments insisting that yes, he was.

It finally dawned on me that from his perspective, he was, because the point he thought I was making was something like “democracy is good,” or more exactly that the verbal noise “democracy” ought to be linked with warm fuzzy feelings.  He was insisting in response, more or less, that the verbal noise “democracy” ought to be linked to cold prickly feelings, and his rhetorical strategy—a very common one on the internet these days, as it happens—was simply to attempt to associate various cold prickly feelings with the verbal noise in question, in the hope that enough of the cold prickly feelings would stick to the verbal noise to make his point.  The fact that I might be trying to do something other than linking a verbal noise to an emotional state seemingly did not occur to him.

It’s only fair to point out that he was far from the only person whose response to that post amounted to some equally simplistic emotional linkage. On the other side of the political spectrum, for instance, was a reader who insisted that the United States was not an empire, because empires are bad and the United States is good.  To him, the verbal noise “empire” was linked to cold prickly feelings, and those clashed unbearably with the warm fuzzy feelings he linked to the word “America.” It’s a fine example of the lumpen-Aristotelianism against which Alfred Korzybski contended in vain: A is A and therefore A cannot be not-A, even if A is a poorly chosen hypergeneralization that relates primarily to an emotional state and embraces an assortment of vaguely defined abstractions with no connection between them other than a nearly arbitrary assignment to the same verbal noise.

I don’t bring up these examples because they’re in any way unusual; they’re not. I bring them up because they quite adequately represent most of what passes for political discussion in America today. Examples abound; for one, think of the way the right uses the word “socialist” to mean exactly what the left means by the word “fascist.”  In plain English, either one translates out as “I hate you,” but both can be far more adequately represented by the snarl a baboon makes when it’s baring its teeth in threat. Now of course both words, like “democracy” and “empire,” actually mean something specific, but you won’t find that out by watching their usage these days.

For another—well, I wonder how many of my readers have had, as I have, the experience of attempting to talk about the policies and behavior of a politician when the other person in the conversation insists on reducing everything to personalities. I think of an email exchange I endured a while back, in which my correspondent was trying to convince me that I was wrong to criticize Barack Obama, since he was a nice man, a man of integrity, and so on.  Every issue got dragged back to the man’s personality—or, more precisely, to my correspondent’s impressions of his personality, garnered at third hand from the media. When I brought up the extent to which the Obama administration’s policies copied those of his predecessor, for example, I got a frosty response about how wrong it was to equate Obama and Bush, since they were such different people.  One was, after all, linked with warm fuzzy feelings in my correspondent’s mind, while the other was linked with cold prickly feelings, and A cannot equal not-A.

One way to talk about the point I’m trying to make here is that the great majority of Americans have never learned how to think.  I stress the word “learned” here; thinking is a learned skill, not an innate ability.  The sort of mental activity that’s natural to human beings is exactly the sort of linkage of verbal noises to emotional states and vague abstractions I’ve outlined above. To get beyond that—to figure out whether the verbal noises mean anything, to recognize that an emotional state is not an objective description of the thing that triggers it, and to replace the vague abstractions with clearly defined concepts that illuminate more than they obscure—takes education.

Now of course we have an educational system in the United States. More precisely, we have two of them, a public school system that reliably provides some of the worst education in the industrial world, and a higher education industry that provides little more than job training—and these days, by and large, it’s training for jobs that don’t exist.  You can quite easily pass through both systems with good grades, and never learn how to work through an argument to see if it makes sense or check the credentials of a purported fact. That’s a problem for a galaxy of reasons, but one of them bears directly on the theme of this post, for it’s a matter of historical record, again, that democratic politics work only when the people who have the right to vote—however large or small that class happens to be—also get an education in the basic skills of thinking.

That’s why the first-draft versions of Western democracy emerged in the ancient Mediterranean world, especially but not only in the city-states of Greece, at a time when the replacement of hieroglyphs with alphabets had made literacy a common skill among urban citizens and one of history’s great intellectual revolutions was inventing logic and formal mathematics.  It’s why democratic ideas began to spread explosively through western Europe once education stopped being a monopoly of religious institutions and refocused on the same logical and mathematical principles that sparked an equivalent shift in the ancient Mediterranean, courtesy of the Renaissance and its aftermath.  It’s why the extension of democracy to previously excluded groups in the United States followed, after varying lag times, the extension of public education to these same groups—and it’s also why the collapse of American education in recent decades has been promptly followed by the collapse of American democracy.

It’s common enough to hear claims that American voters of previous generations must have been as poorly equipped in the skills of thinking as their equivalents today.  I would encourage any of my readers who want to make such a claim, or who like to think that the inhabitants of our self-styled information society must inevitably be better at thinking than people of an earlier age, to take the time to read the Lincoln-Douglas debates in their entirety, and then compare them to this year’s presidential debates.  Lincoln and Douglas, remember, were not speaking to a roomful of Ph.D.s; they were in a hotly contested Congressional election, in front of audiences of farmers, millworkers, shopkeepers, and craftsmen, the ordinary voters of 1858 Illinois, few of whom had more than an eighth grade education and many of whom had much less.  It does not speak well for the pretensions of today’s America that its presidential candidates this year pursued their debates on a level that a crowd of Chicago feedlot workers in 1858 would have found embarrassingly simplistic.

That’s among the many reasons why devising a framework for adult education outside the grip of the current American education industry is one of the most pressing needs of the decade or two right ahead of us.  That huge topic, though, is going to require a series of posts all to itself. What I want to stress here is that teaching the electorate to think is not the only challenge; those of my readers who may be involved in trying to change the direction of contemporary American society on any scale, and for any reason, might find it useful to turn a cold and beady eye upon their own mental processes, and on those of the movements they happen to support.

An extraordinary amount of what passes for argument in today’s activist scene, after all, is exactly the sort of linking of verbal noises with simple emotional reactions, warm and fuzzy or cold and prickly as needed.  Some of this may be coldly cynical manipulation on the part of canny operators pushing the emotional buttons of their intended targets, to be sure, but a cold and cynical manipulator who sees his manipulations failing normally changes tack, and tries to find something that will work better.  That isn’t what we see among activists, though.  Consider the way that the climate change movement went from an apparently unstoppable juggernaut a decade ago to nearly total failure today.  The strategy chosen by the great majority of climate change activists could be adequately described as the mass production of cold pricklies; when the other side in the debate figured out how to counteract that, the activists’ sole response was to shout "Cold prickly! Cold prickly! COLD PRICKLY!!!" as loud as they could, and then wonder why people weren’t listening.

You can’t craft an effective strategy if your mental processes are limited to linking up verbal noises, simple emotional reactions, and vague abstractions.  It really is as simple as that. Until those who hope to have an influence on any level recognize this, they’re not going to have the influence they seek, and America is going to continue stumbling on autopilot toward a wretched end. Once that hurdle is past, the remaining steps are a good deal easier; we’ll get to them next week.

****************
End of the World of the Week #52

Does it make an apocalyptic prophecy more likely to come true if several different visionaries, from different traditions and times, agree on it?  Well, the last months of 1999 were targeted for the end of the world by a flurry of predictions from a diverse cast of would-be prophets.  Hal Lindsey, who admittedly predicted the end of the world every ten years or so all through the latter half of the 20th century, insisted in one of his books that the Second Coming would arrive before the year 2000; so did preacher James Gordon Lindsay (no relation); so did conspiracy theorist Texe Marrs; so did literature released by the Seventh-day Adventists and the Jehovah’s Witnesses; and so, curiously enough, did Timothy Dwight IV, the distinguished Revolutionary War era educator and theologian.

My regular readers will already know the outcome of the story:  Jesus pulled another no-show.  All of the prophets in question except for Dwight, who had the good common sense to die in 1817, did the usual thing and found new dates on which to anchor their fantasies.

—for more failed end time prophecies, see my book Apocalypse Not

Wednesday, December 5, 2012

Consuming Democracy

For most of a year now, my posts here on The Archdruid Report have focused on the nature, rise, and impending fall of America’s global empire.  It’s been a long road and, as usual, it strayed in directions I wasn’t expecting to explore when this sequence of posts began last winter. Still, as I see it, we’ve covered all the core issues except one, and that’s the question of what can and should be done as the American empire totters to its end.

Regular readers will know already that this question isn’t going to be answered with some grandiose scheme for salvaging, replacing, transforming, or dismantling America’s empire, of the sort popular with activists on both sides of an increasingly irrelevant political spectrum—the sort of project that merely requires all those who hold political and economic power to hand it over meekly to some cabal of unelected ideologues, so that the latter can once again learn the hard way that people won’t behave like angels no matter what set of theories is applied to them. At the same time, there are choices still open to Americans and others in an era of imperial decline; we’re not limited, unless we choose to be, to huddling in our basements until the rubble stops bouncing.

Mind you, there are at least two things welded firmly enough in place in our near future that no action of yours, mine, or anyone’s will change them.  The first is that America’s global empire will fall; the second is that those who rule it will not let it fall without a struggle.  The US government and the loose and fractious alliance of power centers that dominate it are clearly unwilling to take Britain’s path, and accept the end of empire in exchange for a relatively untraumatic imperial unraveling.  To judge by all the evidence that’s currently available, they’ll cling to the shreds of imperial power, and the wealth and privilege that go with it, until the last of those shreds is pulled from their cold stiff hands.  That’s a common boast, but it bears remembering that the moment always comes when those shreds get pried loose from those pale and rigid fingers.

These two hard facts, the imminence of imperial downfall and the unwillingness of the existing order to accept that imminence, impose certain consequences on the decades ahead of us.  Some of the most obvious of those consequences are economic.  The American standard of living, as I’ve pointed out more than once, has been buoyed to its current frankly absurd level by a tribute economy that funnels much of the wealth of the world to the United States.  We’ve all heard the self-congratulatory nonsense that insists that this nation’s prosperity is a product of American ingenuity or what have you, but let us please be real; nothing Americans do—nothing, that is, other than maintaining garrisons in more than 140 countries and bombing the bejesus out of nations that get too far out of line—justifies the fact that the five per cent of humanity that can apply for a US passport get to use a quarter of the planet’s energy and a third of its natural resources and industrial product.
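It’s worth pausing to run the back-of-the-envelope arithmetic those figures imply; the percentages are the rough ones just cited, taken at face value, so treat the results as orders of magnitude rather than precise measurements:

0.25 ÷ 0.05 = 5, so the average holder of a US passport commands about five times the world per-capita share of energy;
(1/3) ÷ 0.05 ≈ 6.7, or roughly six to seven times the per-capita share of natural resources and industrial product.

Even allowing generous error bars on the percentages, it’s the multiple that matters, and no appeal to ingenuity accounts for a multiple that size.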

As our empire ends, that vast imbalance will go away forever.  It really is as simple as that. In the future now breathing down our necks, Americans will have to get used to living, as our not so distant ancestors did, on a much more modest fraction of the world’s wealth—and they’ll have to do it, please remember, at a time when the ongoing depletion of fossil fuels and other nonrenewable resources, and the ongoing disruption of the environment, are making ever sharper inroads on the total amount of wealth that’s there to distribute in the first place.  That means that everything that counts as an ordinary American lifestyle today is going to go away in the decades ahead of us. It also means that my American readers, not to mention everyone else in this country, are going to be very much poorer in the wake of empire than they are today.

That’s a sufficiently important issue that I’ve discussed it here a number of times already, and it bears repeating. All too many of the plans currently in circulation in the green end of US alternative culture covertly assume that we’ll still be able to dispose of wealth on the same scale as we do today.  The lifeboat ecovillages beloved by the prepper end of that subculture, just as much as the solar satellites and county-sized algal biodiesel farms that feature in the daydreams of their green cornucopian counterparts, presuppose that such projects can be supplied with the startup capital, the resources, the labor, and all the other requirements they need. 

The end of American empire means that these things aren’t going to happen.  To judge by previous examples, it will take whatever global empire replaces ours some decades to really get the wealth pump running at full speed and flood its own economy with a torrent of unearned wealth.  By the time that happens, the decline in global wealth driven by resource depletion and environmental disruption will make the sort of grand projects Americans envisioned in our empire’s glory days a fading memory all over the world.  Thus we will not get the solar satellites or the algal biodiesel, and if the lifeboat ecovillages appear, they’ll resemble St. Benedict’s original hovel at Monte Cassino much more than the greenwashed Levittowns so often proposed these days.  Instead, as the natural systems that undergird industrial civilization crumble away, industrial societies will lose the capacity to accomplish anything at all beyond bare survival—and eventually that, too, will turn out to be more than they can do.

That’s the shape of our economic future.  My more attentive readers will have noticed, though, that it says little about the shape of our political future, and the latter deserves discussion. One of the lessons of history is that peoples with nearly identical economic arrangements can have radically different political institutions, affording them equally varied access to civil liberties and influence on the decisions that shape their lives.  Thus it’s reasonable and, I think, necessary to talk about the factors that will help define the political dimension of America’s post-imperial future—and in particular, the prospects for democracy in the wake of imperial collapse.

There are at least two barriers to that important conversation.  The first is the weird but widespread notion that the word “democracy”—or, if you will, “real democracy”—stands for a political system in which people somehow don’t do the things they do in every other political system, such as using unfair advantages of various kinds to influence the political process.  Let’s start with the obvious example.  How often, dear reader, have you heard a pundit or protester contrasting vote fraud, say, or bribery of public officials with “real democracy”?

Yet real democracy, meaning the sort of democracy that is capable of existing in the real world, is always plagued with corruption. If you give people the right to dispose of their vote however they wish, after all, a fair number of them will wish to sell that vote to the highest bidder in as direct a fashion as the local laws allow.  If you give public officials the responsibility to make decisions, a fair number of them will make those decisions for their own private benefit.  If you give voters the right to choose public officials, in turn, and give candidates for public office the chance to convince the public to choose them, you’ve guaranteed that a good many plausible rascals will be elected to office, because that’s who the people will choose.  That can’t be avoided without abandoning democracy altogether.

Now of course there’s a significant minority of people who react to the inherent problems of democracy by insisting that it should be abandoned altogether, and replaced with some other system portrayed in suitably rose-colored terms—usually, though not always, something along the lines referred to earlier, in which an unelected cabal of ideologues gets to tell everyone else what to do. The claim that some such project will provide better government than democracies do, though, has been put to the test more times than I care to count, and it consistently fails. Winston Churchill was thus quite correct when he said that democracy is the worst possible system of government, except for all the others; what makes democracy valuable is not that it’s so wonderful, but that every other option has proven itself, in practice, to be so much worse.

Just now, furthermore, democracy has another significant advantage: it doesn’t require the complicated infrastructure of industrial society.  The current United States constitution was adopted at a time when the most technologically sophisticated factories in the country were powered by wooden water wheels, and presidents used to be inaugurated on March 4th to give them enough time to get to Washington on horseback over muddy winter roads. (The date wasn’t moved to January 20 until the Twentieth Amendment was ratified in 1933.)  America was still anything but industrialized in the 1820s, the decade that kickstarted the boisterous transformations that sent an aristocratic republic where only the rich could vote careering toward ever more inclusive visions of citizenship.  In the deindustrial future, when the prevailing economic forms and standards of living may resemble those of the 1790s or 1820s much more closely than they do those of today, that same constitution will be right at home, and will arguably work better than it has since the imperial tribute economy began flooding the country with unearned wealth.

There’s just one problem with this otherwise appealing prospect, which is that American democracy at the moment is very nearly on its last legs.  A great many people are aware of this fact, but most of them blame it on the machinations of some evil elite or other.  Popular though this notion is, I’d like to suggest that it’s mistaken.  Of course there are plenty of greedy and power-hungry people in positions of wealth and influence, and there always are.  By and large, people don’t get wealth and influence unless they have a desire for wealth and influence, and “having a desire for wealth and influence” is simply another way of saying “greedy and power-hungry.” Every political and economic system, especially those that claim to be motivated solely by the highest of ideals, attracts people who are greedy and power-hungry. Political systems that work, by definition, are able to cope with the perennial habit that human beings have of trying to get wealth and power they haven’t earned.  The question that needs to be asked is why ours is failing to cope with that today. 

The answer is going to require us to duck around some of the most deeply ingrained habits of popular thought, so we’ll take it a step at a time. 

We can define democracy, for the sake of the current discussion, as a form of government in which ordinary citizens have significant influence over the people and policies that affect their lives. That influence—the effective ability of citizens to make their voices heard in the corridors of power—is a fluid and complex thing.  In most contemporary democracies, it’s exercised primarily through elections in which officials can be thrown out of office and replaced by somebody else.  When a democracy’s more or less healthy, that’s an effective check; there are always other people angling for any office, whether it’s president or town dogcatcher, and an official who wants to hold onto her office needs to glance back constantly over her shoulder to make sure that her constituents aren’t irritated enough at her to throw their support to one of her rivals.

The entire strategy of political protest depends on the threat of the next election.  Why would it matter to anybody anywhere if a bunch of activists grab signs and go marching down Main Street, or for that matter down the Mall in Washington DC?  Waving signs and chanting slogans may be good aerobic exercise, but that’s all it is; it has no effect on the political process unless it delivers a meaningful message to the politicians or the people.  When protest matters, the message to the politicians is blunt:  “This matters enough to us that we’re willing to show up and march down the street, and it should matter to you, too, if you want our votes next November.”  The message to the people is less direct but equally forceful:  “All these people are concerned about this issue; if you’re already concerned about it, you’re not alone; if you aren’t, you should learn more about it”—and the result, again, is meant to show up in the polls at the next election.

You’ll notice that the strategy of protest thus only means anything if the protesters have the means, the motive, and the opportunity to follow through on these two messages.  The politicians need to be given good reason to think that ignoring the protesters might indeed get them thrown out of office; the people need to be given good reason to think that the protesters speak for a significant fraction of the citizenry, and that their concerns are worth hearing.  If these are lacking, again, it’s just aerobic exercise.

That, in turn, is why protest in America has become as toothless as it is today.  Perhaps, dear reader, you went to Washington DC sometime in the last decade to join a protest march to try to pressure the US government into doing something about global warming.  If the president just then was a Democrat, he didn’t have to pay the least attention to the march, no matter how big and loud it was; he knew perfectly well that he could ignore all the issues that mattered to you, break his campaign promises right and left, and plagiarize all the most hated policies of the previous occupant of the White House, without the least political risk to himself.  All he had to do come election time was wave the scary Republicans at you, and you’d vote for him anyway.  If he was a Republican, in turn, he knew with perfect certainty that you weren’t going to vote for him no matter what he did, and so he could ignore you with equal impunity.

No matter what party he belonged to, furthermore, the president also had a very good idea how many of the protesters were going to climb into their otherwise unoccupied SUVs for the drive back home to their carbon-hungry lifestyles; he knew that if he actually wanted to make them change those lifestyles—say, by letting the price of gasoline rise to European levels—most of them would chuck their ideals in an eyeblink and turn on him with screams of indignation; and a phone call to the Secretary of Energy would remind him that any meaningful response to climate change would require such steps as letting the price of gas rise to European levels.  He knew perfectly well, in other words, that most of the protesters didn’t actually want him to do what they claimed they wanted him to do; they wanted to feel good about doing something to save the Earth, but didn’t want to put up with any of the inconveniences that would be involved in any real movement in that direction, and so attending a protest march offered them an easy way to have their planet and eat it too.

It’s only fair to say that the same logic applies with precisely equal force on the other side of the line. If, dear reader, the protest march you attended was in support of some allegedly conservative cause—well, it wasn’t actually conservative, to begin with; the tiny minority of authentic conservatives in this country have been shut out of the political conversation for decades, but that’s an issue for another post—the man in the White House had no more reason to worry about your opinions than he had to fret about the liberal protest the week before.  If he was a Republican, he knew that he could ignore your concerns and his own campaign promises, and you’d vote for him anyway once he waved the scary Democrats at you.  If he was a Democrat, he knew that you’d vote against him no matter what.  Either way, in turn, he had a very good idea how many of the people out there who were denouncing drug abuse and waving pro-life and family-values placards fell all over themselves to find excuses for Rush Limbaugh’s drug bust, and paid for abortions when they knocked up the teenage girlfriends their wives didn’t know about.

Does this mean that protest marches are a waste of time?  Not at all.  Nor does it mean that any of the other venerable means of exerting pressure on politicians are useless.  The problem is not in these measures themselves; it’s the absence of something else that makes them toothless.

That something else was discussed in an earlier post in this sequence: grassroots political organization.  That’s where political power comes from in a democratic society, and without it, all the marches and petitions and passionate rhetoric in the world are so much empty noise. Through most of American history, the standard way to put this fact to work was to get involved in an existing political party at the caucus level and start leaning on the levers that, given time and hard work, shift the country’s politics out of one trajectory and into another.  These days, both parties have been so thoroughly corrupted into instruments of top-down manipulation on the part of major power centers and veto groups that trying to return them to useful condition is almost certainly a waste of time.  At the same time, the fact that US politics is not currently dominated by Federalists and Whigs shows that even a resolutely two-party political culture is now and then subject to the replacement of one party by another, if the new party on the block takes the time to learn what works, and then does it.

The point I’m trying to explore here can be made in an even more forceful way.  Protest marches, like letter-writing campaigns and other means of putting pressure on politicians, have no power in and of themselves; their effect depends on the implied promise that the politicians will be held accountable to their choices come election time, and that promise depends, in turn, on the existence of grassroots political organization strong enough to make a difference in the voting booth. It’s the grassroots organization, we might as well say, that produces democracy; marches and other methods of pressuring politicians are simply means of consuming democracy—and when everyone wants to consume a product but nobody takes the time and trouble to produce it, sooner or later you get a shortage.

We have a severe and growing democracy shortage here in America.  In next week’s post, I’ll talk about some of the things that will be necessary to increase the supply.

****************
End of the World of the Week #51

What do you do if you throw an apocalypse and nobody comes?  That was the challenge faced by the Watch Tower Bible and Tract Society, an American offshoot of Christianity, when the end of the world failed to show up on schedule in 1914.  That date had played a central role in the Society’s prophecy since 1876, when the movement’s founder Charles Taze Russell and Nelson Barbour, a former Millerite, wrote a book predicting the Second Coming for 1914.  As Russell’s original International Bible Students Movement morphed into the Watch Tower Bible and Tract Society, that prophecy became ever more central to the movement’s hopes.

As church bells rang in the year 1915, though, it became evident even to the most devout Watch Tower follower that Christ had pulled another one of his frequent no-shows. Admitting that your prophecy was just plain wrong is rarely a good career move for an apocalyptic prophet, and the Watch Tower organization had made so much of a ballyhoo about the upcoming end that it couldn’t get away with the usual fallback strategy of ignoring the failure and announcing a new date (though this was tried).  It fell to Russell to come up with a third option, one of the most ingenious in the history of apocalypses.

The Second Coming, he announced to his followers, had indeed occurred—in heaven.  Christ was now reigning in glory there, but the effects would take a little while to filter down to earth, so they just had to be patient.  They’re still waiting patiently; in the 1930s, the movement renamed itself the Jehovah’s Witnesses, and its members are still convinced that the Second Coming took place 98 years ago and its prophesied results will be showing up any day now.

—for more failed end time prophecies, see my book Apocalypse Not