Wednesday, December 30, 2009

Immodest Proposals

Longtime readers of this blog may not be surprised to find that, after three posts explaining in detail why significant reforms are impossible in the current American political system, I propose to spend several more posts talking about significant reforms that might be part of a functional response to the crisis of our time. I freely grant that there’s irony involved, but proposing useful changes that won’t be enacted right away is by no means as pointless as it may seem in an age of just-in-time ordering and instant gratification.

It’s a safe bet that if something can’t last, it won’t, and the political situation in the United States today may just turn into a poster child for that old but by no means outworn maxim. Most of the world’s nations have replaced one political system with another at least once in the course of the last century; the process can certainly be traumatic, and it doesn’t always solve the problems that forced the change in the first place, but it isn’t the end of the world. Whatever crises drive the existing order into its final implosion, and whatever national traumas supervene until some degree of stability returns, there will be a place for new policies when the future government of the United States, or the regional governments that succeed it, get to work picking up the pieces. Nor, of course, is this blog only read by Americans; and for all I know, some of the ideas I plan on discussing here might strike a responsive chord elsewhere.

One word of caution, though: readers expecting me to offer them a ticket to Utopia are going to be disappointed. There’s a common notion that everything that’s wrong in the world is the fault of the institutions or personalities officially in charge, and can be fixed by replacing them with some other set of institutions or personalities. That notion has been tested more thoroughly by history than any other hypothesis I can think of in the social sciences, and it’s failed abjectly every time. Maybe we should finally get around to admitting that people will not behave like angels no matter how (or whether) they are governed, or who (if anyone) does the governing; and, in the process, admit that human beings are incurably human – that is, capable of the full spectrum of good and evil all by themselves – rather than moral puppets who can be expected to dance on command to the tune of a good or evil system.

It’s easy to come up with a perfect system of human society, so long as it doesn’t have to work in the real world, and it’s very easy to compare a perfect system on paper to the failings of a system in the real world, to the latter’s detriment. Nearly always, though, what John Kenneth Galbraith said about innovation in finance is just as true of innovation in political and social institutions: what gets ballyhooed as new and revolutionary thinking is normally the repetition of a fairly small set of fallacies that worked very poorly the last dozen or so times they were tried, and will work just as poorly this time, too. Those systems that function at all are fairly few in number, though there are a lot of minor variations on the basic themes, and the ones we’ve got now – representative democracy in politics, a market system in economics – have certain advantages. Though the current examples are troubled, corrupt, and at very high risk of being overwhelmed by the consequences of some very bad decisions made over the last few decades, the basic systems are noticeably less dysfunctional most of the time than most of the alternatives.

Thus I’m not going to present a grandiose plan for the complete transformation of everything, of the sort that has been in vogue among social reformers for so many years. Instead, I’m going to suggest a handful of limited, tightly focused changes that I think have a real chance, if they were to be implemented, of canceling out some of the self-defeating habits of the current system and replacing them with effective incentives toward the sort of habits our society needs to establish. I could start in any number of places, but the one I’ve chosen for this week is the seemingly unpromising field of tax policy.

That’s a subject on which a great deal of nonsense abounds just now. While at the public library here in Cumberland the other day, I found a book titled The End of Prosperity. This – I was about to describe it as meretricious, but that would be unfair to honest prostitutes – this pointless waste of inoffensive trees, then, claims that if the US government raised taxes to a level that might just actually pay for the services it provides, the result would be, well, the end of prosperity. Somehow the authors managed to ignore the fact that in the 1950s, when American prosperity was by many measures at its all-time peak, people in the upper tax brackets paid marginal rates of well over two-thirds to Uncle Sam, and that the country has by most measures become less prosperous, not more, as those tax rates have been lowered.

There’s a reason for that, and it ties back into the distinction I made in several earlier posts about the differences among the primary, secondary, and tertiary economies. The primary economy, which is nature, and the secondary economy, which is the production of goods and services by human labor, are subject to negative feedback loops that tend to hold them in balance. The tertiary economy, which is the exchange of money and other forms of abstract wealth, is subject to positive feedback loops that drive it out of balance in ways that unbalance the other two economies as well.

The core of those feedback loops is the way that accumulations of paper wealth multiply in value, and so anything that drains those accumulations and puts them in the pockets of people who spend their money on groceries, instead of more paper, tends to stabilize the economy. That’s what the high tax rates of the Fifties did, and it’s not accidental that the more drastically tax rates have been cut in the last three decades, and the more tertiary wealth has been shielded from taxes by special capital-gains tax rates and the like, the more drastically the tertiary economy has manifested its usual tendency to run to extremes and blow itself up in the process.
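For readers who like to see the mechanism spelled out in numbers, here is a minimal sketch in Python of the positive feedback loop just described, and of the stabilizing effect of taxing money made by money. The 8% return and the tax rates are arbitrary figures chosen purely for illustration, not estimates of anything in the real economy.

```python
# Toy model only: a pool of paper wealth compounding on itself, with a tax on
# each year's gains acting as the "drain" back into the everyday economy.
# All numbers here are invented for illustration.

def simulate(years=30, initial_wealth=100.0, annual_return=0.08, tax_on_gains=0.0):
    """Compound a pool of paper wealth, taxing each year's gains at the given rate."""
    wealth = initial_wealth
    for _ in range(years):
        gains = wealth * annual_return
        wealth += gains * (1 - tax_on_gains)  # only the untaxed portion keeps compounding
    return wealth

if __name__ == "__main__":
    for rate in (0.0, 0.5, 0.9):
        print(f"tax on gains {rate:4.0%}: wealth after 30 years = {simulate(tax_on_gains=rate):6.1f}")
    # Untaxed, the pool grows roughly tenfold; heavily taxed, it barely grows at
    # all -- the runaway loop is damped and the gains go back into circulation.
```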

Still, there are arguably less scattershot ways to drain off excesses from the tertiary economy, and the tax code also meshes very poorly with the primary economy, with its emerging reality of scarce resources and overburdened natural cycles. In a world where the accelerating exploitation of natural resources and the accumulation of paper wealth are major sources of problems, while the human labor at the core of the secondary economy is the most renewable resource we’ve got, we arguably tax the wrong things.

At the risk of veering off entirely into fantasy, then, let’s imagine a new tax code that taxes the right things instead. In this imaginary code there are no sales taxes, and no taxes on income from wages, salaries, dividends, and rents – that is, no taxes on the secondary economy at all. Instead, there are two classes of taxation. The first, affecting the primary economy, is on natural goods and services; the second, affecting the tertiary economy, is on interest income, capital gains, and all other forms of money made by money.

The taxes on natural goods and services follow the same rough line of logic as property taxes do at present. The federal government, as trustee for the American people, already effectively owns all the real estate within its borders – when you buy property, what you’re buying is the right to use that property within the limits of the laws and the national interest, which is why China can’t just contract with private landowners to buy a couple of disused fishing harbors on our west coast as bases for its navy. The same principle could reasonably apply to every other resource in the country. When you pump oil from the ground, you’re depleting part of the patrimony of the American people, and you should have to pay the government for that privilege. When you dump smoke out of a tailpipe, equally, you’re using the nation’s atmosphere as the gaseous equivalent of a landfill, and once again you should have to pay to do that. Every natural resource of every kind would be subject to the same sort of tax.

Now of course this would mean that the prices of many goods and services would go up considerably. Since everyone would have the money they wouldn’t have to spend on income and sales taxes, this would be less of a burden than it sounds; but the crucial point is that people can avoid resource taxes by their personal choices. If you buy a hybrid car, you’re going to pay a lot less in petroleum tax, and a lot less in tailpipe tax as well – though the extraction taxes for the rare earth minerals in the batteries and electronics may set you back a bit, as they should. If you don’t own a car at all, you laugh all the way to the bank. Similarly, the price of a product made from metal mined from the earth includes the extraction tax for the mining, but the price of a product made from recycled material doesn’t; thus the manufacturer has a big incentive to use the recycled material and undercut the competition.
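To make the incentive concrete, here is an equally minimal sketch, again with invented numbers, of how an extraction tax levied on virgin raw material but not on recycled feedstock would tilt a manufacturer’s choice; the base costs and the 40% tax rate are assumptions for illustration only.

```python
# Illustrative sketch with made-up numbers: under the hypothetical tax code
# sketched above, newly mined metal carries an extraction tax, recycled metal
# does not.

def delivered_cost(base_cost, extraction_tax_rate):
    """Cost of a unit of feedstock once any extraction tax is added."""
    return base_cost * (1 + extraction_tax_rate)

virgin_metal = delivered_cost(base_cost=100.0, extraction_tax_rate=0.40)   # newly mined
recycled_metal = delivered_cost(base_cost=110.0, extraction_tax_rate=0.0)  # no extraction, no tax

print(f"virgin feedstock:   {virgin_metal:.0f}")    # 140
print(f"recycled feedstock: {recycled_metal:.0f}")  # 110
# Even if recycled material costs somewhat more to process, the extraction tax
# on the mined alternative makes recycling the cheaper way to build the product.
```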

The second set of new taxes targets a different problem, one discussed already in this post. Right now, with the tax laws we have, it’s to the economic advantage of businesses to pull their money out of producing goods and services, and put it into blowing bubbles in the tertiary economy of paper wealth. That’s why General Motors manufactured more financial paperwork than cars for quite a few years, until it got into a head-on collision with bubble economics. From a broader perspective, that’s why America produces so few goods and services nowadays, and so much in the way of essentially hallucinatory paper wealth. Taxing financial income, but not earned income, does a fair amount to reverse that equation. If putting your money into bonds or derivatives means any profit you make suffers a significant tax bite, while putting your money into producing goods and services means you pocket the profits tax free, those derivatives and bonds will look a lot less inviting.

Notice also how these two kinds of tax work together to take an even larger bite out of one of the most mistaken economic priorities of our time, the replacement of human labor by machines. In case you haven’t noticed, the US has an unemployment problem; even before the most recent bubble burst, good working-class jobs were very hard to come by. There are plenty of reasons why that happened, but tax policy that makes employers pay half again or more of the cost of a worker’s wages in order to hire that worker certainly hasn’t helped. Eliminate those taxes, and place taxes on energy and natural raw materials instead, and in a good many cases a worker instead of a machine becomes the most cost-effective way to do the job.

Other arrangements could easily be devised to accomplish the same ends. The point I want to make with this extended exercise in nonfinancial speculation, though, is that some of the ways Americans do business, and pay (or don’t pay) for government services, simply don’t fit the realities of an age of ecological limits. A tax code that burdens the secondary economy – which is the economy that actually produces goods and services, remember – while encouraging the wasteful plundering of nature and the bubble-blowing antics of the tertiary economy is not going to help us weather the storms of the near future. A tax code, any tax code, that does the opposite – that makes it more profitable to employ human labor to meet human needs, and less profitable to disrupt the natural cycles that undergird our survival or to feed speculative excesses that pump imbalances into an already troubled economy – could be a very helpful asset in a time of crisis, and could be put in place tolerably easily, without having to tear an entire society to pieces and rebuild it from the ground up.

Would such a new tax code solve all our problems? Of course not. To my mind, though, it’s exactly the attitude that insists that we have to find a single solution that deals with all our problems that helps put any constructive response to those problems out of reach. If we’re to face the difficult future ahead of us with any sort of grace, that worthwhile achievement is much more likely to happen as a result of the tentative, uncertain coping mechanisms sketched out in Warren Johnson’s useful little book Muddling Toward Frugality than it is to unfold from some grandiose plan to reach a perfect world in a single leap. Monkeying with the tax code so that it rewards the economic behaviors that might help us get things of value through the approaching troubles, rather than rewarding the economic behaviors that will only make things worse, is one example of this sort of muddling; others will be brought up for discussion in the weeks to come.

Wednesday, December 23, 2009

The Political Ecology of Collapse, Part Three:

The Bomb at the Heart of the System

The outcome of the Copenhagen climate change talks last week could not have been better suited to illustrate the points I have been trying to make in the last two posts. After all the high hopes and overheated rhetoric, as I (and of course a great many other people) predicted some time ago, what remains in place as the dust settles is business as usual.

The United States and China, who head the main power blocs in the negotiations and also generate more CO2 than anyone else, minted a toothless accord that furthers nobody’s interests but theirs, and proceeded to tell the rest of the world to like it or lump it. A few climate activists are still gamely trying to find grounds for hope in the accord; others are shrilly accusing Barack Obama of betraying the messianic expectations they projected onto him; and a certain amount of stunned silence, in response to the failure of climate activism to have the slightest effect on the proceedings, is also being heard.

It’s probably worth pointing out that the results would not have differed noticeably if John McCain had won the 2008 election. The consensus that has been fixed in place since Ronald Reagan’s first term, in other words, still dominates American politics. Despite increasingly desperate efforts on the part of both mainstream parties to appeal to an increasingly disaffected electorate via increasingly overheated rhetoric, it takes a micrometer to measure concrete differences in policy between the parties. Each party has its captive constituencies, to which it makes appropriate noises come election time; Republicans claim they want to ban abortion, Democrats claim they want to protect the environment, but neither party ever gets around to turning any of this talk into action.

The most popular explanation for all this relies on the sheer hypocrisy of politicians, and such a case is not too hard to make, not least because it’s rare for politicians to be any more ethical than the people they represent. Some versions of the case insist that politicians are cynical beasts who are in it purely for the money, and find shilling for various corrupt interests more lucrative than serving the public. Other versions, in the ascendant these days, insist that politicians are puppets of some sinister elite pursuing a totalitarian agenda, and then try to find reasons why every turn of events furthers that agenda.

Now of course it’s tolerably easy to find examples that can be used to support these claims. Some politicians are blatantly corrupt and self-serving; others just as blatantly put the interests of their allies in the business world ahead of the people they are supposed to serve. It furthers many political narratives to portray the situation as an episode of Dudley Do-Right, with some wicked elite or other in the role of Snidely Whiplash, tying the American people to the train tracks, as Dudley Do-Right scoops up an armload of protest signs and position papers and gallops off to the rescue. Still, I’m by no means certain this is really all there is to the matter.

The counterexample that comes to mind is Afghanistan, and specifically Obama’s decision to send another 30,000 troops (and an undisclosed number of “civilian contractors,” the modern military version of disposable temp labor) into that quagmire. To call this decision self-defeating is to understate matters considerably. Afghanistan is where empires go to die; the debacle of the Soviet occupation two decades ago was only the latest in a long and unbroken history of failed attempts to conquer Afghanistan. Not even Alexander the Great managed the trick, and whatever the personal qualities of the airbrushed machine politician in the Oval Office and the camo-clad bureaucrat who manages his war might be, I confess to a reasonable doubt that anybody in the future will call them Obama or McChrystal the Great.

Leave aside moral issues for a moment, and it’s tolerably clear that only two strategies could prevent total US failure in Afghanistan. The first is to reinstate the draft, conscript half a million new soldiers, shift the US economy over to a wartime footing, and go into Afghanistan with the same overwhelming force the Chinese deployed successfully on similar terrain in Tibet. The other is to declare a victory and get out. Any other choice means the United States will keep on spending money it doesn’t have and prestige it can’t spare on a war it isn’t going to win.

I doubt that any of this is invisible to the experienced military planners in the Pentagon, or the politicians who give them their marching orders. Why, then, the futile gesture?

The hard fact of the matter is that neither of the two potentially successful strategies is politically possible to an American government today. Exemption from forced military service was part of the price the American middle class exacted in exchange for their abandonment of the radicalism of the 1960s, and no politician is willing to risk the backlash that would follow an attempt to tamper with that bargain. Furthermore, it’s by no means certain that America has the economic strength left to fight a real war at this point, and it’s not hard to name hostile powers who would be happy to use any such opportunity to push us over the edge into national bankruptcy.

Declaring a victory and getting out is a good deal more viable, and it’s the option that Obama’s successor in 2013 will likely be forced to embrace. Accepting it now, though, would offend many constituencies, not all of which have financial motives for supporting the war, and it would require America to give up on intervening in the Great Game of geopolitics now being played in central Asia – a goal many factions in the American political class are unready to abandon.

Behind the decision to send an inadequate force to prop up a losing struggle, in other words, lies the complex nature of political power in contemporary America. A great many people nowadays seem to think that because they don’t have the power to impose their agendas on the country, someone else must have that power, and the increasingly self-defeating decisions coming out of Washington must result from deliberate policy on the part of that someone else. Comforting as that belief may be, the facts don’t support it. A century of political reforms has diffused power so broadly in American society that no one group has a monopoly on power, and any group of would-be leaders has to build alliances and garner support among a great many independent centers of power with agendas of their own.

Now of course it’s quite true, as the left is fond of pointing out, that a great many of these power centers are interested primarily in pursuing their own interests, and are perfectly willing to do it at the expense of the common good. It’s also true that this indictment can be applied to the left as much as to the right. Still, behind the inevitable chicanery found across the political spectrum lies the insoluble dilemma in which the American political system has been caught since the 1970s – the inevitable failure of government by pork barrel in an age of decline.

Like most of the nations that call themselves representative democracies these days, America operates by means of a system not too different from the one that graced, if that’s the right word, the twilight years of the Roman Republic. The ultimate mandate for power comes from popular vote, and so every possible means is used to make sure elections come out as desired. Vote fraud is one such means; propaganda is another; but the most effective is to buy the loyalty of voting blocs with cold hard cash. From defense spending to entitlements to economic stimulus programs, that’s the name of the game, and it pays off handsomely come election time.

There are, however, at least two massive problems with this sort of pork-fed politics. First, the number of groups to be placated tends to rise as the size of the pork barrel increases. In today’s America, any group that can organize and raise money effectively enough to influence elections can usually elbow its way to a place at the feeding trough. (That today’s radicals of left and right alike are, by and large, inept at organizing and fundraising goes a long way to explain their insistence that power is being kept out of their hands by a malevolent elite.) It’s not hard to respond to a changing world when the interests that have to sign on to policy changes are few and clearly defined, as they were fifty years ago, but it becomes much harder when power is diffused through scores of competing factions, and it takes an alliance of a dozen disparate interest groups to get anything done at all.

This happens in the life of nearly all republics, and it plays an important role in the political breakdowns that afflict them at regular intervals. Still, another factor will be familiar to regular readers of this blog: the mismatch between growth and the limits of the environment that provides the basis for growth. In societies that use resources at a steady rate, those limits are always close at hand, and struggles between interest groups over the distribution of pork are recognized as zero-sum games, in which somebody has to lose for somebody else to gain; thus the multiplication of factions tends to be limited by the fixed size of the feeding trough.

In a society that relies on rapidly expanding production of resources, on the other hand, this can be evaded for a time. The first two-thirds of the 20th century thus saw an explosion of factions that spanned the entire upper half of the American class structure, from the ultrarich to unionized labor. The result was a vast number of people who all expected to get financial benefits from the government. Yet the end of America’s real economic expansion in the 1970s meant that these demands had to be paid out of a dwindling supply of real wealth.

One result has been a drastic narrowing of the options available to politicians. A great many simple and necessary reforms that could be enacted without harm to anyone – for example, putting a means test on Social Security pensions – are completely off the table, because nobody can put together a governing coalition without the support of groups that oppose such measures. Equally, a great many ghastly policies – for example, deliberately inflating financial bubbles – have become political necessities, because they allow governments to get away with the pretense of paying off their supporters. Meanwhile any sector of society not organized enough to defend its interests can basically count on being thrown to the wolves.

The rising spiral of crises that threaten the survival of industrial society might be expected to trump such matters. The problem here, of course, is that prophecies of imminent doomsday have been standard political theater in American public life for more than a century, and most people in politics have long since stopped listening to them. There are plenty of people in politics who still remember, for example, the widespread insistence that the energy crisis of the 1970s was supposed to be permanent; the fact that there were plenty of less shrill predictions that have proven to be much more accurate in retrospect is nothing like as memorable.

Behind all of this lies the central political fact of the limits to growth: the reduction of First World nations to a Third World lifestyle that will be the inevitable result of any transition to a postpetroleum world, whether that transition is deliberate or unplanned. Metaphors about elephants in living rooms don’t begin to touch the political explosiveness of this fact, or the degree to which people at every point on the political spectrum have tried to pretend that it just isn’t so. Still, set aside delusions about miraculous new energy sources that show up basically because we want them to, and it’s impossible to evade.

Let’s walk through the logic. The most reasonable estimates suggest that, given a crash program and the best foreseeable technologies, renewable sources can probably provide the United States with around 15% of the energy it currently gets from fossil fuels. Since every good and service in the economy is the product of energy, it’s a very rough but functional approximation to say that in a green economy, every American will have to get by on the equivalent of 15% of his or her current income. Take a moment to work through the consequences in your own life; if you made $50,000 in 2009, for example, imagine having to live on $7,500 in 2010. That’s quite a respectable income by Third World standards, but it won’t support the kind of lifestyle that the vast majority of Americans, across the political spectrum, believe is theirs by right.
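For anyone who wants to check the arithmetic, here is the calculation in the paragraph above reduced to a few lines of Python; the 15% figure is the rough estimate quoted in the text, and the incomes are simply examples.

```python
# The rough approximation from the text: if renewables supply about 15% of the
# energy now provided by fossil fuels, scale incomes by the same factor.

RENEWABLE_FRACTION = 0.15  # the estimate quoted above; not a precise figure

def green_economy_income(current_income):
    """Very rough equivalent income in an economy running on 15% of current energy."""
    return current_income * RENEWABLE_FRACTION

for income in (50_000, 100_000, 30_000):
    print(f"${income:,} today -> roughly ${green_economy_income(income):,.0f}")
# $50,000 today -> roughly $7,500, the example given in the text.
```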

That’s the bomb ticking away at the heart of America’s political system. When it goes off, the entire system of government by pork barrel will explode messily, and it’s only in the fantasies of reformers that what replaces it will likely be an improvement. (My guess? Anything from a military coup followed, after various convulsions, by a new and less centralized constitution, to civil war and the partition of the United States into half a dozen impoverished and quarreling nations.) In the meantime, we can expect to see every possible short term expedient put to use in an attempt to stave off the explosion even for a little while, and any measure that might risk rocking the boat enough to set off the bomb will be quietly roundfiled by all parties.

A meaningful political response to the growing instability of global climate is one such measure, and a meaningful political response to peak oil is another. No such project can be enacted without redirecting a great deal of money and resources away from current expenditures toward the construction of new infrastructure. The proponents of such measures are quick to insist that this means new jobs will be created, and of course this is true, but they neglect to mention that a great many more existing jobs will go away, and the interests that presently lay claim to the money and resources involved are not exactly eager to relinquish those. A political system of centralized power could overcome their resistance readily enough, but a system in which power is diffused and fragmented cannot do so. That the collapse of the entire system is a likely long-term consequence of this inability is simply one of the common ironies of history.

Wednesday, December 16, 2009

The Political Ecology of Collapse

Part Two: Weishaupt’s Fallacy

Nostalgia’s a funny thing; you never know what’s going to fill the place of Proust’s madeleine and catapult you back to memories of some other time. A little over a year ago, I had a reminder of that while visiting the Upland Hills Ecological Awareness Center in Oakland County, Michigan. The path from the parking lot wandered through a lovely autumn woodland, then turned a corner and deposited me back in 1980.

In those days I was passionately interested in the appropriate technology movement, to the extent of spending the better part of three years working part time on an organic farm, learning the uses of cold frames, a solar greenhouse, compost bins, and double-dug garden beds. Every cliché you can imagine about late-70s communes was present and accounted for: wood smoke and mud, naked bodies in a creaky wood-fired sauna, goats and chickens in the pasture, and a handbuilt wind turbine that went whuppeta-whuppeta and churned out a trickle of twelve-volt current whenever the breeze picked up.

The center at Upland Hills was a good deal cleaner, and the goats and the naked bodies were nowhere to be seen, but the esthetic was much the same. Their wind turbine sounded a silky pup-pup-pup atop an honest-to-Fuller octet truss tower, and the center itself was what all of us at the Outback Farm dreamed of inhabiting someday: a big comfortable earth-bermed shelter with passive solar heating and old-fashioned round photovoltaic cells soaking up the sunlight. When we went inside, I half expected to see a circle of scruffy longhairs sitting on pillows around the latest issue of Coevolution Quarterly, excitedly discussing the latest innovations from Zomeworks and the New Alchemy Institute.

Nostalgia aside, there's a lot to be learned from the rise and fall of appropriate tech in the 1970s, and one of its lessons bears directly on the theme of this series of posts. For many of the people involved in it back then, appropriate tech was the inevitable wave of the future; nearly everyone assumed that energy costs would continue to rise as the limits to growth clamped down with increasing force, making anything but Ecotopia tantamount to suicide. A formidable body of thought backed those conclusions, and the core of that body of thought was systems theory.

Nowadays, the only people who pay attention to systems theory are specialists in a handful of obscure fields, and it can be hard to remember that forty years ago systems theory had the same cachet that more recently gathered around fractals and chaos theory. Born of a fusion between ecology, cybernetics, and a current in contemporary philosophy best displayed in Jan Smuts' Holism and Evolution, systems theory argued that complex systems – all complex systems – shared certain distinctive traits and behaviors, so that insights gained in one field of study could be applied to phenomena in completely different fields that shared a common degree of complexity.

It had its weaknesses, to be sure, but on the whole, systems theory did exactly what theories are supposed to do – it provided a useful toolkit for making sense of part of the universe of human experience, posed plenty of fruitful questions for research, and proved useful in a sizable range of practical applications. As popular theories sometimes do, though, it became associated with a position in the cultural struggles of the time, and as some particularly unfortunate theories do, it got turned into a vehicle for a group of intellectuals who craved power. Once that happened, systems theory became another casualty of Weishaupt's Fallacy.

Those of my readers who don't pay attention to conspiracy theory may not recognize the name of Adam Weishaupt; those who do pay attention to conspiracy theory probably "know" a great deal about him that doesn't happen to be true. He was a professor of law at the University of Ingolstadt in Bavaria in the second half of the eighteenth century, and he found himself in an awkward position between the exciting new intellectual adventures coming out of Paris and the less than exciting intellectual climate in conservative, Catholic Bavaria. In 1776, he and four of his grad students founded a private society for enthusiasts of the new Enlightenment thought; they went through several different names for their club before finally settling on one that stuck: the Bavarian Illuminati.

Yes, those Bavarian Illuminati.

There were a fair number of people interested in avant-garde ideas in and around Bavaria just then, and, before too long, the Illuminati had several hundred members. This gave Weishaupt and his inner circle some grandiose notions about where all this might lead. Pretty soon, they hoped, all the movers and shakers in Bavaria – not to mention the other microkingdoms into which Germany was divided at that time – would join the Illuminati and stuff their heads full of Voltaire and Rousseau, and then the whole country would become, well, illuminated.

They were still telling themselves that when the Bavarian government launched a series of police raids that broke the back of the organization. Weishaupt got out of Bavaria in time, but many of his fellow Illuminati were not so lucky, and a great deal of secret paperwork got scooped up by the police and published in lavish tell-all books that quickly became bestsellers all over Europe. That was the end of the Illuminati, but not of their reputation; reactionaries found that blaming the Illuminati for everything made great copy, not least because they weren't around any more and so could be redefined with impunity – liberal, conservative, Marxist, capitalist, evil space lizards, you name it.

The problem with Professor Weishaupt's fantasy of an illuminated Bavaria was a bit of bad logic that has been faithfully repeated by intellectuals seeking power ever since: the belief, as sincere as it is silly, that if you have the right ideas, you are by definition smarter than the system you are trying to control. That's Weishaupt's Fallacy. Because Weishaupt and his fellow Illuminati were convinced that the conservative forces in Bavaria were a bunch of clueless boors, they were totally unprepared for the counterblow that followed once the Bavarian government figured out who the Illuminati were and what they were after.

For a more recent example, consider the rise and fall of the neoconservative movement, which stormed into power in the United States in 2000 boldly proclaiming the arrival of a "new American century," and proceeded to squander what remained of America's wealth and global reputation in a series of foreign and domestic policy blunders that have set impressive new standards for political fecklessness. The neoconservatives were convinced that they understood the world better than anybody else. That conviction was the single most potent factor behind their failure; when mainstream conservatives (not to mention everybody else!) tried to warn them where their fantasies of remaking the Middle East in America's image would inevitably end, the neoconservatives snorted in derision and marched straight on into the disaster they were making for themselves, and of course for the rest of us as well.

Systems theory was a victim of the same fallacy. The systems movement, to coin a label for the heterogeneous group of thinkers and policy wonks that made systems theory its banner, had ambitions no less audacious than those of the neoconservatives, though aimed in a completely different direction. Their dream was world systems management. Such leading figures in the movement as Jay Forrester of MIT and Aurelio Peccei of the Club of Rome agreed that humanity's impact on the planet had become so great that methods devised for engineering and corporate management – in which, not coincidentally, they were expert – had to be put to work to manage the entire world.

The study that led to the 1972 publication of The Limits to Growth was one product of this movement. Sponsored by Peccei's Club of Rome and carried out by a team led by one of Forrester's former Ph.D. students, it applied systems theory to the task of making sense of the future, and succeeded remarkably well. As Graham Turner's study "A Comparison of The Limits to Growth with Thirty Years of Reality" (CSIRO, 2008) points out, the original study's baseline "Standard Run" scenario matches the observed reality of the last three and a half decades far more exactly than rival scenarios.

It's not often remembered, though, that the Club of Rome followed up The Limits to Growth with a series of further studies, all basically arguing that the problems outlined in the original study could be solved by planetary management on the part of a systems-savvy elite. The same notions can be found in dozens of similar books from the same era – indeed, it's hard to think of a systems thinker with any public presence in the 1970s who didn't publish at least one book proposing some kind of worldwide systems management as the only alternative to a very messy future.

It's only fair to stress the role that idealism and the best intentions played in all this. Still, the political dimensions shouldn't be ignored. Forrester, Peccei, and their many allies were, among other things, suggesting that a great deal of effective power be given to them, or to people who shared their values and goals. Since the systems movement was by no means politically neutral – quite the contrary, it aligned itself forcefully with specific ideological positions in the fractured politics of the decade – that suggestion was bound to evoke a forceful response from the entire range of opposing interests.

The Reagan revolution of 1980 saw the opposition seize the upper hand, and the systems movement was among the big losers. Hardball politics have always played a significant role in public funding of research in America, so it should have come as no surprise when Reagan's appointees all but shut off the flow of government grants into the entire range of initiatives that had gathered around the systems theory approach. From appropriate tech to alternative medicine to systems theory itself, entire disciplines found themselves squeezed out of the government feed trough, while scholars who pursued research that could be used against the systems agenda reaped the benefits of that stance. Clobbered in its most vulnerable spot – the pocketbook – the systems movement collapsed in short order.

What made this implosion all the more ironic is that a systems analysis of the systems movement itself, and its relationship to the wider society, might have provided a useful warning. Very few of the newborn institutions in the systems movement were self-funding; from prestigious think tanks to neighborhood energy-conservation schemes, most of them subsisted on government grants, and thus were in the awkward position of depending on the social structures they hoped to overturn. That those structures could respond homeostatically to oppose their efforts might, one would think, be obvious to people who were used to the strange loops and unintended consequences that pervade complex systems.

Still, Weishaupt's Fallacy placed a massive barrier in the way of such a realization. Read books by many of the would-be global managers of the 1970s and you can very nearly count on being bowled over by the scent of intellectual arrogance. The possibility that the system they hoped to manage might, in effect, have been more clever than they were probably crossed very few minds. Yet that's how things turned out; at the end of the day, the complex system that was American society had reacted, exactly as systems theory would predict, to neutralize a force that threatened to push it out of its preferred state.

Unfortunately that reaction slammed the door on resources that might have made the transition ahead of us less difficult. Set aside the hubris that convinced too many systems theorists that they ought to manage the world, and systems theory itself is an extremely effective toolkit of ideas and practices, and a good many of the things that moved in harmony with systems theory – 1970s appropriate tech being a fine example – are well worth dusting off and putting to use right now. At the same time, though, the process that excluded them needs to be understood, and not just because the same process could repeat itself just as easily with some new set of sustainability initiatives. The homeostatic behavior of complex systems also casts an unexpected light on one of the major conundrums of contemporary life, the location of political power in industrial society – the theme of the final part of this series.

Wednesday, December 9, 2009

The Human Ecology of Collapse

Part One: Failure is the Only Option

The old legend of the Holy Grail has a plot twist that’s oddly relevant to the predicament of industrial civilization. A knight who went searching for the Grail, so the story has it, if he was brave and pure, would sooner or later reach an isolated castle in the midst of the desolate Waste Land. There the Grail could be found and the Waste Land made green again, but only if the knight asked the right question. Failing that, he would wake the next morning in a deserted castle, which would vanish behind him as soon as he left, and it might take years of searching to find the castle again.

As we approach the twilight of the age of cheap energy, we’re arguably in a similar situation. It seems to me that a great deal of the confusion that grips the peak oil scene, and even more of the blind commitment to catastrophically misguided policies that reigns outside peak-aware circles, comes from a failure to ask the right questions. A great many people aware of the limits to fossil fuels, for example, have assumed that the question that needs answering is how to sustain a modern industrial society on alternative energy.

Ask that, though, and you’re back in the Waste Land, because any answer you give to that question is wrong. The question that has to be asked is whether a modern industrial society can exist at all without vast and rising inputs of essentially free energy, of the sort only available on this planet from fossil fuels, and the answer is no. Once that’s grasped, other useful questions come to mind – for example, how much of the useful legacy of the last three centuries can be saved, and how – but until you get past the wrong question, you’re stuck chasing the mirage of a replacement for oil that didn’t take a hundred million years or so to come into being.

Other examples could be cited easily enough. As the world’s political leaders busy themselves in Copenhagen for a round of photo ops and brutal backroom politics, though, the unasked question that hangs most visibly in the air is why human societies, faced with choices between survival and collapse, so consistently make the choices that destroy them.

It’s implicit in most discussions of peak oil, climate change, and nearly any other global issue you care to name, that if we all just try hard enough we can overcome the crisis du jour and chug boldly on into the future. Those in the political mainstream tend to insist, in the face of the evidence, that replacing the people currently in charge of political or economic systems with somebody else will solve the problem. Those outside the political mainstream tend to insist, also in the face of the evidence, that swapping out current political or economic systems with others more to their liking will solve the problem.

Nearly all the media coverage of the Copenhagen circus, mainstream or alternative, falls into these camps. While the mainstream right pounds its collective fist on an assortment of lecterns and insists that the polar bears would be just fine if the last round of elections had gone the other way, the mainstream left fills the air with pleas that Obama live up to the nearly messianic fantasy role they projected onto him – will somebody please explain to me someday how a head of state got given the Nobel Peace Prize while he was enthusiastically waging two wars? Meanwhile the socialists are insisting that it’s all capitalism’s fault and can be solved promptly by a socialist revolution, never mind the awkward little fact that the environmental records of socialist countries are by and large even worse than those of capitalist ones; other radicalisms of left and right make the same claim as the socialists, often with even less justification.

Still, I think a great many people are beginning to realize that whatever results come out of Copenhagen, a meaningful response to the increasing instability of global climate will not be among them. James Hansen, among the most prestigious of global warming scientists, has announced to the media that he hopes the Copenhagen talks fail, because none of the options being taken to the talks would have any useful result; we’d be better off, he argues, to start over again from scratch. He’s right about the first point, it seems to me, and wrong about the second, because if we start again from scratch, care to guess where we’ll end up? Right back where we are now, face to face with the yawning gap between those things that are politically possible and those things that would actually deal with the crisis at hand.

Those people who are not in positions of power, and thus don’t have to face the consequences of political decisions, commonly insist that politicians can or should simply leap across chasms of this sort to deliver the goods to their constituents. Copenhagen offers a useful lesson on why such rhetoric is wasted breath. Suppose, for the sake of discussion, that Obama agreed to cut US carbon emissions far enough to make a real impact on global climate change. Would those cuts happen? No, because Congress would have to agree to implement them, and Congress – even though it is controlled by a Democratic majority – has so far been unable to pass even the most ineffectual legislation on the subject.

Suppose the improbable happened, and both Obama and Congress agreed to implement serious carbon emission cuts. What would the result be? Much more likely than not, a decisive Republican victory in the 2010 congressional elections, followed by the repeal of the laws mandating the cuts. Carbon emissions can’t be cut by waving a magic wand; the cuts will cost trillions of dollars at a time when budgets are already strained, and impose steep additional costs throughout the economy. The latter costs would be unpopular even if the consensus on climate change were accepted on faith outside the scientific community, which it isn’t. Even those Congresspersons who would most like to see carbon emissions cuts made into law do think about their prospects of remaining in office now and again.

Even between nations, a rough and ready version of the same pattern of checks and balances applies; any nation that accepts serious carbon emission cuts will place itself at a steep economic disadvantage compared to those nations that don’t. Watch the way the competing power blocs at Copenhagen are trying to shove responsibility for emissions cuts onto one another, and you can see this at work. Remember the failed trade negotiations of the last decade, in which Europe and the US tried to browbeat the rising industrial powers of the Third World into accepting a permanent second-class position? They’re at it again, using carbon emission allotments in place of trade treaties, and the Third World is once again having none of it.

Notice that what’s happening in all these cases is that the system is working the way it’s supposed to work. Elected representatives, after all, are supposed to worry about what their constituents back home will think; the excesses of each party are supposed to be held in check by the well-founded worry that the other party can and will make political hay out of any missteps the party in power might happen to make. For that matter, national governments justify their existence by defending the interests of their citizens in international disputes. In most cases, these checks and balances are not only useful but vitally important; unchecked power in any aspect of human life pretty consistently turns into tyranny. In certain cases, though, these otherwise helpful protections turn into barriers that keep necessary decisions from being made.

The belief that none of this matters, and that somebody or other could fix the problem if they wanted to, runs deep. This is why so much of the rhetoric on both sides of the climate debate focuses so obsessively on finding somebody to blame. Of course there has been reprehensible behavior on both sides. Business executives whose companies will bear a large share of the costs of curbing carbon emissions have funded some very dubious science, and some even more dubious publicity campaigns, in order to duck those costs; academics have either tailored their findings to climb onto the climate change bandwagon, or whored themselves out to corporate interests willing to pay handsomely for anyone in a lab coat who will repeat their party line; politicians on both sides of the aisle have distorted facts grotesquely to further their own careers.

All this has been fodder for endless denunciation. Beneath all the yelling, though, are a set of brutal facts nobody is willing to address. Whether or not the current round of climate instability is entirely the product of anthropogenic CO2 emissions is actually not that important, because it’s even more stupid to dump greenhouse gases into a naturally unstable climate system than it would be to dump them into a stable one. Over the long run, the only level of carbon pollution that is actually sustainable is zero net emissions, and getting there any time soon would require something not far from the dismantling of industrial society and its replacement with something much less affluent. Now of course we would have to do this anyway, since the world’s fossil fuel supplies are depleting fast enough that production limits will begin to bite hard in the years and decades ahead, but this simply sharpens the point at issue.

Even if it turns out to be possible to power something like an industrial society on renewable resources, the huge energy, labor, and materials costs needed to develop renewable energy and replace most of the infrastructure of today’s society with new systems geared to new energy sources will have to be paid out of existing supplies; thus everything else would have to be cut to the bone, or beyond. Exactly how big the price tag would be is anybody’s guess just now, but it’s probably not far from the mark to suggest that the population of the industrial world would have to accept a Third World standard of living, and the population of the Third World would have to give up aspirations for a better life for the foreseeable future, for such a gargantuan project to have any chance of working.

I encourage those who think this latter is a politically viable option to try to convince their spouses and friends to take such steps voluntarily. Any politician rash enough to propose such a project would be well advised to kiss his or her next election goodbye. Any president who even took a step in that direction – well, I doubt many people have forgotten what happened to Jimmy Carter. For that matter, I’m sure there must be climate change zealots who have given up their McMansions, sold their cars, and now live in one-room apartments in rat-infested tenements with six other activists so all their spare money can go to building a renewable economy, but I don’t happen to know any who have done so, while I long ago lost track of the number of global warming bumper stickers I’ve seen on the rear ends of SUVs.

Nobody, but nobody, is willing to deal with the harsh reality of what a carbon-neutral society would have to be like. This is what makes the blame game so popular, and it also provides the impetus behind meaningless gestures of the sort that are on the table at Copenhagen. It’s a common piece of rhetoric these days to say that “failure is not an option,” but this sort of feckless thoughtstopper misses the point as totally as any human utterance possibly could. Failure is always an option; when trying to prevent it will lead to highly unpleasant personal consequences, without actually having the least chance of preventing it, a strong case can be made that the most viable option for anyone in a leadership position is to enjoy the party while it lasts, and hope you can duck the blame when it all comes crashing down.

Those who have their doubts about anthropogenic climate change can apply the identical logic to the industrial world’s sustained nonresponse to the peaking of world oil production, or to any of half a dozen other global crises that result from the collision between an economy geared to infinite growth and the relentless limits of a finite planet. In each case, the immediate costs of doing something about the issue are so high, and so unendurable, that very few people in positions of influence are willing to stick their necks out, and those who do so can count on being shortened by a head by others who are more than willing to cash in on their folly.

There’s another way to understand the paradox that makes failure the only viable option, but it will involve a glance backwards over the history of the sustainability movement and the theoretical structure – systems theory – that once undergirded it. That glance, and its implications, will occupy the second part of this series.

Wednesday, December 2, 2009

Hagbard's Law

The dubious statistical measures that were the theme of last week’s Archdruid Report post have had a massive impact on the even more dubious decisions that have backed the United States, and the industrial world more broadly, into its present predicament. When choices are guided by numbers, and the numbers are all going the right way, it takes a degree of insight unusual in contemporary life to remember that the numbers may not reflect what is actually going on in the real world.

You might think that this wouldn’t be the case if the people making the decisions know that the numbers are being fiddled with to make them more politically palatable, as economic statistics in the United States and elsewhere generally are. Still, it’s important to remember that we’ve gone a long way past the simplistic tampering with data practiced in, say, the Lyndon Johnson administration. With characteristic Texan straightforwardness, Johnson didn’t leave statistics to chance; he was famous for sending any unwelcome number back to the bureau that produced it, as many times as necessary, until he got a figure he liked.

Nowadays nothing so crude is involved. The president – any president, of any party, or for that matter of any nation – simply expresses a hope that next quarter’s numbers will improve; the head of the bureau in question takes that instruction back to the office; it goes down the bureaucratic food chain, and some anonymous staffer figures out a plausible reason why the way of calculating the numbers should be changed; the new formula is approved by the bureau’s tame academics, rubberstamped by the appropriate officials, and goes into effect in time to boost the next quarter’s numbers. It’s all very professional and aboveboard, and the only sign that anything untoward is involved is that for the last thirty years, every new formulation of official economic statistics has made the numbers look rosier than the one it replaced.

It’s entirely possible, for that matter, that a good many of those changes took place without any overt pressure from the top at all. Hagbard’s Law is a massive factor in modern societies. Coined by Robert Shea and Robert Anton Wilson in their tremendous satire Illuminatus!, Hagbard’s Law states that information can only be communicated between equals, since in a hierarchy, those in inferior positions face very strong incentives to tell their superiors what the superiors want to hear rather than ‘fessing up to the truth. The more levels of hierarchy between the people who gather information and the ones who make decisions, the more communication tends to be blocked by Hagbard’s Law; in today’s governments and corporations, the disconnect between the reality visible on the ground and the numbers viewed from the top of the pyramid is as often as not total.
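Hagbard’s Law lends itself to the same sort of back-of-the-envelope illustration. Here is a minimal sketch, with invented numbers, of what happens when each layer of a hierarchy shades the figure it passes upward partway toward what the boss wants to hear; the 30% shading factor and the growth figures are assumptions chosen only to show the compounding effect.

```python
# Toy model of Hagbard's Law: each layer of hierarchy nudges the reported
# figure a fixed fraction of the way toward the number the boss hopes to see.
# All parameters are invented for illustration.

def reported_figure(ground_truth, desired, layers, shading=0.3):
    """What reaches the top after `layers` subordinates each soften the news."""
    figure = ground_truth
    for _ in range(layers):
        figure += (desired - figure) * shading
    return figure

actual_growth, hoped_for = -2.0, 3.0  # percent; arbitrary example values
for n in (1, 3, 6, 10):
    print(f"{n:2d} layers of hierarchy -> reported growth "
          f"{reported_figure(actual_growth, hoped_for, n):+.1f}%")
# One layer still delivers bad news; after ten layers the figure has crept up
# to nearly the hoped-for number, and the decision-makers never see the problem.
```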

Many of my readers will be aware that two examples of this sort of figure-juggling surfaced in the last couple of weeks. From somewhere in the bowels of the International Energy Agency (IEA), a bureaucracy created and funded by the world’s industrial nations to provide statistics on energy use, two whistleblowers announced that the 2009 figures that were about to be released had been jiggered, as past figures had been, under pressure from the US government. The reason for the pressure, according to the whistleblowers, was that accurate figures would be bad for the US economy – as indeed they would be, for much the same reason that a diagnosis of terminal illness is bad for one’s prospects of buying life insurance.

Of course news stories about the leaks brought a flurry of denials from the IEA. Doubtless some people were fooled; still, the gaping chasm between the IEA’s rosy predictions of future oil production and the evidence assembled by independent researchers has been a subject of discussion in peak oil circles for some years now, and it was useful to have insiders confirm the presence of fudge factors outside analysts have long since teased out of the data.

The second and much more controversial example came to light when persons unknown dumped onto the internet a very large collection of private emails from a British academic center studying global warming. Like everything else involved with global warming, the contents of the emails became the focus of a raging debate between opposed armies of true believers, but the emails do suggest that a certain amount of data-fudging and scientific misconduct is going on in the large and lucrative scientific industry surrounding climate change.

This sort of thing is all too common in contemporary science. In many fields, ambitious young scientists far outnumber the available grants and tenured positions at universities, and the temptation to misconduct for the sake of professional success is strong. Though overt fakery still risks punishment, less blatant forms of scientific fraud pay off handsomely in papers published, grants awarded, and careers advanced. Since science is expected to police itself, scientific fraud gets the same treatment as, say, sexual abuse among the clergy or malpractice among physicians: except in the most blatant cases, punishing the guilty takes a back seat to getting along with one’s peers and preserving the reputation of one’s institution and profession.

The result is a great deal of faux science that manipulates experimental designs and statistical analyses to support points of view that happen to be fashionable, either within a scientific field or in the broader society. I saw easily half a dozen examples of this sort of thing in action back in my college days, which spanned all of five years and two universities. Still, you don’t need a ringside seat to watch the action: simply pay attention to how often the results of studies just happen to support the interests of whoever provided the funding for them. You don’t need to apply a chi-square test here to watch Hagbard’s Law in action.

There’s good reason to think that the feedback loop by which popular attitudes generate their own supporting evidence via dubious science has distorted the global warming debate. The fingerprints show up all over the weird disconnect between current global warming science and the findings of paleoclimatology, which show that sudden, drastic climate changes have been routine events in Earth’s long history; that at the peak of the current interglacial period, only six thousand years ago, the Earth was actually warmer than the temperatures predicted by current doomsday scenarios; and that the Earth has been a hothouse jungle planet without ice caps or glaciers for around 80% of the time since multicellular life evolved here. Technically speaking, we’re still in an ice age – the current interglacial is on schedule to end in the next few thousand years, giving way to a new glaciation for a hundred thousand years or so, with several million years of further cycles still in the pipeline – and claims that setting the planetary thermostat a little closer to its normal range will terminate life on Earth are thus at least open to question.

What interests me most about the current global warming debate is that these facts, when they get any air time at all, commonly get treated as ammunition for the denialist side of the debate. This hardly follows. Paleoclimatology shows that the Earth’s climate is unstable, and prone to drastic shifts that can place massive strains on local and regional ecosystems. It’s equally clear that number juggling in a British laboratory does not change the fact that the Arctic ice sheet is breaking up, say, or that a great many parts of the world are seeing their climates warp out of all recognition. Even if natural forces are driving these shifts, this is hardly a good time to dump vast quantities of greenhouse gases into an already unstable atmosphere – you could as well claim that because a forest fire was started by lightning, dumping planeloads of gasoline around its edges can’t possibly cause any harm.

The problem with the global warming debate just now is that tolerably well funded groups on both sides are using dubious science to advance their own agendas and push the debate further toward the extremes. The common habit of thinking in rigid binaries comes into play here; it’s easy enough for global warming believers to insist that anyone who questions their claims must be a global warming denier, while their opponents do the same thing in reverse, and the tumult and the shouting helps bury the idea that the territory between the two polarized extremes might be worth exploring. As a result, moderate views are being squeezed out, as the radicals on one side try to stampede the public toward grandiose schemes of very questionable effect, while the radicals on the other try to stampede the public toward doing nothing at all.

It’s instructive to compare the resulting brouhaha to the parallel, if much less heavily publicized, debate over peak oil. The peak oil scene has certainly seen its share of overblown apocalyptic claims, and it certainly has its own breed of deniers, who insist that the free market, the march of progress, or some other conveniently unquantifiable factor will make infinite material expansion on a finite planet less of an oxymoron than all logic and evidence suggests it will be. Still, most of the action in the peak oil scene nowadays is happening in the wide spectrum between these two extremes. We’ve got ecogeeks pushing alternative energy, Transition Towners building local communities, “preppers” learning survival skills, and more; even if most of these ventures miss their mark, as doubtless most of them will, the chance of finding useful strategies for a difficult future goes up with each alternative explored.

The difference between the two debates extends to the interface between statistics and power discussed earlier in this post. Both sides of the global warming debate, it’s fair to say, have fairly robust political and financial payoffs in view. The established industrial powers of the West and the rising industrial nations elsewhere are each trying to use global warming to impose competitive disadvantages on the other; fossil fuel companies are scrambling to shore up their economic position, while the rapidly expanding renewables industry is trying to elbow its way to the government feed trough; political parties are lining up to turn one side or the other into a captive constituency that can be milked for votes and donations, and so forth.

This hasn’t happened with peak oil. Now it’s true, of course, that if the world’s petroleum reserves have already peaked and other fossil fuels are shortly to follow, the game is over and nobody wins. The end of the Age of Abundance promises to tip the world’s industrial economies into permanent contraction, leave political parties without the resources needed to buy support from increasingly needy constituencies, curtail the global military reach of industrial nations, and foreclose most of the options for the future on which industrial society relies. Still, a modest round of global warming – something, let’s say, on the order of the temperature spike at the end of the last glaciation 11,000 years ago, which jolted global temperatures up 13 to 15°F in less than a decade, melted continental glaciers as large as the world’s present ice caps, and sent sea level up 300 feet or so, drowning millions of square miles of formerly dry land – would have equally challenging impacts on industrial society.

Why should that possibility become a political football, while peak oil remains all but unmentionable in polite company, and has been embraced so far only by political groups on the outermost fringe? It’s a fascinating question, for which I don’t have a simple answer. The evidence supporting peak oil is at least as strong as the evidence backing claims that global warming will rise to catastrophic levels, and the consequences for industrial society are pretty much the same. The conspiracy-minded can doubtless come up with some reason or other why the currently fashionable incarnations of “Them” would favor one rather than the other.

Still, I find myself wondering if Hagbard’s Law plays a much bigger role here than any deliberate plan. The global warming story, if you boil it down to its bones, is the kind of story our culture loves to tell – a narrative about human power. Look at us, it says, we’re so mighty we can destroy the world! The peak oil story, by contrast, is the kind of story we don’t like – a story about natural limits that apply, yes, even to us. From the standpoint of peak oil, our self-anointed status as evolution’s fair-haired child starts looking like the delusion it arguably is, and it becomes hard to avoid the thought that we may have to settle for the rather less flattering role of just another species that overshot the carrying capacity of its environment and experienced the usual consequences.

It’s hard to think of a less popular claim to make these days. Similar logic may be behind the way that both climate change believers and deniers shy away from the paleoclimatic data that shows just how lively Earth’s climate has been in prehistoric times. A species that’s desperately trying to maintain a fingernail grip on an inflated self-image has enough trouble dealing with the fact that an ordinary thunderstorm releases as much thermal energy as a strategic nuclear warhead; expecting it to grapple with the really spectacular fireworks Gaia likes to put on when she jumps from one climatic state to another is probably too much to ask.

All this, as suggested above, has potent effects on what we can collectively accomplish in the twilight of the Age of Abundance, and those effects show themselves with particular force in the political arena. Hang onto your hats, dear readers, because next week we start talking about the political economy of peak oil.

Wednesday, November 25, 2009

Lies and Statistics

Plenty of difficulties stand in the way of making sense of the economic realities we face at the end of the age of cheap abundant energy. Some of those difficulties are inevitable, to be sure. Our methods of producing goods and services are orders of magnitude more complex than those of previous civilizations, for example, and our economy relies on treating borrowing as wealth to an extent no other society has been harebrained enough to try before; these and other differences make the task of tracing the economic dimensions of the long road of decline and fall ahead of us unavoidably more difficult than they otherwise would be.

Still, there are other sources of difficulty that are entirely voluntary, and I want to talk about some of those self-inflicted blind spots just now. An economy is a system for exchanging goods and services, with all the irreducible variability that this involves. How many potatoes are equal in value to one haircut, for example, depends a good deal on the fact that no two potatoes and no two haircuts are exactly the same, and no two people can be counted on to place quite the same value on either one. Economics, however, is mostly about numbers that measure, in abstract terms, the exchange of potatoes and haircuts (and everything else, of course).

Economists rely implicitly on the claim that those numbers have some meaningful relationship with what’s actually going on when potato farmers get their hair cut and hairdressers order potato salad for lunch. As with any abstraction, a lot gets lost in the process, and sometimes what gets left out proves to be important enough to render the abstraction hopelessly misleading. That risk is hardwired into any process of mathematical modeling, of course, but there are at least two factors that can make it much worse.

The first, of course, is that the numbers can be deliberately juggled to support some agenda that has nothing to do with accurate portrayal of the underlying reality. The second, subtler and even more misleading, is that the presuppositions underlying the model can shape the choice of what’s measured in ways that suppress what’s actually going on in the underlying reality. Combine these two and what you get might best be described as speculative fiction mislabeled as useful data – and the combination of these two is exactly what has happened to the statistics on which too many contemporary economic and political decisions are based.

I suspect most people are aware by now that there’s something seriously askew with the economic statistics cited by government officials and media pundits. Recent rhetoric about “green shoots of recovery” is a case in point. In recent months, I’ve checked in with friends across the US, and nobody seems to be seeing even the slightest suggestion of an upturn in their own businesses or regions. Quite the contrary; all the anecdotal evidence suggests that the Great Recession is tightening its grip on the country as autumn closes in.

There’s a reason for the gap between these reports and the statistics. For decades now, the US government has systematically tinkered with economic figures to make unemployment look lower, inflation milder, and the country more prosperous. The tinkerings in question are perhaps the most enthusiastically bipartisan program in recent memory, encouraged by administrations and congresspeople from both sides of the aisle, and for good reason; life is easier for politicians of every stripe if they can claim to have made the economy work better. As Bertram Gross predicted back in the 1970s, economic indicators have been turned into “economic vindicators” that subordinate information content to public relations gimmickry. These manipulations haven’t been particularly secret, either; visit www.shadowstats.com and you can get the details, along with a nice set of statistics calculated the way the same numbers were done before massaging the figures turned into cosmetic surgery on a scale that would have made the late Michael Jackson gulp in disbelief.

These dubious habits have been duly pilloried in the blogosphere. Still, I'm not at all sure they are as misleading as the second set of distortions I want to discuss. When unemployment figures hold steady or sink modestly, but you and everyone you know are out of a job, it's at least obvious that something has gone haywire. Far more subtle, because less noticeable, are the biases that creep in because people are watching the wrong set of numbers entirely.

Consider the fuss made in economic circles about productivity. When productivity goes up, politicians and executives preen themselves; when it goes down, or even when it doesn't increase as fast as current theory says it ought, the cry goes up for more government largesse to get it rising again. Everyone wants the economy to be more productive, right? The devil, though, has his usual residence among the details, because the statistic used to measure productivity doesn't actually measure how productive the economy is.

Check out A Concise Guide to Macroeconomics by Harvard Business School professor David A. Moss: "The word [productivity] is commonly used as a shorthand for labor productivity, defined as output per worker hour (or, in some cases, as output per worker)." Output, here as always, is measured in dollars – usually, though not always, corrected for inflation – so what "productivity" means in practice is dollars of income per worker hour. Are there ways for a business to cut down on the employee hours per dollar of income without actually becoming more productive in any more meaningful sense? Of course, and most of them have been aggressively pursued in the hope of parading the magic number of a productivity increase before stockholders and the public.

Perhaps the simplest way to increase productivity along these lines is to change over from products that require high inputs of labor per dollar of value to those that require less. As a very rough generalization, manufacturing goods requires more labor input overall than providing services, and the biggest payoff per worker hour of all is in financial services – how much labor does it take, for example, to produce a credit swap with a theoretical value of ten million dollars? An economy that produces more credit swaps and fewer potatoes is in almost any real sense less productive, since the only value credit swaps have is that they can, under certain arbitrary conditions, be converted into funds that can buy concrete goods and services, such as potatoes; by the standards of productivity universal in the industrial world these days, however, replacing potato farmers with whatever you call the people who manufacture credit swaps (other than "bunco artists," that is) counts as an increase in productivity. I suspect this is one reason why the US auto industry got so heavily into finance in the run-up to the recent crash; GMAC's soaring productivity, measured in terms of criminally negligent loans per broker hour, probably did a lot to mask the anemic productivity gains available from the old-fashioned business of making cars.
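To see how the arithmetic works, here is a minimal sketch in Python; every figure and sector in it is invented purely for illustration, not drawn from any real data set. It shows how shifting output from manufacturing toward financial services raises the measured dollars-per-hour number even while fewer physical goods get made.

# Hypothetical illustration: measured "productivity" is just dollars of output
# per worker hour, so changing the product mix can raise it without producing
# more real goods. All numbers are invented for the example.

def labor_productivity(sectors):
    """Return output dollars per worker hour, summed across all sectors."""
    total_output = sum(s["output_dollars"] for s in sectors)
    total_hours = sum(s["worker_hours"] for s in sectors)
    return total_output / total_hours

economy_before = [
    {"name": "potato farming", "output_dollars": 50_000_000, "worker_hours": 2_000_000},
    {"name": "manufacturing", "output_dollars": 200_000_000, "worker_hours": 4_000_000},
    {"name": "finance", "output_dollars": 100_000_000, "worker_hours": 200_000},
]

# The same economy after labor shifts out of manufacturing and into finance:
# fewer physical goods, more paper, and a higher measured "productivity".
economy_after = [
    {"name": "potato farming", "output_dollars": 50_000_000, "worker_hours": 2_000_000},
    {"name": "manufacturing", "output_dollars": 120_000_000, "worker_hours": 2_400_000},
    {"name": "finance", "output_dollars": 300_000_000, "worker_hours": 400_000},
]

print(f"before: ${labor_productivity(economy_before):.2f} per worker hour")
print(f"after:  ${labor_productivity(economy_after):.2f} per worker hour")

Run the sketch and the "after" economy scores nearly twice the productivity of the "before" economy, even though its manufacturing output has dropped by almost half.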

As important as the misinformation generated by such arbitrary statistical constructs is the void that results because other, arguably more important figures are not being collected at all. In an age that will increasingly be constrained by energy limits, for example, a more useful measure of productivity might be energy productivity – that is, output per barrel of oil equivalent (BOE) of energy consumed. An economy that produces more value with less energy input is arguably an economy better suited to the downslope of Hubbert's peak, and the relative position of different nations, to say nothing of the trendline of their energy productivity over time, would provide useful information to governments, investors, and the general public alike. For all I know, somebody already calculates this figure, but I'm still waiting to see a politician or an executive crowing over the fact that the country now produces 2% more output per unit of energy.
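The calculation itself is trivial; the hard part would be collecting honest input figures. A minimal sketch of how such a series might be tracked, with every number invented for the purpose of illustration:

# Hypothetical sketch of an energy-productivity series: output dollars per
# barrel of oil equivalent (BOE) of energy consumed. All figures are invented.

yearly_data = {
    # year: (real output in dollars, energy consumed in BOE)
    2005: (13.0e12, 16.5e9),
    2006: (13.3e12, 16.8e9),
    2007: (13.5e12, 17.0e9),
    2008: (13.2e12, 16.6e9),
}

def energy_productivity(output_dollars, energy_boe):
    """Dollars of output per barrel of oil equivalent consumed."""
    return output_dollars / energy_boe

previous = None
for year, (output, energy) in sorted(yearly_data.items()):
    ep = energy_productivity(output, energy)
    change = "" if previous is None else f"  ({100 * (ep / previous - 1):+.1f}% vs prior year)"
    print(f"{year}: ${ep:,.0f} per BOE{change}")
    previous = ep

The year-over-year change in that figure, rather than its absolute level, is what would tell a government or an investor whether an economy is getting better or worse at wringing value out of each unit of energy.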

Now it's true that a simplistic measurement of energy productivity would still make the production of credit swaps look like a better deal. This is one of the many places where the distinction already made in these essays between primary, secondary, and tertiary economies becomes crucial. To recap, the primary economy is nature itself, or specifically the natural processes that provide the human economy with about 3/4 of its total value; the secondary economy is the application of human labor to the production of goods and services; and the tertiary economy is the exchange of abstract units of value, such as money and credit, which serve to regulate the distribution of the goods and services produced by the secondary economy.

The economic statistics used today ignore the primary economy completely, measure the secondary economy purely in terms of the tertiary – calculating production in dollars, say, rather than potatoes and haircuts – and focus obsessively on the tertiary. This fixation means that if an economic policy boosts the tertiary economy, it looks like a good thing, even if that policy does actual harm to the secondary or the primary economies, as it very often does these days. Thus the choice of statistics to track isn't a neutral factor, or a simple one; if it echoes inaccurate presuppositions – for example, the fantasy that the human economy is independent of nature – it can feed those presuppositions right back in as a distorting factor into every economic decision we make.

How might this be corrected? One useful option, it seems to me, is to divide up several of the most important economic statistics into primary, secondary, and tertiary factors. (Of course the first step is to get honest numbers in the first place; governments aren't going to do this any time soon, for obvious reasons, but there's no reason why people and organizations outside of government can't make a start.) Consider, as a good example, what might be done with the gross domestic product.

To start with, it's probably a good idea to consider going back to the gross national product; this was quietly dropped in favor of the current measure some years back, because it puts a politically uncomfortable spotlight on America's economic dependence on the rest of the world. Whichever way that decision goes, the statisticians of some imaginary Bureau of Honest Figures might sort things out something like this:

The gross primary product or GPP might be the value of all unprocessed natural products at the moment they enter the economy – oil as it reaches the wellhead, coal as it leaves the mine, grain as it tumbles into the silo, and so on – minus all the costs incurred in drilling, mining, growing, and so on. (Those belong to the secondary economy.)

The gross secondary product or GSP might be the value of all goods and services in the economy, except for raw materials from nature and financial goods and services.

The gross tertiary product or GTP might be the value of all financial goods and services, and all money or money equivalents, produced by the economy.

The value of having these three separate numbers, instead of one gross domestic (or national) product, is that they can be compared to one another, and their shifts relative to one another can be tracked. If the GTP balloons and the other two products stay flat or decline, for example, that doesn't mean the country is getting wealthier; it means that the tertiary economy is inflating, and needs to have some air let out of it before it pops. If the GSP increases while the GPP stays flat, and the cost of extracting natural resources isn't soaring out of sight, then the economy is becoming more efficient at using natural resources, in which case the politicians and executives have good reason to preen themselves in public. Other relative shifts have other messages.
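As a rough illustration of how such comparisons might be automated, here is a minimal sketch of the imaginary Bureau of Honest Figures' annual report; the numbers, the thresholds, and the wording of the diagnoses are all hypothetical, chosen only to mirror the relative shifts just described.

# Hypothetical sketch: track gross primary, secondary, and tertiary product
# separately and flag the relative shifts discussed above. All figures and
# thresholds are invented for the example.

from dataclasses import dataclass

@dataclass
class NationalAccounts:
    gpp: float  # gross primary product: unprocessed natural products, net of extraction costs
    gsp: float  # gross secondary product: goods and services, excluding raw materials and finance
    gtp: float  # gross tertiary product: financial goods and services, money, and money equivalents

def diagnose(previous, current):
    """Compare two years of accounts and report notable relative shifts."""
    def growth(now, then):
        return (now - then) / then
    notes = []
    if growth(current.gtp, previous.gtp) > 0.05 and growth(current.gsp, previous.gsp) < 0.01:
        notes.append("tertiary economy inflating relative to real production: let some air out of it")
    if growth(current.gsp, previous.gsp) > 0.02 and abs(growth(current.gpp, previous.gpp)) < 0.01:
        notes.append("more value from flat resource inputs: efficiency genuinely improving")
    return notes or ["no notable relative shift"]

last_year = NationalAccounts(gpp=1.5e12, gsp=9.0e12, gtp=4.0e12)
this_year = NationalAccounts(gpp=1.5e12, gsp=9.05e12, gtp=4.6e12)

for note in diagnose(last_year, this_year):
    print(note)

In the invented example, the tertiary product balloons while the other two stay flat, so the sketch prints the first warning rather than the congratulatory second one.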

The point that has to be grasped, in this as in so many other contexts, is that the three economies, and the three kinds of wealth they produce, are not interchangeable. Trillions of dollars in credit swaps and derivatives will not keep people from starving in the streets if there's no food being grown and no housing being built, or maintained, or offered for sale or rent. The primary economy is fundamental to survival; the secondary economy is the source of real wealth; the tertiary economy is simply a way of measuring wealth and managing its distribution; and treating these three very different things as though they are one and the same makes rank economic folly almost impossible to avoid.

Now it deserves to be said that the chance that any such statistical scheme will be adopted in the United States under current political and social arrangements is effectively nonexistent. Far too many sacred cows would have to be put out to pasture or rounded up for slaughter first. Still, current political and social arrangements may turn out to be a good deal less permanent than they sometimes seem. What might replace them, here and elsewhere, is a topic I plan on exploring in a future essay here.

Wednesday, November 18, 2009

How Relocalization Worked

One of the points that I’ve tried to make repeatedly in these essays is the place of history as a guide to what works. It’s a point that deserves repetition. A good many worldsaving plans now in circulation, however new the rhetoric that surrounds them, simply rehash proposals that were tried in the past and failed repeatedly; trying them yet again may thus not be the best use of our limited resources and time.

Of course there’s another side to history that’s more hopeful: something that worked well in the past can be a useful guide to what might work well in the future. I’d like to spend a little time discussing one example of this, partly because it ties into the theme of the current series of posts – the abject failure of current economic notions, and the options for replacing them with ideas that actually make sense – and partly because it addresses one of the more popular topics in the ongoing peak oil discussion, the need for economic relocalization as the age of cheap abundant energy comes to an end.

That relocalization needs to happen, and will happen, is clear. Among other things, it’s clear from history; when complex societies overshoot their resource bases and decline, one of the things that consistently happens is that centralized economic arrangements fall apart, long distance trade declines sharply, and the vast majority of what we now call consumer goods get made at home, or very close to home. Now of course that violates some of the conventional wisdom that governs economic decisions these days; centralized economic arrangements are thought to yield economies of scale that make them more profitable by definition than decentralized local arrangements.

When history conflicts with theory, though, it’s not history that’s wrong, so a second look at the conventional wisdom is in order. The economies of scale and resulting profits of centralized economic arrangements don’t happen by themselves. They depend, among other things, on transportation infrastructure. This doesn’t happen by itself, either; it happens because governments pay for it, for purposes of their own. The Roman roads that made the tightly integrated Roman economy possible, for example, and the interstate highway system that does the same thing for America, were not produced by entrepreneurs; they were created by central governments for military purposes. (The legislation that launched the interstate system in the US, for example, was pushed by the Department of Defense, which wrestled with transportation bottlenecks all through the Second World War.)

Government programs of this kind subsidize economic centralization. The same thing is true of other requirements for centralization – for example, the maintenance of public order, so that shipments of consumer goods can get from one side of the country to the other without being looted. Governments don’t establish police forces and defend their borders for the purpose of allowing businesses to ship goods safely over long distances, but businesses profit mightily from these indirect subsidies nonetheless.

When civilizations come unglued, in turn, all these indirect subsidies for economic centralization go away. Roads are no longer maintained, harbors silt up, bandits infest the countryside, migrant nations invade and carve out chunks of territory for their own, and so on. Centralization stops being profitable, because the indirect subsidies that make it profitable aren’t there any more.

Ugo Bardi has written a very readable summary of how this process unfolded in one of the best documented cases, the fall of the Roman Empire. The end of Rome was a process of radical relocalization, and the result was the Middle Ages. The Roman Empire handled defense by putting huge linear fortifications along its frontiers; the Middle Ages replaced this with fortifications around every city and baronial hall. The Roman Empire was a political unity where decisions affecting every person within its borders were made by bureaucrats in Rome. Medieval Europe was the antithesis of this, a patchwork of independent feudal kingdoms the size of a Roman province, which were internally divided into self-governing fiefs, those into still smaller fiefs, and so on, to the point that a single village with a fortified manor house could be an autonomous political unit with its own laws and the recognized right to wage war on its neighbors.

The same process of radical decentralization affected the economy as well. The Roman economy was just as centralized as the Roman polity; in major industries such as pottery, mass production at huge regional factories was the order of the day, and the products were shipped out via sea and land for anything up to a thousand miles to the end user. That came to a screeching halt when the roads weren’t repaired any more, the Mediterranean became pirate heaven, and too many of the end users were getting dispossessed, and often dismembered as well, by invading Visigoths. The economic system that evolved to fill the void left by Rome’s implosion was thus every bit as relocalized as a feudal barony, and for exactly the same reasons.

Here’s how it worked. Each city – and “city” in this context means anything down to a town of a few thousand people – was an independent economic center; it might have a few industries of more than local fame, but most of its business consisted of manufacturing and selling things to its own citizens and the surrounding countryside. The manufacturing and selling was managed by guilds, which were cooperatives of master craftsmen. To get into a guild-run profession, you had to serve an apprenticeship, usually seven years, during which you got room and board in exchange for learning the craft and working for your master; you then became a journeyman, and worked for a master for wages, until you could produce your masterpiece – yes, that’s where the word came from – which was an example of craftwork fine enough to convince the other masters to accept you as an equal. Then you became a master, with voting rights in the guild.

The guild had the legal responsibility under feudal municipal laws to establish minimum standards for the quality of goods, to regulate working hours and conditions, and to control prices. The economic theory of the time held that there was a “just price” for any good or service, usually the price that had been customary in the region since time out of mind, and the municipal authorities could be counted on to crack down on attempts to push prices above the just price unless there was some very pressing reason for it. Most forms of competition between masters were off limits; if you made your apprentices and journeymen work evenings and weekends to outproduce your competitors, for example, or sold goods below the just price, you’d get in trouble with the guild, and could be barred from doing business in the town. The only form of competition that was encouraged was to make and sell a superior product.

This was the secret weapon of the guild economy, and it helped drive an age of technical innovation. As Jean Gimpel showed conclusively in The Medieval Machine, the stereotype of the Middle Ages as a period of technological stagnation is completely off the mark. Medieval craftsmen invented the clock, the cannon, and the movable-type printing press, perfected the magnetic compass and the water wheel, and made massive improvements in everything from shipbuilding and steelmaking to architecture and windmills, just for starters. The competition between masters and guilds for market share in a legal setting that made quality and innovation the only fields of combat wasn’t the only force behind these transformations, to be sure – the medieval monastic system, which put a good fraction of intellectuals of both genders in settings where they could use their leisure for just about any purpose that could be chalked up to the greater glory of God, was also a potent factor – but it certainly played a massive role.

The guild system has nonetheless been a whipping boy for mainstream economists for a long time now. The person who started that fashion was none other than Adam Smith, whose The Wealth of Nations castigates the guilds of his time for what we’d now call antitrust violations. From within his own perspective, Smith had a point. The guilds were structured in a way that limited the total number of people who could work in any given business in any given town, and of course the just price principle kept prices from fluctuating along with supply and demand. Thus the prices paid for the goods or services produced by that business were higher, all things considered, than they would have been under the free market regime Smith advocated.

The problem with Smith’s analysis is that there are crucial issues involved that he didn’t address. He lived at a time when transportation was rapidly expanding, public order was more or less guaranteed, and the conditions for economic centralization were coming back into play. Thus the very different realities of limited, localized markets did not enter into his calculations. In the context of localized economics, a laissez-faire free market approach doesn’t produce improved access to better and cheaper goods and services, as Smith argued it should; instead, it makes it impossible to produce many kinds of goods and services at all.

Let’s take a specific example for the sake of clarity. A master blacksmith in a medieval town of 5000 people, say, was in no position to specialize in only one kind of ironwork. He might be better at fancy ironmongery than anyone else in town, for example, but most of the business that kept his shop open, his apprentices fed and clothed, and his journeymen paid was humbler stuff: nails, hinges, buckles, and the like. Most of this could be done by people with much less skill than our blacksmith; that’s why he had his apprentices make nails while he sat upstairs at the table with the local abbot and discussed the ironwork for a dizzyingly complex new cutting-edge technology, just introduced from overseas, called a clock.

The fact that most of his business could be done by relatively unskilled labor, though, left our blacksmith vulnerable to competition. His shop, with its specialized tools and its staff of apprentices and journeymen, was expensive to maintain. If somebody else who could only make nails, hinges, and buckles could open a smithy next door, and offer goods at a lower price, our blacksmith could be driven out of business, since the specialized work that only he could do wouldn’t be enough to pay his bills. The cut-rate blacksmith then becomes the only game in town – at least, until someone who limits his work to even cheaper products made at even lower cost cuts into his profits. The resulting race to the bottom, in a small enough market, might end with nobody able to make a living as a blacksmith at all.

Thus in a restricted market where specialization is limited, a free market in which prices are set by supply and demand, and there are no barriers to entry, can make it impossible for many useful specialties to be economically viable at all. This is the problem that the guild system evolved to counter. By restricting the number of people who could enter any given trade, the guilds made sure that the income earned by master craftsmen was high enough to allow them to produce specialty products that were not needed in large enough quantities to provide a full time income. Since most of the money earned by a master craftsman was spent in the town and surrounding region – our blacksmith and his family would have needed bread from the baker, groceries from the grocer, meat from the butcher, and so on – the higher prices evened out; since nearly everyone in town was charging guild prices and earning guild incomes, no one was unfairly penalized.
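A toy model may make the arithmetic of the blacksmith's dilemma clearer; the prices, costs, and quantities below are invented purely for illustration and stand in for no historical figures.

# Toy model of the blacksmith example; all figures are invented.
# The specialist shop carries high fixed costs and covers them with a mix of
# commodity work (nails, hinges, buckles) and specialty work (clockwork).

FIXED_COSTS = 100        # per month: specialized tools, apprentices, journeymen
SPECIALTY_INCOME = 40    # per month: work only the master can do

def specialist_profit(commodity_sales, commodity_price):
    """Monthly profit of the specialist shop under a given commodity trade."""
    return commodity_sales * commodity_price + SPECIALTY_INCOME - FIXED_COSTS

# With the whole town's commodity trade at the customary price, the shop is viable.
print("customary price:", specialist_profit(commodity_sales=80, commodity_price=1.0))

# A cut-rate competitor takes the commodity trade by underselling; the specialty
# work that only the master can do no longer covers his fixed costs.
print("after undercutting:", specialist_profit(commodity_sales=0, commodity_price=0.5))

In the toy numbers, the shop clears a modest surplus at the customary price and runs a steep loss once the commodity trade is gone, which is the whole logic of restricting entry and holding the just price.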

Now of course the guild system did finally break down; by Adam Smith’s time, the economic conditions that made it the best option were a matter of distant memory, and other arrangements were arguably better suited to the new reality of easy transport and renewed economies of scale. Still, it’s interesting that in recent years, the same race to the bottom in which quality goods become unavailable and local communities suffer has taken place in nearly the same way in most of small-town America.

A torrent of cheap shoddy goods funneled through Wal-Mart and its ilk, in a close parallel to the cheap blacksmiths of the example, has driven local businesses out of existence and made the superior products and services once provided by those businesses effectively unavailable to a great many Americans. In theory, this produces a business environment that is more efficient and innovative; in practice, the efficiencies are by no means clear and the innovation seems mostly to involve the creation of ever more exotic and unstable financial instruments: not necessarily the sort of thing that our society is better off encouraging.

Advocates of relocalization in the age of peak oil may thus find it useful to keep the medieval example and its modern equivalent in mind while planning for the economics of the future. Relocalized communities must be economically viable or they will soon cease to exist, and while viable local communities will be possible in the future – just as they were in the Middle Ages – the steps that will be necessary to make them viable may require some serious rethinking of the habits that now shape our economic lives.

Wednesday, November 11, 2009

A Gesture from the Invisible Hand

It’s been a long road, but we’ve finally reached the point in these essays at which it’s possible to start talking about some of the consequences of the primary economic fact of our time, the arrival of geological limits to increasing fossil fuel production. That’s as challenging a topic to discuss as it will be to live through, because it cannot be understood effectively from within the presuppositions that structure most of today’s economic thinking.

It’s common, for example, to hear well-intentioned people insist that the market, as a matter of course, will respond to restricted fossil fuel production by channeling investment funds either in more effective means of producing fossil fuels, on the one hand, or new energy sources on the other. The logic seems impeccable at first glance: as the price of oil, for example, goes up, the profit to be made by bringing more oil or oil substitutes onto the market goes up as well; investors eager to maximize their profits will therefore pour money into ventures producing oil and oil substitutes, and production will rise accordingly until the price comes back down.

That’s the logic of the invisible hand, first made famous by Adam Smith in The Wealth of Nations more than two centuries ago, and still central to most mainstream ideas of market economics. That logic owes much of its influence to the fact that in many cases, markets do in fact behave this way. Like any rule governing complex systems, though, it is far from foolproof, and it needs to be balanced by an awareness of the places where it fails to work.

Energy is one of those places: in some ways, the most important of all. Energy is not simply one commodity among others; it is the ur-commodity, the foundation for all economic activity. It follows laws of its own – the laws of thermodynamics, notably – which are not the same as the laws of economics, and when the two sets of laws come into conflict, the laws of thermodynamics win every time.

Consider an agrarian civilization that runs on sunlight, as every human society did until the rise of industrialism some three centuries ago. In energetic terms, part of the annual influx of solar energy is collected via agriculture, stored in the form of grain, and transformed into mechanical energy by feeding the grain to human laborers and draft animals. It's an efficient and resilient system, and under suitable conditions it can deploy astonishing amounts of energy; the Great Pyramid is one of the more obvious pieces of evidence for this fact.

Such civilizations normally develop thriving market economies in which a wide range of goods and services are exchanged. They also normally develop intricate social abstractions that manage the distribution of these goods and services, as well as the primary wealth that comes through agriculture from the sun, among their citizens. Both these, however, depend on the continued energy flow from sun to fields to granaries to human and animal labor forces. If something interrupts this flow -- say, a failure of the harvest -- the only option that allows for collective survival is to have enough solar energy stored in the granaries to take up the slack.

This is necessary because energy doesn't follow the ordinary rules of economic exchange. Most other commodities still exist after they've been exchanged for something else, and this makes exchanges reversible; for example, if you sell gold to buy marble, you can normally turn around and sell marble to buy gold. The invisible hand works here; if marble is in short supply, those who have gold and want marble may have to offer more gold for their choice of building materials, but the marble quarries will be working overtime to balance things out.

Energy is different. Once you turn the energy content of a few million bushels of grain into a pyramid, say, by using the grain to feed workers who cut and haul the stones, that energy is gone, and you cannot turn the pyramid back into grain; all you can do is wait until the next harvest. If that harvest fails, and the stored energy in the granaries has already been turned into pyramids, neither the market economy of goods and services nor the abstract system of distributing goods and services can make up for it. Nor, of course, can you send an extra ten thousand workers into the fields if you don't have the grain to keep them alive.

The peoples of agrarian civilizations generally understood this. It's part of the tragedy of the modern world that most people nowadays do not, even though our situation is not all that different from theirs. We're just as dependent on energy inputs from nature, though ours include vast quantities of prehistoric sunlight, in the form of fossil fuels, as well as current solar energy in various forms; we've built atop that foundation our own kind of markets to exchange goods and services; and our abstract system for managing the distribution of goods and services -- money -- is as heavily wrapped in mythology as anything in the archaic civilizations of the past.

The particular form taken by money in the modern world has certain effects, however, not found in ancient systems. In the old agrarian civilizations, wealth consisted primarily of farmland and its products. The amount of farmland in a kingdom might increase slightly through warfare or investment in canal systems, though it might equally decrease if a war went badly or canals got wrecked by sandstorms; everybody hoped when the seed grain went into the fields that the result would be a bumper crop, but no one imagined that the grain stockpiled in the granaries would somehow multiply itself over time. Nowadays, by contrast, it's assumed as a matter of course that money ought automatically to produce more money.

That habit of thought has its roots in the three centuries of explosive economic growth that followed the birth of the industrial age. In an expanding economy, the amount of money in circulation needs to expand fast enough to roughly match the expansion in the range of goods and services for sale; when this fails to occur, the shortfall drives up interest rates (the cost of using money) and can cause economic contractions. This was a serious and recurring problem in the late 19th century, and led the reformers of the Progressive era to reshape industrial economies in ways that permitted the money supply to expand over time to match the expectation of growth. Once again, the invisible hand was at work, with some help from legislators: a demand for more money eventually gave rise to a system that produced more money.

It's been pointed out by a number of commentators in the peak oil blogosphere that the most popular method for expanding the money supply -- the transformation of borrowing at interest from an occasional bad habit of the imprudent to the foundation of modern economic life -- has outlived its usefulness once an expanding economy driven by increasing fossil fuel production gives way to a contracting economy limited by decreasing fossil fuel production. This is quite true in an abstract sense, but there's a trap in the way of putting that sensible realization into practice.

The arrival of geological limits to increasing fossil fuel production places a burden on the economy, because the cost in energy, labor, and materials (rather than money) to extract fossil fuels does not depend on market forces. On average, it goes up over time, as easily accessible reserves are depleted and have to be replaced by those more difficult and costly to extract. Improved efficiencies and new technologies can counter that to a limited extent, but both these face the familiar problem of diminishing returns as the laws of thermodynamics, and other physical laws, come into play.

As a society nears the geological limits to production, in other words, a steadily growing fraction of its total supply of energy, resources, and labor has to be devoted to the task of bringing in the energy that keeps the entire economy moving. This percentage may be small at first, but it's effectively a tax in kind on every productive economic activity, and as it grows it makes productive economic activity less profitable. The process by which money produces more money consumes next to no energy, by contrast, and so financial investments don't lose ground due to rising energy costs.

This makes financial investments, on average, relatively more profitable than investing in the kinds of economic activity that use energy to produce nonfinancial goods and services. The higher the burden imposed by energy costs, the more sweeping the disparity becomes; the result, of course, is that individuals trying to maximize their own economic gains move their money out of investments in the productive economy of goods and services, and into the paper economy of finance.
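A minimal sketch of that disparity, with every number invented for illustration: as the share of gross revenue eaten by energy costs rises, the net return on productive investment falls below the yield on paper, and money flows accordingly.

# Hypothetical illustration: as rising energy costs claim a larger share of
# gross revenue, the net return on productive investment falls, while a
# financial investment that consumes almost no energy keeps its nominal yield.
# All figures are invented for the example.

GROSS_RETURN_PRODUCTIVE = 0.12   # gross yield of a factory, farm, or the like
OTHER_COSTS = 0.04               # non-energy costs, as a fraction of capital
FINANCIAL_YIELD = 0.06           # paper yield of a financial instrument

def net_productive_return(energy_cost_share):
    """Net yield once energy costs absorb the given share of gross revenue."""
    return GROSS_RETURN_PRODUCTIVE * (1 - energy_cost_share) - OTHER_COSTS

for share in (0.05, 0.15, 0.30, 0.50):
    productive = net_productive_return(share)
    better = "productive" if productive > FINANCIAL_YIELD else "financial"
    print(f"energy takes {share:.0%} of gross revenue: "
          f"productive nets {productive:.1%}, paper nets {FINANCIAL_YIELD:.1%} "
          f"-> money flows to the {better} economy")

In the invented figures, the crossover comes somewhere between fifteen and thirty percent; wherever the real threshold lies, the direction of the shift is the point at issue.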

Ironically, this happens just as a perpetually expanding money supply driven by mass borrowing at interest has become an anachronism unsuited to the new economic reality of energy contraction. It also guarantees that any attempt to limit the financial sphere of the economy will face mass opposition, not only from financiers, but from millions of ordinary citizens whose dream of a comfortable retirement depends on the hope that financial investments will outperform the faltering economy of goods and services. Meanwhile, just as the economy most needs massive reinvestment in productive capacity to retool itself for the very different world defined by contracting energy supplies, investment money seeking higher returns flees the productive economy for the realm of abstract paper wealth.

Nor will this effect be countered, as suggested by the well-intentioned people mentioned toward the beginning of this essay, by a flood of investment money going into energy production and bringing the cost of energy back down. Producing energy takes energy, and thus is just as subject to rising energy costs as any other productive activity; even as the price of oil goes up, the costs of extracting it or making some substitute for it rise in tandem and make investments in oil production or replacement no more lucrative than any other part of the productive economy. Oil that has already been extracted from the ground may be a good investment, and financial paper speculating on the future price of oil will likely be an excellent one, but neither of these helps increase the supply of oil, or any oil substitute, flowing into the economy.

One intriguing detail of this scenario is that it has already affected the first major oil producer to reach peak oil -- yes, that would be the United States. It's unlikely to be accidental that in the wake of its own 1970 production peak, the American economy has followed exactly this trajectory of massive disinvestment in the productive economy and massive expansion of the paper economy of finance. Plenty of other factors played a role in that process, no doubt, but I suspect that the unsteady but inexorable rise in energy costs over the last forty years or so may have had much more to do with the gutting of the American economy than most people suspect.

If this is correct, now that petroleum production has encountered the same limits globally that put it into a decline here in the United States, the same pattern of disinvestment in the production of goods and services coupled with metastatic expansion of the financial sector may show up on a much broader scale. There are limits to how far it can go, of course, not least because financiers and retirees alike are fond of consumer goods now and then, but those limits have not been reached yet, not by a long shot. It's all too easy to foresee a future in which industry, agriculture, and every other sector of the economy that produces goods and services suffer from chronic underinvestment, energy costs continue rising, and collapsing infrastructure becomes a dominant factor in daily life, while the Wall Street Journal (printed in Shanghai by then) announces the emergence of the first half dozen quadrillionaires in the derivatives-of-derivatives-of-derivatives market.

Perhaps the most important limit in the way of such a rush toward economic absurdity is the simple fact that not every economy uses the individual decisions of investors pursuing private gain to allocate investment capital. It may not be accidental that quite a few of the world's most successful economies just now, with China well in the lead, make their investment decisions based at least in part on political, military, and strategic grounds, while the nation that preens itself most proudly on its market economy -- yes, that would be the United States again -- is lurching from one economic debacle to another.

It is unfortunately also the case that many of the nations that have extracted their investment decisions from the hands of a self-terminating market system are not exactly noted for their delicate care for human rights. If that proves to be the wave of the future -- and it may be worth noting that Oswald Spengler, among others, predicted that outcome -- then the invisible hand may end up giving us all the finger.