Wednesday, December 31, 2014

The Cold Wet Mackerel of Reality

To misuse a bit of prose from Charles Dickens, it was neither the best of times nor the worst of times, but I know very few people who will object when, a few hours from now, 2014 gets dragged off to the glue factory.  This has not been a good year for most people in the United States. By all accounts, this year’s holiday season was an economic flop of considerable scale—the US media, which normally treats cheery corporate press releases with the same enforced credulity that Pravda used to give to pronouncements from the Politburo in the Soviet era, has had to admit that Black Friday sales were down hard this year, even counting the internet—and plenty of Americans outside the echo chambers of the media have very good reasons to worry about what 2015 will bring.
 
Mind you, cheerleading of a distinctly Pravda-esque variety can still be heard from those pundits and politicians who are still convinced that people can be talked into ignoring their own experience if they can only be force-fed enough spin-doctored malarkey. That sort of enthusiasm for the glorious capitalist banker’s paradise has plenty of company just now; I’m thinking in particular of the steady drumbeat of articles and essays in the US mass media wondering aloud why so many Americans haven’t noticed the economic recovery of the last four years, and are still behaving as though there’s a recession on.

Of course there’s an explanation for that awkward fact, which is that the recovery in question never happened—outside, that is, of the abstract otherworld of numerical jugglery and spin doctoring that passes for official economic statistics these days. For most Americans, the last four years have been a bleak era of soaring expenses, shrinking incomes and benefits, rising economic insecurity, and increasingly frequent and bitter struggles with dysfunctional institutions that no longer bother even to pretend to serve the public good. That’s the reality people in the United States face when they get out of bed each morning, but it’s not a reality that’s welcome in the American mass media, so endless ingenuity has been expended in explaining why so many people in the US these days haven’t noticed the alleged economic recovery that’s allegedly burgeoning all around them.

I expect to see a good deal more of this sort of twaddle in the weeks immediately ahead, as those mass media pundits who haven’t yet trotted out their predictions for the new year get around to that annual task. For that matter, it’s doubtless safe to assume that out here on the fringes where archdruids lurk, there will be plenty of predictions of a different kind or, rather, several different kinds. There will be another round of claims that this is the year when the global economy will seize up suddenly and leave us all to starve in the dark; there will be another round of claims that this is the year when this or that or the other hot new technology will come swooping in to save the day and let the middle classes maintain their privileged lifestyles; there will be—well, those of my readers who have been following the blogosphere for any length of time can fill in the blanks themselves.

I’ve noted in previous years just how many of these latter predictions get rehashed every single January in the serene conviction that nobody will notice how often they’ve flopped before. Popular though that habit may be, it seems counterproductive to me, since—at least in theory—predictions of the sort we’re discussing are intended to be something more than light entertainment. With this in mind, I’d like to engage in the annual ritual of glancing back over the predictions I posted here at the beginning of the year now ending, and see how well I did. Here’s what I said:

“My prediction for 2014, in turn, is that we’ll see more of the same:  another year, that is, of uneven but continued downward movement along the same arc of decline and fall, while official statistics here in the United States will be doctored even more extravagantly than before to manufacture a paper image of prosperity. The number of Americans trying to survive without a job will continue to increase, the effective standard of living for most of the population will continue to decline, and what used to count as the framework of ordinary life in this country will go on unraveling a thread at a time. Even so, the dollar, the Euro, the stock market, and the Super Bowl will still be functioning as 2015 begins; there will still be gas in the gas pumps and food on grocery store shelves, though fewer people will be able to afford to buy either one.

“The fracking bubble has more than lived up to last year’s expectations, filling the mass media with vast amounts of meretricious handwaving about the coming era of abundance:  the same talk, for all practical purposes, that surrounded the equally delusional claims made for the housing bubble, the tech bubble, and so on all the way back to the Dutch tulip bubble of 1637. That rhetoric will prove just as dishonest as its predecessors, and the supposed new era of prosperity will come tumbling back down to earth once the bubble pops, taking a good chunk of the American economy with it. Will that happen in 2014? That’s almost impossible to know in advance. Timing the collapse of a bubble is one of the trickiest jobs in economic life; no less a mind than Isaac Newton’s was caught flatfooted by the collapse of the South Sea Bubble in 1720, and the current bubble is far more opaque. My guess is that the collapse will come toward the end of 2014, but it could have another year or so to run first.

“It’s probably a safe bet that weather-related disasters will continue to increase in number and severity. If we get a whopper on the scale of Katrina or Sandy, watch the Federal response; it’s certain to fall short of meeting the needs of the survivors and their communities, but the degree to which it falls short will be a useful measure of just how brittle and weak the national government has become. One of these years—just possibly this year, far more likely later on—that weakness is going to become one of the crucial political facts of our time, and responses to major domestic disasters are among the few good measures we’ll have of how close we are to the inevitable crisis.

“Meanwhile, what won’t happen is at least as important as what will. Despite plenty of enthusiastic pronouncements and no shortage of infomercials disguised as meaningful journalism, there will be no grand breakthroughs on the energy front. Liquid fuels—that is to say, petroleum plus anything else that can be thrown into a gas tank—will keep on being produced at something close to 2013’s rates, though the fraction of the total supply that comes from expensive alternative fuels with lower net energy and higher production costs will continue to rise, tightening a noose around the neck of every other kind of economic activity. Renewables will remain as dependent on government subsidies as they’ve been all along, nuclear power will remain dead in the water, fusion will remain a pipe dream, and more exotic items such as algal biodiesel will continue to soak up their quotas of investment dollars before going belly up in the usual way. Once the fracking bubble starts losing air, expect something else to be scooped up hurriedly by the media and waved around to buttress the claim that peak oil won’t happen, doesn’t matter, and so on; any of my readers who happen to guess correctly what that will be, and manage their investments accordingly, may just make a great deal of money.

“Sudden world-ending catastrophes will also be in short supply in 2014, though talk about them will be anything but...Both the grandiose breakthroughs that never happen and the equally gaudy catastrophes that never happen will thus continue to fill their current role as excuses not to think about, much less do anything about, what’s actually happening around us right now—the long ragged decline and fall of industrial civilization that I’ve called the Long Descent. Given the popularity of both these evasive moves, we can safely assume that one more thing won’t happen in 2014:  any meaningful collective response to the rising spiral of crises that’s shredding our societies and our future. As before, anything useful that’s going to happen will be the work of individuals, families, and community groups, using the resources on hand to cope with local conditions.”

As I write these words, the US media is still parroting the fantasy of a fracking-driven “Saudi America” with a mindless repetitiveness that puts broken records to shame, and so the next shiny distraction disguised as a marvelous new energy breakthrough hasn’t yet been trotted out for the usual round of carefully choreographed oohs and aahs. Other than that, once again, I think it’s fair to say I called it. Continuing economic decline, check; a fracking bubble heading toward a world-class bust, check; climate-related disasters on the rise, with government interventions doing less and less to help those affected, check; and a continuing shortage of game-changing breakthroughs, world-ending catastrophes, and meaningful collective responses to the crisis of our age, check-check-check. If this were a bingo game, I’d be walking up to the front of the room with a big smile on my face.

Now of course a case could be made that I’m cheating. After all, it doesn’t take any particular insight to point out that continuing trends tend to continue, or to choose trends that are pretty clearly ongoing and predict that they’ll keep on going for another year. While this is true, it’s also part of the point I’ve been trying to make here for getting on for nine years now:  in the real world, by and large, history is what happened when you weren’t looking. Under some circumstances, sudden jarring discontinuities can hit societies like a cold wet mackerel across the face, but close attention to the decade or so before things changed routinely shows that the discontinuity itself was the product of long-established trends, and could have been anticipated if anyone was willing to do so.

That’s a particularly relevant issue just now, because the sort of long-established trends that can lead to sudden jarring discontinuities have been more and more evident in the United States in recent years, and one of the things that made 2014 so wretched for everyone outside the narrowing circle of the privileged well-to-do is precisely that several of those trends seem to be moving toward a flashpoint. I’d like to sketch out a couple of examples, because my predictions for 2015 will center on them.

The first and most obvious is the headlong collapse of the fracking bubble, which I discussed at some length in a post earlier this month. For most of the last decade, Wall Street has been using the fracking industry in all the same ways it used the real estate industry in the runup to the 2008 crash, churning out what we still laughably call “securities” on the back of a rapidly inflating speculative bubble. As the slumping price of oil kicks the props out from under the fracking boom, the vast majority of that paper—the junk bonds issued by fracking-industry firms, the securitized loans those same firms used to make up for the fact that they lost money every single quarter, the chopped and packaged shale leases, the volumetric production agreements, and all the rest of it—will revert to its actual value, which in most cases approximates pretty closely to zero.

It’s important in this context to remember that those highly insecure securities haven’t been cooped up in the financial equivalent of the dog pound where they belong; quite the contrary, they’ve gone roaming all over the neighborhood, leaving an assortment of messes behind. Banks, investment firms, pension funds, university endowments, and many other institutions in the US and abroad snapped this stuff up in gargantuan amounts, because it offered something like what used to count as a normal rate of return on investment. As a result, as the fracking boom goes belly up, it’s not just firms in the fracking industry that will be joining it in that undignified position. In the real estate bust, a great many businesses and institutions that seemingly had nothing to do with real estate found themselves in deep financial trouble; in the fracking bust, we can count on the same thing happening—and a good share of the resulting bankruptcies, defaults, and assorted financial chaos will likely hit in 2015.

Thus one of the entertainments 2015 has in store for us is a thumping economic crisis here in the US, and in every other country that depends on our economy for its bread and butter. The scale of the crash depends on how many people bet how much of their financial future on the fantasy of an endless frack-propelled boom, but my guess is it’ll be somewhere around the scale of the 2008 real estate bust.

It probably has to be said that this doesn’t work out to the kind of fast-crash fantasy that sees the global economy grind to a sudden stop in a matter of weeks, leaving supermarket shelves bare and so on. The events of the 2008 crash proved, if there was ever any doubt on that score, that the governments of the world are willing to do whatever it takes to keep economic activity going, and if bailing out their pals in the big banks is what’s needed, hey, that’s all in a day’s work. Now of course bailing out the big banks won’t stop the bankruptcies, the layoffs, the steep cuts to pensions, the slashing of local and state government services, and the rest of it, any more than the same thing did in the wake of the 2008 crisis, but it does guarantee that the perfect storms and worst case scenarios beloved of a certain category of collapsitarian thinkers will remain imaginative fictions.

Something else that’s baked into the baby new year’s birthday cake at this point is a rising spiral of political unrest here in the United States. The mass protests over the extrajudicial executions of nonwhite Americans by police were pretty much inevitable, as pressures on the American underclass have been building toward an explosion for decades now.  There’s a certain bleak amusement to be had from watching financially comfortable white Americans come up with reasons to insist that this can’t possibly be the case, or for that matter, from hearing them contrive ways to evade the awkward fact that American police seem to have much less difficulty subduing belligerent suspects in nonlethal ways when the skins of the suspects in question are white.

Behind the killings and the protests, though, lies an explosive tangle that nobody on either side of the picket lines seems willing to address. Morale in many police departments across the United States resembles nothing so much as morale among American enlisted men in Vietnam in the last years of US involvement; after decades of budget cuts, grandstanding politicians, bungled reforms, an imploding criminal justice system, and ongoing blowback from misguided economic and social policies, a great many police officers feel that they’re caught between an enemy they can’t defeat and a political leadership that’s more than willing to throw them to the wolves for personal advantage. That the “enemy” they think they’re fighting is indistinguishable from the people they’re supposed to be protecting just adds to the list of troubling parallels.

In Vietnam, collapsing morale led to war crimes, “fragging” of officers by their own men, and worried reports to the Pentagon warning of the possibility of armed mutinies among US troops.  We haven’t yet gotten to the fragging stage this time, though the response of New York police to Mayor de Blasio suggests that we’re closer to that than most people think.  The routine extrajudicial execution of nonwhite suspects—there are scores if not hundreds of such executions a year—is the My Lai of our era, one of the few warnings that gets through the Five O’Clock Follies of the media to let the rest of us know that the guys on the front lines are cracking under the strain.

The final bitter irony here is that the federal government has been busily worsening the situation by encouraging the militarization of police departments across the United States, to the extent of equipping them with armored personnel carriers and other pieces of hardware that don’t have any possible use in ordinary policing. This is one of a good many data points that has me convinced that the US government is frantically gearing up to fight a major domestic insurgency. What’s more, they’re almost certainly going to get one. For decades now, since the post-Soviet “color revolutions,” the US has been funding and directing mass movements and rebellions against governments we don’t like, with Syria and Ukraine the two most recent beneficiaries of that strategy. We’ve made a lot of enemies in the process; it’s a safe bet that some of those enemies are eager to give us a heaping dose of our own medicine, and there are certainly nations with the means, motive, and opportunity to do just that.

Will an American insurgency funded by one or more hostile foreign powers get under way in 2015? I don’t think so, though I’m prepared to be wrong. More likely, I think, is another year of rising tensions, political gridlock, scattered gunfire, and rhetoric heated to the point of incandescence, while the various players in the game get into position for actual conflict:  the sort of thing the United States last saw in the second half of the 1850s, as sectional tensions built toward the bloody opening rounds of the Civil War.  One sign to watch for is the first outbreaks of organized violence—not just the shooting of one individual by another, but paramilitary assaults by armed groups—equivalent, more or less, to the fighting in “bleeding Kansas” that did so much to help make the Civil War inevitable.

Another thing to watch for, along the same lines, is the glorification of revolutionary violence on the part of people who haven’t previously taken that plunge. To some extent, that’s already happening. I’m thinking here especially of a recent essay by Rebecca Solnit, which starts off with lavish praise of the French Revolution: “It’s popular to say that the experiment failed,” she says, “but that’s too narrow an interpretation. France never again regressed to an absolutist monarchy”—a statement that will surprise anyone who’s heard of Napoleon, Louis XVIII, or Napoleon III. In holding up the French Revolution as a model for today’s radicals, Ms. Solnit also doesn’t happen to mention the Terror, the tyranny of the Directory, the Napoleonic wars, or any of the other problematic outcomes of the uprising in question. That sort of selective historical memory is common in times like these, and has played a very large role in setting the stage for some of history’s most brutal tragedies.

Meanwhile, back behind these foreground events, the broader trends this blog has been tracking since its outset are moving relentlessly on their own trajectories. The world’s finite supplies of petroleum, along with most other resources on which industrial civilization depends for survival, are depleting further with each day that passes; the ecological consequences of treating the atmosphere as an aerial sewer for the output of our tailpipes and smokestacks, along with all the other frankly brainless ways our civilization maltreats the biosphere that sustains us all, build apace; caught between these two jaws of a tightening vise, industrial civilization has entered the rising spiral of crisis about which so many environmental scientists tried to warn the world back in the 1970s, and only a very small minority of people out on the fringes of our collective discourse has shown the least willingness to recognize the mess we’re in and start changing their own lives in response: the foundation, it bears repeating, of any constructive response to the crisis of our era.

I’ve heard quite a few people insist hopefully that since 2014 was so bad, 2015 has to be better. I’m sorry to say, though, that I don’t see much likelihood of that, at least here in the US. Quite the contrary, I think that when people recall 2015, they may just think of it as the year in which America got slapped across its collective face with the cold wet mackerel of reality. Come New Year’s Day of 2016, I expect to find the dollar, the Euro, the stock market, and the Super Bowl still functioning, gas in the pumps and products for sale on the grocery store shelves—but the nation in which these things exist will have passed through a traumatic and crisis-ridden year, and the chances of avoiding an even more difficult 2016 don’t seem good just now. Still, we’ll see.

Wednesday, December 24, 2014

Waiting for the Sunrise

By the time many of my readers get to this week’s essay here on The Archdruid Report, it will be Christmas Day. Here in America, that means that we’re finally most of the way through one of the year’s gaudiest orgies of pure vulgar greed, the holiday shopping season, which strikes me as rather an odd way to celebrate the birth of someone whose teachings so resolutely critiqued the mindless pursuit of material goodies. If, as that same person pointed out, it’s impossible to serve both God and Mammon, the demon of wealth, it’s pretty clear which of those two personages most Americans—including no small number who claim to be Christians—really consider the reason for the season.
 
A long time before that stable in Bethlehem received its most famous tenants, though, the same day was being celebrated across much of the northern temperate zone. The reason has to do with one of those details everyone knew before the invention of electric lighting and few people remember now, the movement of the apparent point of sunrise along the eastern horizon during the year. Before the printing press made calendars ubiquitous, that was a standard way of gauging the changing seasons: the point of sunrise swings from southeast to northeast as winter in the northern hemisphere gives way to summer and from northeast back to southeast as summer gives way again to winter, and if you have a way to track the apparent motion, you can follow the yearly cycle with a fair degree of precision.

This movement is like the swing of a pendulum: it’s very fast in the middle of the arc, and slows to a dead stop on either end. That makes the spring and fall equinoxes easy to identify—if you have a couple of megaliths lined up just right, for example, the shadow of one will fall right on the foot of the other on the days of the equinoxes, and a little to one side or the other one day before or after—but the summer and winter solstices are a different matter. At those times of year, the sun seems to grind to a halt around the 17th of June or December, you wait for about a week, and then finally the sun comes up a little further south on June 25th or a little further north on December 25th, and you know for a fact that the wheel of the seasons is still turning.
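
For readers who like to see the geometry in numbers, the short sketch below (in Python, my own choice of tool, and obviously nothing the old loremasters had to hand) makes the pendulum image concrete. It relies on a handful of simplifying assumptions: a plain cosine approximation for the sun’s declination, a spherical Earth with no atmospheric refraction, and an arbitrary latitude of 45 degrees north, so treat it as a rough illustration rather than an ephemeris.

```python
# A minimal sketch under simplifying assumptions (cosine declination formula,
# spherical Earth, no refraction, an arbitrary latitude): it shows the
# "pendulum" swing of the sunrise point, fast near the equinoxes and nearly
# motionless near the solstices.
import math

LATITUDE = 45.0  # degrees north; an assumed, purely illustrative value

def solar_declination(day_of_year):
    """Approximate solar declination in degrees for a given day of the year."""
    return -23.44 * math.cos(2 * math.pi * (day_of_year + 10) / 365.0)

def sunrise_azimuth(day_of_year, lat=LATITUDE):
    """Sunrise azimuth in degrees east of due north, ignoring refraction."""
    dec = math.radians(solar_declination(day_of_year))
    phi = math.radians(lat)
    return math.degrees(math.acos(math.sin(dec) / math.cos(phi)))

# How far does the point of sunrise move in a single day?
for label, day in [("near the spring equinox", 80), ("near the summer solstice", 170)]:
    shift = abs(sunrise_azimuth(day + 1) - sunrise_azimuth(day))
    print(f"{label}: azimuth {sunrise_azimuth(day):.1f} deg, daily shift {shift:.3f} deg")
```

At that latitude the daily shift works out to a bit more than half a degree around the equinoxes and only a couple of hundredths of a degree around the solstices, which is why you have to wait the better part of a week before the change becomes visible to the naked eye again.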

That’s why Christmas is when it is. I’ve read, though I don’t have the reference handy, that shepherds in the Levant back in the day kept watch over their flocks in the fields in late summer, not in December, and so—if the New Testament narrative is to be believed—Jesus was something like four months old when the first Christmas rolled around. As far as I know, nobody knows exactly how the present date got put in place, but I suspect the old solar symbolism had a lot to do with it; in those days, the Christian church was less prone to the rigid literalism that’s become common in recent centuries, and also quite aware of seasonal and astronomical cycles—consider the complicated rules for setting the date of Easter, in which movements of the sun and moon both play a part.

I’ve been thinking quite a bit about such things as the holiday shopping season stumbles toward its end and a troubled, weary, and distracted nation prepares to bid a hearty good riddance to 2014. Of course Druids generally think about such things; the seasonal cycle has had an important role in our traditions since those were revived in the eighteenth century. Even so, it’s been more on my mind than usual.  In particular, as I think about the shape of things in the world right now, what keeps coming to mind is the image of the old loremasters, waiting in the darkness at the end of a cold winter’s night to see the sunrise begin swinging back in the direction of spring.

Those of my readers who see such an image as hopelessly out of place just now have, I grant, quite a bit of evidence on their side. Most notably, the coming of 2015 marks a full decade since production of conventional petroleum worldwide hit its all-time peak and began to decline. Those who were around in the early days of the peak oil scene, as I was, will doubtless recall how often and eagerly the more optimistic members of that scene insisted that once the peak arrived, political and business interests everywhere would be forced to come to terms with the end of the age of cheap abundant energy. Once that harsh but necessary awakening took place, they argued, the transition to sustainable societies capable of living within the Earth’s annual budget of sunlight would finally get under way.

Of course that’s not what happened.  Instead, political and business interests responded to the peak by redefining what counts as crude oil, pouring just about any flammable liquid they could find into the world’s fuel tank—ethanol, vegetable oil, natural gas liquids, “dilbit” (diluted bitumen) from tar sands, you name it—while scraping the bottom of the barrel for petroleum via hydrofracturing, ultradeep offshore wells, and other extreme extraction methods. All of those require much higher inputs of fossil fuel energy per barrel produced than conventional crude does, so that a growing fraction of the world’s fossil fuel supply has had to be burned just to produce more fossil fuel. Did any whisper of this far from minor difficulty find its way into the cheery charts of “all liquids” and the extravagantly rose-colored projections of future production? Did, for example, any of the official agencies tasked with tracking fossil fuel production consider subtracting an estimate for barrels of oil equivalent used in extraction from the production figures, so that we would have at least a rough idea of the world’s net petroleum production?  Surely you jest.
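
Since net production is a matter of simple arithmetic, here’s a minimal sketch of what such a subtraction might look like; the EROEI (energy returned on energy invested) figures in it are made-up round numbers chosen only to illustrate the point, not values drawn from any agency’s data.

```python
# A minimal sketch, not any agency's methodology: it illustrates the "net
# petroleum" arithmetic described above, using made-up round numbers.
# EROEI = energy returned on energy invested.
def net_production(gross_barrels, eroei):
    """Gross output minus the barrels of oil equivalent burned to produce it."""
    energy_invested = gross_barrels / eroei
    return gross_barrels - energy_invested

# Hypothetical comparison: conventional crude versus a low-EROEI unconventional source.
for label, eroei in [("conventional crude, EROEI ~20", 20),
                     ("tight oil, EROEI ~5", 5)]:
    print(f"{label}: 100 gross barrels leaves {net_production(100, eroei):.0f} barrels net")
```

Crude as that calculation is, it’s enough to show why a chart of “all liquids” that ignores the energy cost of extraction overstates what’s actually left over for the rest of the economy.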

The need to redirect an appreciable fraction of the world’s fossil fuel supply into fossil fuel production, in turn, had significant economic costs. Those were shown by the simultaneous presence of prolonged economic dysfunction and sky-high oil prices: a combination, please note, that last appeared during the energy crises of the 1970s, and should have served as a warning sign that something similar was afoot. Instead of paying attention, political and business interests around the world treated the abrupt fraying of the economy as a puzzling anomaly to be drowned in a vat of cheap credit—when, that is, they didn’t treat it as a public relations problem that could be solved by proclaiming a recovery that didn’t happen to exist. Economic imbalances accordingly spun out of control; paper wealth flowed away from those who actually produce goods and services into the hands of those who manipulate fiscal abstractions; the global economy was whipsawed by convulsive fiscal crisis in 2008 and 2009, and shows every sign of plunging into a comparable round of turmoil right now.

I wish I could say that the alternative energy side of the equation had responded to any of this in a way that might point toward a better future, but no such luck. With embarrassingly few exceptions, the things that got funding, or even any significant amount of discussion, were the sorts of overpriced white-elephant systems that only make economic sense in the presence of lavish government subsidies, and are utterly dependent on a technostructure that’s only viable given exactly the sort of cheap abundant fossil fuels that those systems are theoretically going to replace. Grid-tied photovoltaic systems, gargantuan wind turbines, and vast centralized solar-thermal facilities soaked up the attention and the funding, while simple, affordable, thoroughly proven technologies such as solar water heating got another decade of malign neglect. As for using less—the necessary foundation for anything approaching a sustainable future—that remained utterly taboo in polite company.

Back in 2005, a then-famous study done for the Department of Energy by a team headed by Robert Hirsch showed that to get through declining oil supplies without massive crisis, preparations for the descent would have to begin twenty years before the peak arrived. Since the peak of conventional crude oil production had already arrived in 2005, this warning was perhaps a little tardy, but a crash program focusing on conservation and the conversion of energy-intensive infrastructure to less vulnerable technologies might still have done much. Instead, we collectively wasted another decade on daydreams—and all the while, week after dreary week, the mainstream media has kept up a steady drumbeat of articles claiming to prove that this or that or the other thing has disproved peak oil. Given all this, is there any reason to expect anything other than a continuation of the same dysfunctional behavior, with the blind leading the blind until they all tumble together down the long bitter slope ahead?

As it happens, I think there is.

Part of it, oddly enough, is the steady drumbeat of articles just referred to, each claiming to have disproved peak oil once and for all. The last time the subject was shouted down, in the early 1980s, there wasn’t that kind of ongoing barrage; after a few blandly confident denunciations, the subject just got dropped from the media so hard it would have left a dent on a battleship’s armored deck, and was consigned thereafter to a memory hole straight out of George Orwell. Presumably that was the intention this time, too, but something has shifted.  In the early 1980s, when the media started spouting the same sort of cornucopian drivel they’re engaged in this time, the vast majority of the people who claimed to be concerned about energy and the environment trotted along after them with scarcely a dissenting bleat. That hasn’t happened in the present case; if I may indulge in a bit of very edgy irony here, this is one of the very few ways in which it really is different this time.

It’s worth glancing back over how that difference unfolded. To be sure, the brief heyday during which media reports took the end of the age of cheap abundant energy seriously stopped abruptly when puffing up the fracking bubble became the order of the day; the aforementioned drumbeat of alleged disproofs got going; those of us who kept on talking about peak oil started getting pressure from mainstream (that is, corporate-funded) environmentalists to drop the subject, get on board with the climate change bandwagon, and join them in the self-defeating rut that’s kept the environmental movement from accomplishing anything worthwhile for the last thirty years. In response, a certain number of bloggers and speakers who had been involved in peak oil discussions did in fact drop the subject, and those peak oil organizations that had committed themselves to a grant-funded organizational model fell on hard times. A fair number of us stayed the course, though.  Far more significantly, so did a very substantial portion of our audience.

That latter point is the thing that I find most encouraging. Over the last decade, in the teeth of constant propaganda from the mass media and a giddy assortment of other sources, the number of people in the United States and elsewhere who are aware of the ongoing decline of industrial society, who recognize the impossibility of infinite growth on a finite planet, and who are willing to make changes in their own lives in response to these things, somehow failed to dwindle away to near-irrelevance, as it did the last time around. If anything—though I don’t have hard statistics to back this perception, just a scattering of suggestive proxy measurements—that number seems to have increased.

When I speak to audiences about catabolic collapse and the twilight of the industrial age these days, for example, I don’t get anything like as many blank looks or casual dismissals as those concepts routinely fielded even a few years ago. Books on peak oil and related topics, mine among them, remain steady sellers, and stats on this blog have zigzagged unevenly but relentlessly upwards over the years, regularly topping 300,000 page views a month this autumn. Less quantifiable but more telling, at least to me, are the shifts I’ve watched in people I know. Some who used to reject the whole idea of imminent and ongoing decline with scornful laughter have slowly come around to rueful admissions that, well, maybe we really are in trouble; others, starting from the same place, now denounce any such notion with the sort of brittle rage that you normally see in people who are losing the ability to make themselves keep believing the dogma they’ve committed themselves to defending.

Even more telling are the young people I meet who have sized up the future with cold eyes, and walked away from the officially approved options spread before them like so many snares by a society whose easy promises a great many of them no longer believe.  Each year that passes brings me more encounters with people in their late teens and twenties who have recognized that the rules that shaped their parents’ and grandparents’ lives don’t work any more, that most of the jobs they have been promised either don’t exist or won’t exist for much longer, that a college education these days is a one-way ticket to decades of debt peonage, and that most of the other  institutions that claim to be there to help them don’t have their best interests in mind.  They’re picking up crafts and skilled trades, living with their parents or with groups of other young people, and learning to get by on less, because the price of doing otherwise is more than they’re willing to pay.

More broadly, more and more people seem to be turning their backs on the American dream, or more precisely on the bleak waking nightmare into which the American dream has metastasized over the last few decades. A growing number of people have walked away from the job market and found ways to support themselves outside a mainstream economy that’s increasingly stacked against them. Even among those who are still in the belly of the beast, the sort of unthinking trust in business as usual that used to be tolerably common straight through American society is increasingly rare these days. Outside the narrowing circle of those who benefit from the existing order of society, a crisis of legitimacy is in the making, and it’s not simply the current US political system that’s facing the brunt of that crisis—it’s the entire crumbling edifice of American collective life.

That crisis of legitimacy won’t necessarily lead to better things. It could all too easily head in directions no sane person would wish to go. I’ve written here more than once about the possibility that the abject and ongoing failure of constructive leadership in contemporary America could lay the groundwork for the rise of something closely akin to the fascist regimes of Depression-era Europe, as people desperate for an alternative to the Republicratic consensus frozen into place inside the Washington DC beltway turn to a charismatic demagogue who promises to break the gridlock and bring change. Things could also go in even more destructive directions; a nation that ships tens of thousands of its young people in uniform to an assortment of Middle Eastern countries, teaches them all the latest trends in  counterinsurgency warfare, and then dumps them back home in a collapsing economy without the benefits they were promised, has only itself to blame if some of them end up applying their skills in the service of a domestic insurgency against the present US government.

Those possibilities are real, and so are a galaxy of other potential outcomes that are considerably worse than what exists in America today. That said, constructive change is also a possibility. The absurd extravagances that most Americans still think of as an ordinary standard of living were always destined to be a short-term phenomenon, and we’re decades past the point at which a descent from those giddy heights could have been made without massive disruptions; no possible combination of political, social, economic, and environmental transformations at this point can change those unwelcome facts. Even so, there’s much worth doing that can still be done. We can at least stop making things worse than they have to be; we can begin shifting, individually and collectively, to technologies and social forms that will still make sense in a world of tightly constrained energy and resource supplies; we can preserve things of value to the near, middle, and far future that might otherwise be lost; we might, given luck and hard work, be able to revive enough of the moribund traditions of American democracy and voluntary association to provide an alternative down the road to the naked rule of force and fraud.

None of that will be easy, but then all the easy options went whistling down the wind a long time ago. No doubt there will still be voices insisting that Americans can have the lifestyles to which they think they’re entitled if only this, or that, or the other thing were to happen; no doubt the collapse of the fracking bubble will be followed by some equally gaudy and dishonest set of cargo-cult rhetoric meant to convince the rubes that happy days will shortly be here again, just as soon as billions of dollars we don’t happen to have are poured down whatever the next rathole du jour happens to be. If enough of us ignore those claims and do what must be done—and “enough” in this context does not need to equal a majority, or even a large minority, of Americans—there’s still much of value that can be accomplished in the time before us.

To return to the metaphor that opened this post, that first slight shift of sunrise north along the horizon from the solstice point, faint as it is, is a reminder that winter doesn’t last forever, even though the coldest nights and the worst of the winter storms come after that point is past. In the same way, bleak as the immediate prospects may be, there can still be a future worth having on the far side of the crisis of our age, and our actions here and now can further the immense task of bringing such a future into being. In the new year, as I continue the current series of posts on the American future, I plan on talking at quite some length about some of the things that can be done and some of the possibilities that those actions might bring within reach.

And with that, I would like to wish my Christian readers a very merry Christmas, my readers of other faiths a blessed holiday season, and all my readers a happy New Year.

Monday, December 22, 2014

Fleetingly Festive

So, as you may have noticed, we're not really here right now. Or at least, less so than usual.

Jon is capturing light, somewhere in the world - I'm never really sure where he is but I'm pretty close to getting some kind of electronic bell that goes around his neck and periodically texts me the words, "Hello, I am alive and I am at these coordinates, please do not tell the government."

And I am finishing (reading and writing) books, at home in Cape Town, a place you should visit if you ever get the chance. I have family visiting so I have an excuse to do the things that tourists do and even though I've lived here on and off since 1999, I am always amazed at how blisteringly beautiful it is and how lucky I am. I hope you occasionally find yourself surrounded by beautiful things.

If poetry teaches us anything, it's that finding a muse is not nearly as important as losing one. Every year around this time, I lose you all and there's a kind of comfortable silence we all must live in, while we acknowledge the fact that the giant spaceship has once again successfully navigated a course around the ball of fire in the sky that gives us all life. I hope it's been a good journey for you.

We walk the dogs and we are amazed that there are owls in the park and the dogs are amazed that there are squirrels and certain members of the older generation of my family are amazed at Apple TV and aren't quite sure how hard to press the buttons on the remote. I hope you find things to be amazed by.

Someone once told me that it always rains somewhere pretty this time of year. I hope wherever you are at this time of year, the sky does something pretty for you.

My best,

Me

Wednesday, December 17, 2014

Déjà Vu All Over Again

Over the last few weeks, a number of regular readers of The Archdruid Report have asked me what I think about the recent plunge in the price of oil and the apparent end of the fracking bubble. That interest seems to be fairly widespread, and has attracted many of the usual narratives; the  blogosphere is full of claims that the Saudis crashed the price of oil to break the US fracking industry, or that Obama got the Saudis to crash the price of oil to punish the Russians, or what have you.
 
I suspect, for my part, that what’s going on is considerably more important. To start with, oil isn’t the only thing that’s in steep decline. Many other major commodities—coal, iron ore, and copper among them—have registered comparable declines over the course of the last few months. I have no doubt that the Saudi government has its own reasons for keeping their own oil production at full tilt even though the price is crashing, but they don’t control the price of those other commodities, or the pace of commercial shipping—another thing that has dropped steeply in recent months.

What’s going on, rather, is something that a number of us in the peak oil scene have been warning about for a while now. Since most of the world’s economies run on petroleum products, the steep oil prices of the last few years have taken a hefty bite out of all economic activities.  The consequences of that were papered over for a while by frantic central bank activities, but they’ve finally begun to come home to roost in what’s politely called “demand destruction”—in less opaque terms, the process by which those who can no longer afford goods or services stop buying them.

That, in turn, reminded me of the last time prolonged demand destruction collided with a boom in high-priced oil production, and sent me chasing after a book I read almost three decades ago. A few days ago, accordingly,  the excellent interlibrary loan service we have here in Maryland brought me a hefty 1985 hardback by financial journalist Philip Zweig, with the engaging title Belly Up: The Collapse of the Penn Square Bank. Some of my readers may never have heard of the Penn Square Bank; others may be scratching their heads, trying to figure out why the name sounds vaguely familiar. Those of my readers who belong to either category may want to listen up, because the same story seems to be repeating itself right now on an even larger scale.

The tale begins in the middle years of the 1970s, when oil prices shot up to unprecedented levels, and reserves of oil and natural gas that hadn’t been profitable before suddenly looked like winning bets. The deep strata of Oklahoma’s Anadarko basin were ground zero for what many people thought was a new era in natural gas production, especially when a handful of deep wells started bringing in impressive volumes of gas. The only missing ingredient was cash, and plenty of it, to pay for the drilling and hardware. That’s where the Penn Square Bank came into the picture.

The Penn Square Bank was founded in 1960. At that time, as a consequence of hard-earned suspicions about big banks dating back to the Populist era, Oklahoma state banking laws prohibited banks from owning more than one branch, and so there were hundreds of little one-branch banks scattered across the state, making a modest return from home mortgages, auto loans, and the like. That’s what Penn Square was; it had been organized by the developer of the Penn Square shopping mall, in the northern suburbs of Oklahoma City, to provide an additional draw to retailers and customers. There it sat, in between a tobacconist and Shelley’s Tall Girl’s Shop, doing ordinary retail banking, until 1975.

In that year it was bought by a group of investors headed by B.P. “Beep” Jennings, an Oklahoma City banker who had been passed over for promotion at one of the big banks in town. Jennings pretty clearly wanted to prove that he could run with the big dogs; he was an excellent salesman, but not particularly talented at the number-crunching details that make for long-term success in banking, and he proceeded to demonstrate his strengths and weaknesses in an unforgettable manner. He took the little shopping mall bank and transformed it into a big player in the Oklahoma oil and gas market, which was poised—or so a chorus of industry voices insisted—on the brink of one of history’s great energy booms.

Now of course this involved certain difficulties, which had to be overcome. A small shopping center bank doesn’t necessarily have the financial resources to become a big player in a major oil and gas market, for example. Fortunately for Beep Jennings, one of the grand innovations that has made modern banking what it is today had already occurred; by his time, loans were no longer seen as money that was collected from depositors and loaned out to qualified borrowers, in the expectation that it would be repaid with interest. Rather, loans were (and are) assets, which could (and can) be sold, for cash, to other banks. This is what Penn Square did, and since their loans charged a competitive interest rate and thus promised competitive profits, they were eagerly snapped up by Chase Manhattan, Continental Illinois, Seattle First, and a great many other large and allegedly sophisticated banks. So Penn Square Bank started issuing loans to Oklahoma oil and gas entrepreneurs, a flotilla of other banks around the country proceeded to fund those loans, and to all intents and purposes, the energy boom began.

At least that’s what it looked like. There was a great deal of drilling going on, certainly; the economists insisted that the price of oil and gas would just keep on rising; the local and national media promptly started featuring giddily enthusiastic stories about the stunning upside opportunities in the booming Oklahoma oil and gas business. What’s more, Oklahoma oil and gas entrepreneurs were spending money like nobody’s business, and not just on drilling leases, steel pipe, and the other hardware of the trade. Lear jets, vacation condos in fashionable resorts, and such lower-priced symbols of nouveau richesse as overpriced alligator-hide cowboy boots were much in evidence; so was the kind of high-rolling crassness that only the Sunbelt seems to inspire. Habitués of the Oklahoma oilie scene used to reminisce about one party where one of the attendees stood at the door with a stack of crisp $100 bills in his hand and asked every woman who entered how much she wanted for her clothes: every stitch, then and there, piled up in the entry. Prices varied, but apparently none of them turned down the offer.

It’s only fair to admit that there were a few small clouds marring the otherwise sunny vistas of the late 1970s Oklahoma oil scene. One of them was the difficulty the banks buying loans from Penn Square—the so-called “upstream” banks—had in getting Penn Square to forward all the necessary documents on those loans. Since their banks were making loads of money off the transactions, the people in charge at the upstream banks were unwilling to make a fuss about it, and so their processing staff just had to put up with such minor little paperwork problems as missing or contradictory statements concerning collateral, payments of interest and principal, and so on. 

Mind you, some of the people in charge at those upstream banks seem to have had distinctly personal reasons for not wanting to make a fuss about those minor little paperwork problems. They were getting very large loans from Penn Square on very good terms, entering into partnerships with Penn Square’s favorite oilmen, and in at least some cases attending the clothing-optional parties just mentioned. No one else in the upstream banks seems to have been rude enough to ask too many questions about these activities; those who wondered aloud about them were told, hey, that’s just the way Oklahoma oilmen do business, and after all, the banks were making loads of money off the boom.

All in all, the future looked golden just then. In 1979, the Iranian revolution drove the price of oil up even further; in 1980, Jimmy Carter’s troubled presidency—with its indecisive but significant support for alternative energy and, God help us all, conservation—was steamrollered by Reagan’s massively funded and media-backed candidacy. As the new president took office in January of 1981, promising “morning in America,” the Penn Square bankers, their upstream counterparts, their clients in the Oklahoma oil and gas industry, and everyone else associated with the boom felt confident that happy days were there to stay. After all, the economists insisted that the price of oil and gas would just keep rising for decades to come; the most business-friendly and environment-hostile administration in living memory was comfortably ensconced in the White House; and investors were literally begging to be allowed to get a foot in the door in the Oklahoma boom. What could possibly go wrong?

Then, in 1981, without any fuss at all, the price of oil and natural gas peaked and began to decline.

In retrospect, it’s not difficult to see what happened, though a lot of people since then have put a lot of effort into leaving the lessons of those years unlearnt.  Energy is so central to a modern economy that when the price of energy goes up, every other sector of the economy ends up taking a hit. The rising price of energy functions, in effect, as a hidden tax on all economic activity outside the energy sector, and sends imbalances cascading through every part of the economy. As a result, other economic sectors cut their expenditures on energy as far as they can, either by conservation measures or by such tried and true processes as shedding jobs, cutting production, or going out of business. All this had predictable effects on the price of oil and gas, even though very few people predicted them.

As oil and gas prices slumped, investors started backing away from fossil fuel investments, including the Oklahoma boom. Upstream banks, in turn, started to have second thoughts about the spectacular sums of money they’d poured into Penn Square Bank loans. For the first time since the boom began, hard questions—the sort of questions that, in theory, investors and bankers are supposed to ask as a matter of course when people ask them for money—finally got asked. That’s when the problems began in earnest, because a great many of those questions didn’t have any good answers.

It took until July 5, 1982 for the boom to turn definitively into a bust. That’s the day that  federal bank regulators, after several years of inconclusive fumbling and a month or so of increasing panic, finally shut down the Penn Square Bank. What they discovered, as they dug through the mass of fragmentary, inaccurate, and nonexistent paperwork, was that Penn Square had basically been lending money to anybody in the oil and gas industry who wanted some, without taking the trouble to find out if the borrowers would ever be able to repay it. When payments became a problem, Penn Square obligingly loaned out the money to make their payments, and dealt with loans that went bad by loaning deadbeat borrowers even more money, so they could clear their debts and maintain their lifestyles.

The oil and gas boom had in fact been nothing of the kind, as a good many of the firms that had been out there producing oil and gas had been losing money all along.  Rather, it was a Ponzi scheme facilitated by delusional lending practices.  All those Lear jets, vacation condos, alligator-skin cowboy boots, heaps of slightly used women’s clothing, and the rest of it? They were paid for by money from investors and upstream banks, some of it via the Penn Square Bank, the rest from other banks and investors. The vast majority of the money was long gone; the resulting crash brought half a dozen major banks to their knees, and plunged Oklahoma and the rest of the US oil belt into a savage recession that gripped the region for most of a decade.

That was the story chronicled in Zweig’s book, which I reread  over a few quiet evenings last week. Do any of the details seem familiar to you? If not, dear reader, you need to get out more.

As far as I know, the fracking bubble that’s now well into its denouement didn’t have a single ineptly run bank at its center, as the Oklahoma oil and gas bubble did. Most of the other details of that earlier fiasco, though, were present and accounted for. Sky-high fuel prices, check; reserves unprofitable at earlier prices that suddenly looked like a winning deal, check; a media frenzy that oversold the upside and completely ignored the possibility of a downside, check; vast torrents of money and credit from banks and investors too dazzled by the thought of easy riches to ask the obvious questions, check; a flurry of drilling companies that lost money every single quarter but managed to stay in business by heaping up mountains of unpayable debt, check. Pretty much every square on the bingo card marked “economic debacle” has been filled in with a pen dipped in fracking fluid.

Now of course a debacle of the Penn Square variety requires at least one other thing, which is a banking industry so fixated on this quarter’s profits that it can lose track of the minor little fact that lending money to people who can’t pay it back isn’t a business strategy with a long shelf life. I hope none of my readers are under the illusion that this is lacking just now. With interest rates stuck around zero and people and institutions that live off their investments frantically hunting for what used to count as a normal rate of return, the same culture of short-term thinking and financial idiocy that ran the global economy into the ground in the 2008 real estate crash remains firmly in place, glued there by the refusal of the Obama administration and its equivalents elsewhere to prosecute even the most egregious cases of fraud and malfeasance.

Now that the downturn in oil prices is under way, and panic selling of energy-related junk bonds and lower grades of unconventional crude oil has begun in earnest, it seems likely that we’ll learn just how profitable the fracking fad of the last few years actually was. My working guess, which is admittedly an outsider’s view based on limited data and historical parallels, is that it was a money-losing operation from the beginning, and looked prosperous—as the Oklahoma boom did—only because it attracted a flood of investment money from people and institutions who were swept up in the craze. If I’m right, the spike in domestic US oil production due to fracking was never more than an artifact of fiscal irresponsibility in the first place, and could not have been sustained no matter what. Still, we’ll see.

The more immediate question is just how much damage the turmoil now under way will do to a US and global economy that have never recovered from the body blow inflicted on them by the real estate bubble that burst in 2008. Much depends on exactly who sank how much money into fracking-related investments, and just how catastrophically those investments come unraveled.  It’s possible that the result could be just a common or garden variety recession; it’s possible that it could be quite a bit more. When the tide goes out, as Warren Buffett has commented, you find out who’s been swimming naked, and just how far the resulting lack of coverage will extend is a question of no small importance.

At least three economic sectors outside the fossil fuel industry, as I see it, stand to suffer even if all we get is an ordinary downturn. The first, of course, is the financial sector. A vast amount of money was loaned to the fracking industry; another vast amount—I don’t propose to guess how it compares to the first one—was accounted for by issuing junk bonds, and there was also plenty of ingenious financial architecture of the sort common in the housing boom. Those are going to lose most or all of their value in the months and years ahead. No doubt the US government will bail out its pals in the really big banks again, but there’s likely to be a great deal of turmoil anyway, and midsized and smaller players may crash and burn in a big way. One way or another, it promises to be entertaining.

The second sector I expect to take a hit is the renewable energy sector.  In the 1980s, as prices of oil and natural gas plunged, they took most of the then-burgeoning solar and wind industries with them. There were major cultural shifts at the same time that helped feed the abandonment of renewable energy, but the sheer impact of cheap oil and natural gas needs to be taken into account. If, as seems likely, we can expect several years of lower energy prices, and several years of the kind of economic downdraft that makes access to credit for renewable-energy projects a real challenge, a great many firms in the green sector will struggle for survival, and some won’t make it.

Those renewable-energy firms that pull through will find a substantial demand for their services further down the road, once the recent talk about Saudi America finds its proper home in the museum of popular delusions next to perpetual motion machines and Piltdown Man, and the US has to face a future without the imaginary hundred-year reserve of fracked natural gas politicians were gabbling about not that long ago. Still, it’s going to take some nimble footwork to get there; my guess is that those firms that get ready to do without government subsidies and tax credits, and look for ways to sell low-cost homescale systems in an era of disintegrating energy infrastructure, will do much better than those that cling to the hope of government subsidies and big corporate contracts.

The third sector I expect to land hard this time around is the academic sector. Yes, I know, it’s not fashionable to talk of the nation’s colleges and universities as an economic sector, but let’s please be real; in today’s economy, the academic industry functions mostly as a sales office for predatory loans, which are pushed on unwary consumers using deceptive marketing practices. The vast majority of people who are attending US universities these days, after all, will not prosper as a result; in fact, they will never recover financially from the burden of their student loans, since the modest average increase in income that will come to those graduates who actually manage to find jobs will be dwarfed by the monthly debt service they’ll have to pay for decades after graduation.

One of the core reasons why the academic industry has become so vulnerable to a crash is that most colleges and universities rely on income from their investments to pay their operating expenses, and income from investments has taken a double hit in the last decade. First, the collapse of interest rates to near-zero (and in some cases, below-zero) levels has hammered returns across the spectrum of investment vehicles. As a result, colleges and universities have increasingly put their money into risky investments that promise what used to be ordinary returns, and this drove the second half of the equation; in the wake of the 2008 real estate crash, many colleges and universities suffered massive losses of endowment funds, and most of these losses have never been made good.

Did the nation’s colleges and universities stay clear of the fracking bubble?  That would have required, I think, far more prudence and independent thinking than the academic industry has shown of late. Those institutions that had the common sense to get out of fossil fuels for ecological reasons may end up reaping a surprising benefit; the rest, well, here again we’ll have to wait and see. My working guess, which is once again an outsider’s guess based on limited data and historical parallels, is that a great many institutions tried to bail themselves out from the impact of the real estate bust by doubling down on fracking. If that’s what happened, the looming crisis in American higher education—a crisis driven partly by the predatory loan practices mentioned earlier, partly by the jaw-dropping inflation in the price of a college education in recent decades, and partly by rampant overbuilding of academic programs—will be hitting shortly, and some very big names in the academic industry may not survive the impact.

As Yogi Berra liked to point out, it’s hard to make predictions, especially about the future. Still, it looks as though we may be in the opening stages of a really ugly fiscal crisis, and I’d encourage my readers to take that possibility seriously and act accordingly.

Wednesday, December 10, 2014

Dark Age America: The Sharp Edge of the Shell

One of the interesting features of blogging about the twilight of science and technology these days is that there’s rarely any need to wait long for a cogent example. One that came my way not long ago via a reader of this blog—tip of the archdruidical hat to Eric S.—shows that not even a science icon can get away with asking questions about the rising tide of financial corruption and dogmatic ideology that’s drowning the scientific enterprise in our time.

Many of my readers will recall Bill Nye the Science Guy, the star of a television program on science in the 1990s and still a vocal and entertaining proponent of science education. In a recent interview, Nye was asked why he doesn’t support the happy-go-lucky attitude toward dumping genetically modified organisms into the environment that’s standard in the United States and a few other countries these days. His answer  is that their impact on ecosystems is a significant issue that hasn’t been adequately addressed. Those who know their way around today’s pseudoskeptic scene won’t be surprised by the reaction from one of Discover Magazine’s bloggers: a tar and feathers party, more or less, full of the standard GMO industry talking points and little else.

Nye’s point, as it happens, is as sensible as it is scientific: ecosystems are complex wholes that can be thrown out of balance by relatively subtle shifts, and since human beings depend for their survival and prosperity on the products of natural ecosystems, avoiding unnecessary disruption to those systems is arguably a good idea. This eminently rational sort of thinking, though, is not welcomed in corporate boardrooms just now.  In the case under discussion, it’s particularly unwelcome in the boardrooms of  corporations heavily invested in genetic modification, which have a straightforward if shortsighted financial interest in flooding the biosphere with as many GMOs as they can sell.

Thus it’s reasonable that Monsanto et al. would scream bloody murder in response to Nye’s comment. What interests me is that so many believers in science should do the same, and not only in this one case. Last I checked, “what makes the biggest profit for industry must be true” isn’t considered a rule of scientific reasoning, but that sort of thinking is remarkably common in what passes for skepticism these days. To cite an additional example, it’s surely not accidental that there’s a 1.00 correlation between the health care modalities that make money for the medical and pharmaceutical industries and the health care modalities that the current crop of soi-disant skeptics consider rational and science-based, and an equal 1.00 correlation between those modalities that don’t make money for the medical and pharmaceutical industries and those that today’s skeptics dismiss as superstitious quackery.

To some extent, this is likely a product of what’s called “astroturfing,” the manufacture of artificial grassroots movements to support the agendas of an industrial sector or a political faction. The internet, with its cult of anonymity and its less than endearing habit of letting every discussion plunge to the lowest common denominator of bullying and abuse, was tailor-made for that sort of activity; it’s pretty much an open secret at this point, or so I’m told by the net-savvy, that most significant industries these days maintain staffs of paid flacks who spend their working hours searching the internet for venues to push messages favorable to their employers and challenge opposing views. Given the widespread lack of enthusiasm for GMOs, Monsanto and its competitors would have to be idiots to neglect such an obvious and commonly used marketing tactic.

Still, there’s more going on here than ordinary media manipulation in the hot pursuit of profits. There are plenty of people who have no financial stake in the GMO industry who defend it fiercely from even the least whisper of criticism, just as there are plenty of people who denounce alternative medicine in ferocious terms even though they don’t happen to make money from the medical-pharmaceutical industrial complex. I’ve discussed in previous posts here, and in a forthcoming book, the way that faith in progress was pressed into service as a substitute for religious belief during the nineteenth century, and continues to fill that role for many people today. It’s not a transformation that did science any good, but its implications as industrial civilization tips over into decline and fall are considerably worse than the ones I’ve explored in previous essays. I want to talk about those implications here, because they have a great deal to say about the future of science and technology in the deindustrializing world of the near future.

It’s important, in order to make sense of those implications, to grasp that science and technology function as social phenomena, and fill social roles, in ways that have more than a little in common with the intellectual activities of civilizations of the past. That doesn’t mean, as some postmodern theorists have argued, that science and technology are purely social phenomena; both of them have to take the natural world into account, and so have an important dimension that transcends the social. That said, the social dimension also exists, and since human beings are social mammals, that dimension has an immense impact on the way that science and technology function in this or any other human society.

From a social standpoint, it’s thus not actually all that relevant that the scientists and engineers of contemporary industrial society can accomplish things with matter and energy that weren’t within the capacities of Babylonian astrologer-priests, Hindu gurus, Chinese literati, or village elders in precontact New Guinea. Each of these groups has been assigned a particular social role, the role of interpreter of Nature, by its respective society, and each of them is accorded substantial privileges for fulfilling the requirements of that role. It’s therefore possible to draw precise and pointed comparisons between the different bodies of people filling that very common social role in different societies.

The exercise is worth doing, not least because it helps sort out the far from meaningless distinction between the aspects of modern science and technology that unfold from their considerable capacities for doing things with matter and energy, and the aspects of modern science and technology that unfold from the normal dynamics of social privilege.  What’s more, since modern science and technology weren’t around in previous eras of decline and fall but privileged intellectual castes certainly were, recognizing the common features that unite today’s scientists, engineers, and promoters of scientific and technological progress with equivalent groups in past civilizations makes it a good deal easier to anticipate the fate of science and technology in the decades and centuries to come.

A specific example will be more useful here than any number of generalizations, so let’s consider the fate of philosophy in the waning years of the Roman world. The extraordinary intellectual adventure we call classical philosophy began in the Greek colonial cities of Ionia around 585 BCE, when Thales of Miletus first proposed a logical rather than a mythical explanation for the universe, and proceeded through three broad stages from there. The first stage, that of the so-called Presocratics, focused on the natural world, and the questions it asked and tried to answer can more or less be summed up as “What exists?”  Its failures and equivocal successes led the second stage, which extended from Socrates through Plato and Aristotle to the Old Academy and its rivals, to focus its attention on different questions, which can be summed up just as neatly as “How can we know what exists?”

That was an immensely fruitful shift in focus. It led to the creation of classical logic—one of the great achievements of the human mind—and it also drove the transformations that turned mathematics from an assortment of rules of thumb to an architecture of logical proofs, and thus laid the foundations on which Newtonian physics and other quantitative sciences eventually built.  Like every other great intellectual adventure of our species, though, it never managed to fulfill all the hopes that had been loaded onto it; the philosopher’s dream of a human society made wholly subject to reason turned out to be just as unreachable as the scientist’s dream of a universe made wholly subject to the human will. As that failure became impossible to ignore, classical philosophy shifted focus again, to a series of questions and attempted answers that amounted to “given what we know about what exists, how should we live?”

That’s the question that drove the last great age of classical philosophy, the age of the Epicureans, the Stoics, and the Neoplatonists, the three philosophical schools I discussed a few months back as constructive personal responses to the fall of our civilization. At first, these and other schools carried on lively and far-reaching debates, but as the Roman world stumbled toward its end under the burden of its own unsolved problems, the philosophers closed ranks; debates continued, but they focused more and more tightly on narrow technical issues within individual schools. What’s more, the schools themselves closed ranks; pure Stoic, Aristotelian, and Epicurean philosophy gradually dropped out of fashion, and by the fourth century CE, a Neoplatonism enriched with bits and pieces of all the other schools stood effectively alone, the last school standing in the long struggle Thales kicked off ten centuries before.

Now I have to confess to a strong personal partiality for the Neoplatonists. It was from Plotinus and Proclus, respectively the first and last great figures in the classical tradition, that I first grasped why philosophy matters and what it can accomplish, and for all its problems—like every philosophical account of the world, it has some—Neoplatonism still makes intuitive sense to me in a way that few other philosophies do. What’s more, the men and women who defended classical Neoplatonism in its final years were people of great intellectual and personal dignity, committed to proclaiming the truth as they knew it in the face of intolerance and persecution that ended up costing no few of them their lives.

The awkward fact remains that classical philosophy, like modern science, functioned as a social phenomenon and filled certain social roles. The intellectual power of the final Neoplatonist synthesis and the personal virtues of its last proponents have to be balanced against its blind support of a deeply troubled social order; in all the long history of classical philosophy, it never seems to have occurred to anyone that debates about the nature of justice might reasonably address, say, the ethics of slavery. While a stonecutter like Socrates could take an active role in philosophical debate in Athens in the fifth century BCE, furthermore, the institutionalization of philosophy meant that by the last years of classical Neoplatonism, its practice was restricted to those with ample income and leisure, and its values inevitably became more and more closely tied to the social class of its practitioners.

That’s the thing that drove the ferocious rejection of philosophy by the underclass of the age, the slaves and urban poor who made up the vast majority of the population throughout the Roman empire, and who received little if any benefit from the intellectual achievements of their society. To them, the subtleties of Neoplatonist thought were irrelevant to the increasingly difficult realities of life on the lower end of the social pyramid in a brutally hierarchical and increasingly dysfunctional world. That’s an important reason why so many of them turned for solace to a new religious movement from the eastern fringes of the empire, a despised sect that claimed that God had been born on earth as a mere carpenter’s son and communicated through his life and death a way of salvation that privileged the poor and downtrodden above the rich and well-educated.

It was as a social phenomenon, filling certain social roles, that Christianity attracted persecution from the imperial government, and it was in response to Christianity’s significance as a social phenomenon that the imperial government executed an about-face under Constantine and took the new religion under its protection. Like plenty of autocrats before and since, Constantine clearly grasped that the real threat to his position and power came from other members of his own class—in his case, the patrician elite of the Roman world—and saw that he could undercut those threats and counter potential rivals through an alliance of convenience with the leaders of the underclass. That’s the political subtext of the Edict of Milan, which legalized Christianity throughout the empire and brought it imperial patronage.

The patrician class of late Roman times, like its equivalent today, exercised power through a system of interlocking institutions from which outsiders were carefully excluded, and it maintained a prickly independence from the central government.  By the fourth century, tensions between the bureaucratic imperial state and the patrician class, with its local power bases and local loyalties, were rising toward a flashpoint.  The rise of Christianity thus gave Constantine and his successors an extraordinary opportunity.  Most of the institutions that undergirded patrician power were linked to Pagan religion; local senates, temple priesthoods, philosophical schools, and other elements of elite culture normally involved duties drawn from the traditional faith. A religious pretext to strike at those institutions must have seemed as good as any other, and the Christian underclass offered one other useful feature: mobs capable of horrific acts of violence against prominent defenders of the patrician order.

That was why, for example, a Christian mob in 415 CE dragged the Neoplatonist philosopher Hypatia from her chariot as she rode home from her teaching gig at the Academy in Alexandria, cudgeled her to death, cut the flesh from her bones with sharpened oyster shells—the cheap pocket knives of the day—and burned the bloody gobbets to ashes. What doomed Hypatia was not only her defense of the old philosophical traditions, but also her connection to Alexandria’s patrician class; her ghastly fate was as much the vengeance of the underclass against the elite as it was an act of religious persecution. She was far from the only victim of violence driven by those paired motives, either. It was as a result of such pressures that, by the time the emperor Justinian ordered the last academies closed in 529 CE, the classical philosophical tradition was essentially dead.

That’s the sort of thing that happens when an intellectual tradition becomes too closely affiliated with the institutions, ideologies, and interests of a social elite. If the elite falls, so does the tradition—and if it becomes advantageous for anyone else to target the elite, the tradition can be a convenient target, especially if it’s succeeded in alienating most of the population outside the elite in question.

Modern science is extremely vulnerable to such a turn of events. There was a time when the benefits of scientific research and technological development routinely reached the poor as well as the privileged, but that time has long since passed; these days, the benefits of research and development move up the social ladder, while the costs and negative consequences move down. Nearly all the jobs eliminated by automation, globalization, and the computer revolution, for example, were once filled from the bottom end of the job market. In the same way, changes in US health care in recent decades have benefited the privileged while subjecting most others to substandard care at prices so high that medical bills are the leading cause of bankruptcy in the US today.

It’s all very well for the promoters of progress to gabble on about science as the key to humanity’s destiny; the poor know that the destiny thus marketed isn’t for them.  To the poor, progress means fewer jobs with lower pay and worse conditions, more surveillance and impersonal violence carried out by governments that show less and less interest in paying even lip service to the concept of civil rights, a rising tide of illnesses caused by environmental degradation and industrial effluents, and glimpses from afar of an endless stream of lavishly advertised tech-derived trinkets, perks and privileges that they will never have. Between the poor and any appreciation for modern science stands a wall made of failed schools, defunded libraries, denied opportunities, and the systematic use of science and technology to benefit other people at their expense. Such a wall, it probably bears noting, makes a good surface against which to sharpen oyster shells.

It seems improbable that anything significant will be done to change this picture until it’s far too late for such changes to have any meaningful effect. Barring dramatic transformations in the distribution of wealth, the conduct of public education, the funding for such basic social amenities as public libraries, and a great deal more, the underclass of the modern industrial world can be expected to grow more and more disenchanted with science as a social phenomenon in our culture, and to turn instead—as their equivalents in the Roman world and so many other civilizations did—to some tradition from the fringes that places itself in stark opposition to everything modern scientific culture stands for. Once that process gets under way, it’s simply a matter of waiting until the corporate elite that funds science, defines its values, and manipulates it for PR purposes, becomes sufficiently vulnerable that some other power center decides to take it out, using institutional science as a convenient point of attack.

Saving anything from the resulting wreck will be a tall order. Still, the same historical parallel discussed above offers some degree of hope. The narrowing focus of classical philosophy in its last years meant, among other things, that a substantial body of knowledge that had once been part of the philosophical movement was no longer identified with it by the time the cudgels and shells came out, and much of it was promptly adopted by Christian clerics and monastics as useful for the Church. That’s how classical astronomy, music theory, and agronomy, among other things, found their way into the educational repertoire of Christian monasteries and nunneries in the dark ages. What’s more, once the power of the patrician class was broken, a carefully sanitized version of Neoplatonist philosophy found its way into Christianity; in some denominations, it’s still a living presence today.

That may well happen again. Certainly today’s defenders of science are doing their best to shove a range of scientific viewpoints out the door; the denunciation meted out to Bill Nye for bringing basic concepts from ecology into a discussion where they were highly relevant is par for the course these days. There’s an interesting distinction between the sciences that get this treatment and those that don’t: on the one hand, those that are being flung aside are those that focus on observation of natural systems rather than control of artificial ones; on the other, any science that raises doubts about the possibility or desirability of infinite technological expansion can expect to find itself shivering in the dark outside in very short order. (This latter point applies to other fields of intellectual endeavor as well; half the angry denunciations of philosophy you’ll hear these days from figures such as Neil deGrasse Tyson, I’m convinced, come out of the simple fact that the claims of modern science to know objective truths about nature won’t stand up to fifteen minutes of competent philosophical analysis.)

Thus it’s entirely possible that observational sciences, if they can squeeze through the bottleneck imposed by the loss of funding and prestige, will be able to find a new home in whatever intellectual tradition replaces modern scientific rationalism in the deindustrial future. It’s at least as likely that such dissident sciences as ecology, which has always raised challenging questions about the fantasies of the manipulative sciences, may find themselves eagerly embraced by a future intellectual culture that has no trouble at all recognizing the futility of those fantasies. That said, it’s still going to take some hard work to preserve what’s been learnt in those fields—and it’s also going to take more than the usual amount of prudence and plain dumb luck not to get caught up in the conflict when the sharp edge of the shell gets turned on modern science.

Wednesday, December 3, 2014

Dark Age America: The Fragmentation of Technology

It was probably inevitable that last week’s discussion of the way that contemporary science is offering itself up as a sacrifice on the altar of corporate greed and institutional arrogance would bring in a flurry of responses insisting that I must hate science.  This is all the more ironic in that the shoddy logic involved in that claim also undergirded George W. Bush’s famous and fatuous insistence that the Muslim world is riled at the United States because “they hate our freedom.”

In point of fact, the animosity felt by many Muslims toward the United States is based on specific grievances concerning specific acts of US foreign policy. Whether or not those grievances are justified is a matter I don’t propose to get into here; the point that’s relevant to the current discussion is that the grievances exist, they relate to identifiable actions on the part of the US government, and insisting that the animosity in question is aimed at an abstraction instead is simply one of the ways that Bush, or for that matter his equally feckless successor, has tried to sidestep any discussion of the means, ends, and cascading failures of US policy toward the Middle East and the rest of the Muslim world.

In the same way, it’s very convenient to insist that people who ask hard questions about the way that contemporary science has whored itself out to economic and political interests, or who have noticed gaps between the claims about reality made by the voices of the scientific mainstream and their own lived experience of the world, just hate science. That evasive strategy makes it easy to brush aside questions about the more problematic dimensions of science as currently practiced. This isn’t a strategy with a long shelf life; responding to a rising spiral of problems by insisting that the problems don’t exist and denouncing those who demur is one of history’s all-time bad choices, but intellectuals in falling civilizations all too often try to shore up the crumbling foundations of their social prestige and privilege via that foredoomed approach.

Central to the entire strategy is a bit of obfuscation that treats “science” as a monolithic unity, rather than as the complex and rather ramshackle grab-bag of fields of study, methods of inquiry, and theories about how different departments of nature appear to work. There’s no particular correlation between, let’s say, the claims made for the latest heavily marketed and dubiously researched pharmaceutical, on the one hand, and the facts of astronomy, evolutionary biology, or agronomy on the other; and someone can quite readily find it impossible to place blind faith in the pharmaceutical and the doctor who’s pushing it on her, while enjoying long nights observing the heavens through a telescope, delighting in the elegant prose and even more elegant logic of Darwin’s The Origin of Species, or running controlled experiments in her backyard on the effectiveness of compost as a soil amendment. To say that such a person “hates science” is to descend from meaningful discourse to thought-stopping noise.

The habit of insisting that science is a single package, take it or leave it, is paralleled by the equivalent and equally specious insistence that there is this single thing called “technology,” that objecting to any single component of that alleged unity amounts to rejecting all of it, and that you’re not allowed to pick and choose among technologies—you have to take all of it or reject it all. I field this sort of nonsense all the time. It so happens, for example, that I have no interest in owning a cell phone, never got around to playing video games, and have a sufficiently intense fondness for books printed on actual paper that I’ve never given more than a passing thought to the current fad for e-books.

I rarely mention these facts to those who don’t already know them, because it’s a foregone conclusion that if I do so, someone will ask me whether I hate technology.  Au contraire, I’m fond of slide rules, love rail travel, cherish an as yet unfulfilled ambition to get deep into letterpress printing, and have an Extra class amateur radio license; all these things entail enthusiastic involvement with specific technologies, and indeed affection for them; but if I mention these points in response to the claim that I must hate technology, the responses I get range from baffled incomprehension to angry dismissal.

“Technology,” in the mind of those who make such claims, clearly doesn’t mean what the dictionary says it means.  To some extent, of course, it amounts to whatever an assortment of corporate and political marketing firms want you to buy this week, but there’s more to it than that. Like the word “science,” “technology” has become a buzzword freighted with a vast cargo of emotional, cultural, and (whisper this) political meanings.  It’s so densely entangled with passionately felt emotions, vast and vague abstractions, and frankly mythic imagery that many of those who use the word can’t explain what they mean by it, and get angry if you ask them to try.

The flattening out of the vast diversity of technologies, in the plural, into a single monolithic shape guarded by unreasoning emotions would be problematic under any conditions. When a civilization that depends on the breakneck exploitation of nonrenewable resources is running up against the unyielding limits of a finite planet, with resource depletion and pollution in a neck-and-neck race to see which one gets to bring the industrial project to an end first, it’s a recipe for disaster. A sane response to the predicament of our time would have to start by identifying the technological suites that will still be viable in a resource-constrained and pollution-damaged environment, and then shift as much vital infrastructure to those as possible with the sharply limited resources we have left. Our collective thinking about technology is so muddled by unexamined emotions, though, that it doesn’t matter how obviously necessary such a project might be: it remains unthinkable.

Willy-nilly, though, the imaginary monolith of “technology” is going to crumble, because different technologies have wildly varying resource requirements, and they vary just as drastically in terms of their importance to the existing order of society. As resource depletion and economic contraction tighten their grip on the industrial world, the stock of existing and proposed technologies faces triage in a continuum defined by two axes—the utility of the technology, on the one hand, and its cost in real (i.e., nonfinancial) terms on the other. A chart may help show how this works.


 This is a very simplified representation of the frame in which decisions about technology are made. Every kind of utility from the demands of bare survival to the whims of fashion is lumped in together and measured on the vertical axis, and every kind of nonfinancial cost from energy and materials straight through to such intangibles as opportunity cost is lumped in together and measured on the horizontal axis. In an actual analysis, of course, these variables would be broken out and considered separately; the point of a more schematic view of the frame, like this one, is that it allows the basic concepts to be grasped more easily.

The vertical and horizontal lines that intersect in the middle of the graph are similarly abstractions from a complex reality. The horizontal line represents the boundary between those technologies which have enough utility to be worth building and maintaining, which are above the line, and those which have too little utility to be worth the trouble, which are below it. The vertical line represents the boundary between those technologies which are affordable and those that are not. In the real world, those aren’t sharp boundaries but zones of transition, with complex feedback loops weaving back and forth among them, but again, this is a broad conceptual model.

The intersection of the lines divides the whole range of technology into four categories, which I’ve somewhat unoriginally marked with the first four letters of the alphabet. Category A consists of things that are both affordable and useful, such as indoor plumbing. Category B consists of things that are affordable but useless, such as electrically heated underwear for chickens. Category C consists of things that are useful but unaffordable, such as worldwide 30-minute pizza delivery from low earth orbit. Category D, rounding out the set, consists of things that are neither useful nor affordable, such as—well, I’ll let my readers come up with their own nominees here.

Now of course the horizontal and vertical lines aren’t fixed; they change position from one society to another, from one historical period to another, and indeed from one community, family, or individual to another. (To me, for example, cell phones belong in category B, right next to the electrically heated chicken underwear; other people would doubtless put them somewhere else on the chart.) Every society, though, has a broad general consensus about what goes in which category, which is heavily influenced by but by no means entirely controlled by the society’s political class.  That consensus is what guides its collective decisions about funding or defunding technologies.
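For readers who find it easier to see a frame like this made concrete, here is a minimal sketch in Python of the triage logic just described. The utility and cost scores, the two thresholds, and the sample technologies are all hypothetical placeholders invented for illustration, not measurements of anything; the point is only how the two lines carve the chart into four categories.

```python
# Toy model of the utility/cost frame described above.
# Scores and thresholds are arbitrary placeholders on a 0-10 scale.

UTILITY_THRESHOLD = 5.0   # the horizontal line: worth building and maintaining
COST_THRESHOLD = 5.0      # the vertical line: affordable in real (nonfinancial) terms

def categorize(utility, cost):
    """Return the category (A-D) for a technology, given its utility and real cost."""
    if utility >= UTILITY_THRESHOLD and cost <= COST_THRESHOLD:
        return "A"  # useful and affordable, e.g. indoor plumbing
    if utility < UTILITY_THRESHOLD and cost <= COST_THRESHOLD:
        return "B"  # affordable but useless, e.g. heated chicken underwear
    if utility >= UTILITY_THRESHOLD and cost > COST_THRESHOLD:
        return "C"  # useful but unaffordable, e.g. orbital pizza delivery
    return "D"      # neither useful nor affordable

# Sample technologies with made-up (utility, cost) scores:
examples = {
    "indoor plumbing": (9.0, 3.0),
    "electrically heated chicken underwear": (1.0, 2.0),
    "30-minute orbital pizza delivery": (7.0, 9.5),
}

for name, (utility, cost) in examples.items():
    print(f"{name}: category {categorize(utility, cost)}")
```

Shifting the two thresholds, as the second and third charts do, reclassifies the same technologies without changing anything about the technologies themselves; that, in miniature, is what the rest of this discussion traces.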


With the coming of the industrial revolution, both of the lines shifted substantially from their previous position, as shown in the second chart. Obviously, the torrent of cheap abundant energy gave the world’s industrial nations access to an unparalleled wealth of resources, and this pushed the dividing line between what was affordable and what was unaffordable quite a ways over toward the right hand side of the chart. A great many things that had been desirable but unaffordable to previous civilizations swung over from category C into category A as fossil fuels came on line. This has been discussed at great length here and elsewhere in the peak oil blogosphere.

Less obviously, the dividing line between what was useful and what was useless also shifted quite a bit toward the bottom of the chart, moving a great many things from category B into category A. To follow this, it’s necessary to grasp the concept of technological suites. A technological suite is a set of interdependent technologies that work together to achieve a common purpose. Think of the relationship between cars and petroleum drilling, computer chips and the clean-room filtration systems required for their manufacture, or commercial airliners and ground control radar. What connects each pair of technologies is that they belong to the same technological suite. If you want to have the suite, you must either have all the elements of the suite in place, or be ready to replace any absent element with something else that can serve the same purpose.

For the purpose of our present analysis, we can sort out the component technologies of a technological suite into three very rough categories. There are interface technologies, which are the things with which the end user interacts—in the three examples just listed, those would be private cars, personal computers, and commercial flights to wherever you happen to be going. There are support technologies, which are needed to produce, maintain, and operate the interface technologies; they make up far and away the majority of technologies in a technological suite—consider the extraordinary range of technologies it takes to manufacture a car from raw materials, maintain it, fuel it, provide it with roads on which to drive, and so on. Some interface technologies and most support technologies can be replaced with other technologies as needed, but some of both categories can’t; we can call those that can’t be replaced bottleneck technologies, for reasons that will become clear shortly.

What makes this relevant to the charts we’ve been examining is that most support technologies have no value aside from the technological suites to which they belong and the interface technologies they serve. Without commercial air travel, for example, most of the specialized technologies found at airports are unnecessary. Thus a great many things that once belonged in category B—say, automated baggage carousels—shifted into category A with the emergence of the technological suite that gave them utility. Category A thus ballooned with the coming of industrialization, and it kept getting bigger as long as energy and resource use per capita in the industrial nations kept on increasing.

Once energy and resource use per capita peak and begin their decline, though, a different reality comes into play, leading over time to the situation shown in the third chart.


 As cheap abundant energy runs short, and it and all its products become expensive, scarce, or both, the vertical line slides inexorably toward the left. That’s obvious enough. Less obviously, the horizontal line also slides upwards. The reason, here again, is the interrelationship of individual technologies into technological suites. If commercial air travel stops being economically viable, the support technologies that belong to that suite are no longer needed. Even if they’re affordable enough to stay on the left hand side of the vertical line, the technologies needed to run automated baggage carousels thus no longer have enough utility to keep them above the horizontal line, and down they drop into category B.

That’s one way that a technology can drop out of use. It’s just as possible, of course, for something that would still have ample utility to cost too much in terms of real wealth to be an option in a contracting society, and slide across the border into category C. Finally, it’s possible for something to do both at once—to become useless and unaffordable at something like the same time, as economic contraction takes away both the ability to pay for the technology and the ability to make use of it.

It’s also possible for a technology that remains affordable, and participates in a technological suite that’s still capable of meeting genuine needs, to tumble out of category A into one of the others. This can happen because the costs of different technologies differ qualitatively, and not just quantitatively. If you need small amounts of niobium for the manufacture of blivets, and the handful of niobium mines around the world stop production—whether this happens because the ore has run out, or for some other reason, environmental, political, economic, cultural, or what have you—you aren’t going to be able to make blivets any more. That’s one kind of difficulty if it’s possible to replace blivets with something else, or substitute some other rare element for the niobium; it’s quite another, and much more challenging, if blivets made with niobium are the only thing that will work for certain purposes, or the only thing that makes those purposes economically viable.

It’s habitual in modern economics to insist that such bottlenecks don’t exist, because there’s always a viable alternative. That sort of thinking made a certain degree of sense back when energy per capita was still rising, because the standard way to get around material shortages for a century now has been to throw more energy, more technology, and more complexity into the mix. That’s how low-grade taconite ores with scarcely a trace of iron in them have become the mainstay of today’s iron and steel industry; all you have to do is add fantastic amounts of cheap energy, soaring technological complexity, and an assortment of supply and resource chains reaching around the world and then some, and diminishing ore quality is no problem at all.

It’s when you don’t have access to as much cheap energy, technological complexity, and baroque supply chains as you want that this sort of logic becomes impossible to sustain. Once this point is reached, bottlenecks become an inescapable feature of life. The bottlenecks, as already suggested, don’t have to be technological in nature—a bottleneck technology essential to a given technological suite can be perfectly feasible, and still out of reach for other reasons—but whatever generates them, they throw a wild card into the process of technological decline that shapes the last years of a civilization on its way out, and the first few centuries of the dark age that follows.

The crucial point to keep in mind here is that one bottleneck technology, if it becomes inaccessible for any reason, can render an entire technological suite useless, and compromise other technological suites that depend on the one directly affected. Consider the twilight of ceramics in the late Roman empire. Rome’s ceramic industry operated on as close to an industrial scale as you can get without torrents of cheap abundant energy; regional factories in various places, where high-quality clay existed, produced ceramic goods in vast amounts and distributed them over Roman roads and sea lanes to the far corners of the empire and beyond it. The technological suite that supported Roman dishes and roof tiles thus included transport technologies, and those turned out to be the bottleneck: as long-distance transport went away, the huge ceramic factories could no longer market their products and shut down, taking with them every element of their technological suite that couldn’t be repurposed in a hurry.

The same process affected many other technologies that played a significant role in the Roman world, and for that matter in the decline and fall of every other civilization in history. The end result can best be described as technological fragmentation: what had been a more or less integrated whole system of technology, composed of many technological suites working together more or less smoothly, becomes a jumble of disconnected technological suites, nearly all of them drastically simplified compared to their pre-decline state, and many of them jerry-rigged to make use of still-viable fragments of technological suites whose other parts didn’t survive their encounter with one bottleneck or another.  In places where circumstances permit, relatively advanced technological suites can remain in working order long after the civilization that created them has perished—consider the medieval cities that got their water from carefully maintained Roman aqueducts a millennium after Rome’s fall—while other systems operate at far simpler levels, and other regions and communities get by with much simpler technological suites.

All this has immediate practical importance for those who happen to live in a civilization that’s skidding down the curve of its decline and fall—ours, for example. In such a time, as noted above, one critical task is to identify the technological suites that will still be viable in the aftermath of the decline, and shift as much vital infrastructure as possible over to depend on those suites rather than on those that won’t survive the decline. In terms of the charts above, that involves identifying those technological suites that will still be in category A when the lines stop shifting up and to the left, figuring out how to work around any bottleneck technologies that might otherwise cripple them, and getting the necessary knowledge into circulation among those who might be able to use it, so that access to information doesn’t become a bottleneck of its own.
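By way of illustration, here is a similarly hedged sketch extending the toy model above to suites and bottlenecks. The suites, their component lists, and the inventory of technologies assumed to remain available are all invented for the example, not drawn from any actual analysis; the sketch only shows the logic by which a single lost bottleneck technology pulls a whole suite, and its stranded support technologies, out of category A.

```python
# Toy model of technological suites and bottleneck technologies.
# A suite stays viable only if every one of its bottleneck technologies
# remains available; support technologies that served only that suite
# lose their utility when the suite fails.

suites = {
    "commercial air travel": {
        "components": ["airliners", "ground control radar",
                       "automated baggage carousels", "jet fuel supply"],
        "bottlenecks": ["jet fuel supply", "ground control radar"],
    },
    "gravity-fed water supply": {
        "components": ["aqueducts", "clay pipe", "local masonry"],
        "bottlenecks": ["local masonry"],
    },
}

# Hypothetical inventory of technologies still available after contraction:
still_available = {"aqueducts", "clay pipe", "local masonry",
                   "automated baggage carousels"}

def viable(suite):
    """A suite survives only if all its bottleneck technologies are still available."""
    return all(tech in still_available for tech in suite["bottlenecks"])

for name, suite in suites.items():
    if viable(suite):
        print(f"{name}: still viable")
    else:
        # Support technologies stranded when the suite they served fails
        # drop out of category A even if they remain affordable.
        stranded = [t for t in suite["components"] if t in still_available]
        print(f"{name}: fails at a bottleneck; stranded components: {stranded}")
```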

That sort of analysis, triage, and salvage is among the most necessary tasks of our time, especially for those who want to see viable technologies survive the end of our civilization, and it’s being actively hindered by the insistence that the only possible positive attitude toward technology is sheer blind faith. For connoisseurs of irony, it’s hard to think of a more intriguing spectacle. The impacts of that irony on the future, though, are complex, and will be the subject of several upcoming posts here.