AODA Blog

The Era of Impact

Wed, 2015-05-20 15:03
Of all the wistful superstitions that cluster around the concept of the future in contemporary popular culture, the most enduring has to be the notion that somehow, sooner or later, something will happen to shake the majority out of its complacency and get it to take seriously the crisis of our age. Week after week, I field comments and emails that presuppose that belief. People want to know how soon I think the shock of awakening will finally hit, or wonder whether this or that event will do the trick, or simply insist that the moment has to come sooner or later.
To all such inquiries and expostulations I have no scrap of comfort to offer. Quite the contrary, what history shows is that a sudden awakening to the realities of a difficult situation is far and away the least likely result of what I’ve called the era of impact, the second of the five stages of collapse. (The first, for those who missed last week’s post, is the era of pretense; the remaining three, which will be covered in the coming weeks, are the eras of response, breakdown, and dissolution.)
The era of impact is the point at which it becomes clear to most people that something has gone wrong with the most basic narratives of a society—not just a little bit wrong, in the sort of way that requires a little tinkering here and there, but really, massively, spectacularly wrong. It arrives when an asset class that was supposed to keep rising in price forever stops rising, does its Wile E. Coyote moment of hang time, and then drops like a stone. It shows up when an apparently entrenched political system, bristling with soldiers and secret police, implodes in a matter of days or weeks and is replaced by a provisional government whose leaders look just as stunned as everyone else. It comes whenever a state of affairs that was assumed to be permanent runs into serious trouble—but somehow it never seems to succeed in getting people to notice just how temporary that state of affairs always was.
Since history is the best guide we’ve got to how such events work out in the real world, I want to take a couple of examples of the kind just outlined and explore them in a little more detail. The stock market bubble of the 1920s makes a good case study on a relatively small scale. In the years leading up to the crash of 1929, share values on the US stock market quietly disconnected themselves from the economic fundamentals and began what was, for the time, an epic climb into la-la land. There were important if unmentionable reasons for that airy detachment from reality; the most significant was the increasingly distorted distribution of income in 1920s America, which put more and more of the national wealth in the hands of fewer and fewer people and thus gutted the national economy.
It’s one of the repeated lessons of economic history that money in the hands of the rich does much less good for the economy as a whole than money in the hands of the working classes and the poor. The reasoning here is as simple as it is inescapable. Industrial economies survive and thrive on consumer expenditures, but consumer expenditures are limited by the ability of consumers to buy the things they want and need. As money is diverted away from the lower end of the economic pyramid, you get demand destruction—the process by which those who can’t afford to buy things stop buying them—and consumer expenditures fall off. The rich, by contrast, divert a large share of their income out of the consumer economy into investments; the richer they get, the more of the national wealth ends up in investments rather than consumer expenditures; and as consumer expenditures falter, and investments linked to the consumer economy falter in turn, more and more money ends up in illiquid speculative vehicles that are disconnected from the productive economy and do nothing to stimulate demand.
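The mechanism is easy to show with a toy model. In the sketch below, each income bracket spends a fixed fraction of what it receives (its marginal propensity to consume); the brackets, fractions, and dollar figures are illustrative assumptions rather than historical data, but they show how shifting the same total income upward shrinks consumer demand:
```python
# Toy model: the same national income yields less consumer demand
# when it's concentrated at the top. The marginal propensities to
# consume (MPC) and bracket incomes below are illustrative
# assumptions, not measured 1920s figures.

MPC = {"poor": 0.95, "middle": 0.80, "rich": 0.40}

def consumer_spending(income_by_bracket):
    """Total consumer spending implied by an income distribution."""
    return sum(income * MPC[bracket]
               for bracket, income in income_by_bracket.items())

broad = {"poor": 300, "middle": 400, "rich": 300}   # $bn, widely shared
skewed = {"poor": 200, "middle": 300, "rich": 500}  # same total, shifted up

print(f"widely shared income: {consumer_spending(broad):.0f} bn spent")
print(f"concentrated income:  {consumer_spending(skewed):.0f} bn spent")
```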
That’s what happened in the 1920s. All through the decade in the US, the rich got richer and the poor got screwed, speculation took the place of productive investment throughout the US economy, and the well-to-do wallowed in the wretched excess chronicled in F. Scott Fitzgerald’s The Great Gatsby while most other people struggled to get by. The whole decade was a classic era of pretense, crowned by the delusional insistence—splashed all over the media of the time—that everyone in the US could invest in the stock market and, since the market was of course going to keep on rising forever, everyone in the US would thus inevitably become rich.
It’s interesting to note that there were people who saw straight through the nonsense and tried to warn their fellow Americans about the inevitable consequences. They were denounced six ways from Sunday by all right-thinking people, in language identical to that used more recently on those of us who’ve had the effrontery to point out that an infinite supply of oil can’t be extracted from a finite planet.  The people who insisted that the soaring stock values of the late 1920s were the product of one of history’s great speculative bubbles were dead right; they had all the facts and figures on their side, not to mention plain common sense; but nobody wanted to hear it.
When the stock market peaked just before the Labor Day weekend in 1929 and started trending down, therefore, the immediate response of all right-thinking people was to insist at the top of their lungs that nothing of the sort was happening, that the market was simply catching its breath before its next great upward leap, and so on. Each new downward lurch was met by a new round of claims along these lines, louder, more dogmatic, and more strident than the one that preceded it, and nasty personal attacks on anyone who didn’t support the delusional consensus filled the media of the time.
People were still saying those things when the bottom dropped out of the market.
Tuesday, October 29, 1929 can reasonably be taken as the point at which the era of pretense gave way once and for all to the era of impact. That’s not because it was the first day of the crash—there had been ghastly slumps on the previous Thursday and Monday, on the heels of two months of less drastic but still seriously ugly declines—but because, after that day, the pundits and the media pretty much stopped pretending that nothing was wrong. Mind you, next to nobody was willing to talk about what exactly had gone wrong, or why it had gone wrong, but the pretense that the good fairy of capitalism had promised Americans happy days forever was out the window once and for all.
It’s crucial to note, though, that what followed this realization was the immediate and all but universal insistence that happy days would soon be back if only everyone did the right thing. It’s even more crucial to note that what nearly everyone identified as “the right thing”—running right out and buying lots of stocks—was a really bad idea that bankrupted many of those who did it, and didn’t help the imploding US economy at all.
It’s probably necessary to talk about this in a little more detail, since it’s been an article of blind faith in the United States for many decades now that it’s always a good idea to buy and hold stocks. (I suspect that stockbrokers have had a good deal to do with the promulgation of this notion.) It’s been claimed that someone who bought stocks in 1929 at the peak of the bubble, and then held onto them, would have ended up in the black eventually, and for certain values of “eventually,” this is quite true—but it took the Dow Jones industrial average until the mid-1950s to return to its 1929 high, and so for a quarter of a century our investor would have been underwater on his stock purchases.
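The arithmetic behind that quarter-century figure is easy to check for yourself. In the sketch below, the peak and the year-end closes are the commonly cited nominal Dow figures, rounded; treat them as approximations rather than an exact series:
```python
# How long a buy-and-hold investor who bought the Dow at its
# September 1929 peak stayed underwater, in nominal terms.
# Figures are commonly cited values, rounded; dividends,
# deflation, and changes in the index itself are all ignored.

PEAK_1929 = 381  # Dow Jones Industrial Average peak, September 3, 1929

yearly_closes = {  # approximate year-end closes
    1929: 248, 1932: 60, 1937: 121, 1942: 119,
    1946: 177, 1950: 235, 1954: 404,
}

for year, close in sorted(yearly_closes.items()):
    gap = close / PEAK_1929 - 1
    print(f"{year}: close ~{close:>3}, {gap:+.0%} vs. the 1929 peak")

# 1954 is the first year in the series whose close tops the peak:
print(f"years underwater: about {1954 - 1929}")
```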
What’s more, the Dow isn’t necessarily a good measure of stocks generally; many of the darlings of the market in the 1920s either went bankrupt in the Depression or never again returned to their 1929 valuations. Nor did the surge of money into stocks in the wake of the 1929 crash stave off the Great Depression, or do much of anything else other than provide a great example of the folly of throwing good money after bad. The moral to this story? In an era of impact, the advice you hear from everyone around you may not be in your best interest.
That same moral can be shown just as clearly in the second example I have in mind, the French Revolution. We talked briefly in last week’s post about the way that the French monarchy and aristocracy blinded themselves to the convulsive social and economic changes that were pushing France closer and closer to a collective explosion on the grand scale, and pursued business as usual long past the point at which business as usual was anything but a recipe for disaster. Even when the struggle between the Crown and the aristocracy forced Louis XVI to convene the États-Généraux—the rarely-held national parliament of France, which had powers more or less equivalent to a constitutional convention in the US—next to nobody expected anything but long rounds of political horse-trading from which some modest shifts in the balance of power might result.
That was before the summer of 1789. On June 17, the deputies of the Third Estate—the representatives of the commoners—declared themselves a National Assembly and staged what amounted to a coup d’état; on July 14, faced with the threat of a military response from the monarchy, the Parisian mob seized the Bastille, kickstarting a wave of revolt across the country that put government and military facilities in the hands of the revolutionary National Guard and broke the back of the feudal system; on August 4, the National Assembly abolished all feudal rights and legal distinctions between the classes. In less than two months, a political and social system that had been welded firmly in place for a thousand years came crashing to the ground.
Those two months marked the end of the era of pretense and the arrival of the era of impact. The immediate response, with a modest number of exceptions among the aristocracy and the inner circles of the monarchy’s supporters, was frantic cheering and an insistence that everything would soon settle into a wonderful new age of peace, prosperity, and liberty. All the overblown dreams of the philosophes about a future age governed by reason were trotted out and treated as self-evident fact. Of course that’s not what happened; once it was firmly in power, the National Assembly used its unchecked authority as abusively as the monarchy had once done; factional struggles spun out of control, and before long mob rule and the guillotine were among the basic facts of life in Revolutionary France. 
Among the most common symptoms of an era of impact, in other words, is the rise of what we may as well call “crackpot optimism”—the enthusiastic and all but universal insistence, in the teeth of the evidence, that the end of business as usual will turn out to be the door to a wonderful new future. In the wake of the 1929 stock market crash, people were urged to pile back into the market in the belief that this would cause the economy to boom again even more spectacularly than before, and most of the people who followed this advice proceeded to lose their shirts. In the wake of the revolution of 1789, likewise, people across France were encouraged to join with their fellow citizens in building the shining new utopia of reason, and a great many of those who followed that advice ended up decapitated or, a little later, dying of gunshot or disease in the brutal era of pan-European warfare that extended almost without a break from the cannonade of Valmy in 1792 to the battle of Waterloo in 1815.
And the present example? That’s a question worth exploring, if only for the utterly pragmatic reason that most of my readers are going to get to see it up close and personal.
That the United States and the industrial world generally are deep in an era of pretense is, I think, pretty much beyond question at this point. We’ve got political authorities, global bankers, and a galaxy of pundits insisting at the top of their lungs that nothing is wrong, everything is fine, and we’ll be on our way to the next great era of prosperity if we just keep pursuing a set of boneheaded policies that have never—not once in the entire span of human history—brought prosperity to the countries that pursued them. We’ve got shelves full of books for sale in upscale bookstores insisting, in the strident language usual to such times, that life is wonderful in this best of all possible worlds, and it’s going to get better forever because, like, we have technology, dude! Across the landscape of the cultural mainstream, you’ll find no shortage of cheerleaders insisting at the top of their lungs that everything’s going to be fine, that even though they said ten years ago that we only have ten years to do something before disaster hits, why, we still have ten years before disaster hits, and when ten more years pass by, why, you can be sure that the same people will be insisting that we have ten more.
This is the classic rhetoric of an era of pretense. Over the last few years, though, it’s seemed to me that the voices of crackpot optimism have gotten more shrill, the diatribes more fact-free, and the logic even shoddier than it was in Bjorn Lomborg’s day, which is saying something. We’ve reached the point that state governments are making it a crime to report on water quality and forbidding officials from using such unwelcome phrases as “climate change.” That’s not the action of people who are confident in their beliefs; it’s the action of a bunch of overgrown children frantically clenching their eyes shut, stuffing their fingers in their ears, and shouting “La, la, la, I can’t hear you.”
That, in turn, suggests that the transition to the era of impact may be fairly close. Exactly when it’s likely to arrive is a complex question, and exactly what’s going to land the blow that will crack the crackpot optimism and make it impossible to ignore the arrival of real trouble is an even more complex one. In 1929, those who hadn’t bought into the bubble could be perfectly sure—and in fact, a good many of them were perfectly sure—that the usual mechanism that brings bubbles to a catastrophic end was about to terminate the boom of the 1920s with extreme prejudice, as indeed it did. In the last decades of the French monarchy, it was by no means clear exactly what sequence of events would bring the Ancien Régime crashing down, but such thoughtful observers as Talleyrand knew that something of the sort was likely to follow the crisis of legitimacy then under way.
The problem with trying to predict the trigger that will bring our current situation to a sudden stop is that we’re in such a target-rich environment. Looking over the potential candidates for the sudden shock that will stick a fork in the well-roasted corpse of business as usual, I’m reminded of the old board game Clue. Will Mr. Boddy’s killer turn out to be Colonel Mustard in the library with a lead pipe, Professor Plum in the conservatory with a candlestick, or Miss Scarlet in the dining room with a rope?
In much the same sense, we’ve got a global economy burdened to the breaking point with more than a quadrillion dollars of unpayable debt; we’ve got a global political system coming apart at the seams as the United States slips toward the usual fate of empires and its rivals circle warily, waiting for the kill; we’ve got a domestic political system here in the US entering a classic prerevolutionary condition under the impact of a textbook crisis of legitimacy; we’ve got a global climate that’s hammered by our rank stupidity in treating the atmosphere as a gaseous sewer for our wastes; we’ve got a global fossil fuel industry that’s frantically trying to pretend that scraping the bottom of the barrel means that the barrel is full, and the list goes on. It’s as though Colonel Mustard, Professor Plum, Miss Scarlet, and the rest of them all ganged up on Mr. Boddy at once, and only the most careful autopsy will be able to determine which of them actually dealt the fatal blow.
In the midst of all this uncertainty, there are three things that can, I think, be said for certain about the end of the current era of pretense and the coming of the era of impact. The first is that it’s going to happen. When something is unsustainable, it’s a pretty safe bet that it won’t be sustained indefinitely, and a society that keeps on embracing policies that swap short-term gains for long-term problems will sooner or later end up awash in the consequences of those policies. Timing such transitions is difficult at best; it’s an old adage among stock traders that the market can stay irrational longer than you can stay solvent. Still, points made above—especially the increasingly shrill tone of the defenders of the existing order—suggest to me that the era of impact may be here within a decade or so at the outside.
The second thing that can be said for certain about the coming era of impact is that it’s not the end of the world. Apocalyptic fantasies are common and popular in eras of pretense, and for good reason; fixating on the supposed imminence of the Second Coming, human extinction, or what have you, is a great way to distract yourself from the real crisis that’s breathing down your neck. If the real crisis in question is partly or wholly a result of your own actions, while the apocalyptic fantasy can be blamed on someone or something else, that adds a further attraction to the fantasy.
The end of industrial civilization will be a long, bitter, painful cascade of conflicts, disasters, and  accelerating decline in which a vast number of people are going to die before they otherwise would, and a great many things of value will be lost forever. That’s true of any falling civilization, and the misguided decisions of the last forty years have pretty much guaranteed that the current example is going to have an extra helping of all these unwelcome things. I’ve discussed at length, in earlier posts in the Dark Age America sequence here and in other sequences as well, why the sort of apocalyptic sudden stop beloved of Hollywood scriptwriters is the least likely outcome of the predicament of our time; still, insisting on the imminence and inevitability of some such game-ending event will no doubt be as popular as usual in the years immediately ahead.
The third thing that I think can be said for certain about the coming era of impact, though, is the one that counts. If it follows the usual pattern, as I expect it to do, once the crisis hits there will be serious, authoritative, respectable figures telling everyone exactly what they need to do to bring an end to the troubles and get the United States and the world back on track to renewed peace and prosperity. Taking these pronouncements seriously and following their directions will be extremely popular, and it will almost certainly also be a recipe for unmitigated disaster. If forewarned is forearmed, as the saying has it, this is a piece of firepower to keep handy as the era of pretense winds down. In next week’s post, we’ll talk about comparable weaponry relating to the third stage of collapse—the era of response.

The Era of Pretense

Wed, 2015-05-13 17:00
I've mentioned in previous posts here on The Archdruid Report the educational value of the comments I receive from readers in the wake of each week’s essay. My post two weeks ago on the death of the internet was unusually productive along those lines. One of the comments I got in response to that post gave me the theme for last week’s essay, but there was at least one other comment calling for the same treatment. Like the one that sparked last week’s post, it appeared on one of the many other internet forums on which The Archdruid Report is reposted, and it unintentionally pointed up a common and crucial failure of imagination that shapes, or rather misshapes, the conventional wisdom about our future.
Curiously enough, the point that set off the commenter in question was the same one that incensed the author of the denunciation mentioned in last week’s post: my suggestion in passing that fifty years from now, most Americans may not have access to electricity or running water. The commenter pointed out angrily that I’d claimed that the twilight of industrial civilization would be a ragged arc of decline over one to three centuries. Now, he claimed, I was saying that it was going to take place in the next fifty years, and this apparently convinced him that everything I said ought to be dismissed out of hand.
I run into this sort of confusion all the time. If I suggest that the decline and fall of a civilization usually takes several centuries, I get accused of inconsistency if I then note that one of the sharper downturns included in that process may be imminent.  If I point out that the United States is likely within a decade or two of serious economic and political turmoil, driven partly by the implosion of its faltering global hegemony and partly by a massive crisis of legitimacy that’s all but dissolved the tacit contract between the existing order of US society and the masses who passively support it, I get accused once again of inconsistency if I then say that whatever comes out the far side of that crisis—whether it’s a battered and bruised United States or a patchwork of successor states—will then face a couple of centuries of further decline and disintegration before the deindustrial dark age bottoms out.
Now of course there’s nothing inconsistent about any of these statements. The decline and fall of a civilization isn’t a single event, or even a single linear process; it’s a complex fractal reality composed of many different events on many different scales in space and time. If it takes one to three centuries, as usual, those centuries are going to be taken up by an uneven drumbeat of wars, crises, natural disasters, and assorted breakdowns on a variety of time frames with an assortment of local, regional, national, or global effects. The collapse of US global hegemony is one of those events; the unraveling of the economic and technological framework that currently provides most Americans with electricity and running water is another, but neither of those is anything like the whole picture.
It’s probably also necessary to point out that any of my readers who think that being deprived of electricity and running water is the most drastic kind of collapse imaginable have, as the saying goes, another think coming. Right now, in our oh-so-modern world, there are billions of people who get by without regular access to electricity and running water, and most of them aren’t living under dark age conditions. A century and a half ago, when railroads, telegraphs, steamships, and mechanical printing presses were driving one of history’s great transformations of transport and information technology, next to nobody had electricity or running water in their homes. The technologies of 1865 are not dark age technologies; in fact, the gap between 1865 technologies and dark age technologies is considerably greater, by most metrics, than the gap between 1865 technologies and the ones we use today.
Furthermore, whether or not Americans have access to running water and electricity may not have as much to say about the future of industrial society everywhere in the world as the conventional wisdom would suggest.  I know that some of my American readers will be shocked out of their socks to hear this, but the United States is not the whole world. It’s not even the center of the world. If the United States implodes over the next two decades, leaving behind a series of bankrupt failed states to squabble over its territory and the little that remains of its once-lavish resource base, that process will be a great source of gaudy and gruesome stories for the news media of the world’s other continents, but it won’t affect the lives of the readers of those stories much more than equivalent events in Africa and the Middle East affect the lives of Americans today.
As it happens, over the next one to three centuries, the benefits of industrial civilization are going to go away for everyone. (The costs will be around a good deal longer—in the case of the nuclear wastes we’re so casually heaping up for our descendants, a good quarter of a million years, but those and their effects are rather more localized than some of today’s apocalyptic rhetoric likes to suggest.) The reasoning here is straightforward. White’s Law, one of the fundamental principles of human ecology, states that economic development is a function of energy per capita; the immense treasure trove of concentrated energy embodied in fossil fuels, and that alone, made possible the sky-high levels of energy per capita that gave the world’s industrial nations their brief era of exuberance; as fossil fuels deplete, and remaining reserves require higher and higher energy inputs to extract, the levels of energy per capita the industrial nations are used to having will go away forever.
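White’s Law is often given a compact formal statement as well. In the usual notation, with C standing for the degree of cultural and economic development, E for the energy harnessed per capita per year, and T for the efficiency of the technological means that put that energy to work, it reads:
$$ C = E \times T $$
On the argument above, the fossil fuel era amounts to a one-time spike in E; improvements in T can soften the decline that follows depletion, but they can’t reverse it.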
It’s important to be clear about this. Fossil fuels aren’t simply one energy source among others; in terms of concentration, usefulness, and fungibility—that is, the ability to be turned into any other form of energy that might be required—they’re in a category all by themselves. Repeated claims that fossil fuels can be replaced with nuclear power, renewable energy resources, or what have you sound very good on paper, but every attempt to put those claims to the test so far has either gone belly up in short order, or become a classic subsidy dumpster surviving purely on a diet of government funds and mandates.
Three centuries ago, the earth’s fossil fuel reserves were the largest single deposit of concentrated energy in this part of the universe; now we’ve burnt through nearly all the easily accessible reserves, and we’re scrambling to keep the tottering edifice of industrial society going by burning through the dregs that remain. As those run out, the remaining energy resources—almost all of them renewables—will certainly sustain a variety of human societies, and some of those will be able to achieve a fairly high level of complexity and maintain some kinds of advanced technologies. The kind of absurd extravagance that passes for a normal standard of living among the more privileged inmates of the industrial nations is another matter, and as the fossil fuel age sunsets out, it will end forever.
The fractal trajectory of decline and fall mentioned earlier in this post is simply the way this equation works out on the day-to-day scale of ordinary history. Still, those of us who happen to be living through a part of that trajectory might reasonably be curious about how it’s likely to unfold in our lifetimes. I’ve discussed in a previous series of posts, and in my book Decline and Fall: The End of Empire and the Future of Democracy in 21st Century America, how the end of US global hegemony is likely to unfold, but as already noted, that’s only a small portion of the broader picture. Is a broader view possible?
Fortunately history, the core resource I’ve been using to try to make sense of our future, has plenty to say about the broad patterns that unfold when civilizations decline and fall. Now of course I know all I have to do is mention that history might be relevant to our present predicament, and a vast chorus of voices across the North American continent and around the world will bellow at rooftop volume, “But it’s different this time!” With apologies to my regular readers, who’ve heard this before, it’s probably necessary to confront that weary thoughtstopper again before we proceed.
As I’ve noted before, claims that it’s different this time are right where it doesn’t matter and wrong where it counts.  Predictions made on the basis of history—and not just by me—have consistently predicted events over the last decade or so far more accurately than predictions based on the assumption that history doesn’t matter. How many times, dear reader, have you heard someone insist that industrial civilization is going to crash to ruin in the next six months, and then watched those six months roll merrily by without any sign of the predicted crash? For that matter, how many times have you heard someone insist that this or that policy that’s never worked any other time that it’s been tried, or this or that piece of technological vaporware that’s been the subject of failed promises for decades, will inevitably put industrial society back on its alleged trajectory to the stars—and how many times has the policy or the vaporware been quietly shelved, and something else promoted using the identical rhetoric, when it turned out not to perform as advertised?
It’s been a source of wry amusement to me to watch the same weary, dreary, repeatedly failed claims of imminent apocalypse and inevitable progress being rehashed year after year, varying only in the fine details of the cataclysm du jour and the techno-savior du jour, while the future nobody wants to talk about is busily taking shape around us. Decline and fall isn’t something that will happen sometime in the conveniently distant future; it’s happening right now in the United States and around the world. The amusement, though, is tempered with a sense of familiarity, because the period in which decline is under way but nobody wants to admit that fact is one of the recurring features of the history of decline.
There are, very generally speaking, five broad phases in the decline and fall of a civilization. I know it’s customary in historical literature to find nice dull labels for such things, but I’m in a contrary mood as I write this, so I’ll give them unfashionably colorful names: the eras of pretense, impact, response, breakdown, and dissolution. Each of these is complex enough that it’ll need a discussion of its own; this week, we’ll talk about the era of pretense, which is the one we’re in right now.
Eras of pretense are by no means limited to the decline and fall of civilizations. They occur whenever political, economic, or social arrangements no longer work, but the immediate costs of admitting that those arrangements don’t work loom considerably larger in the collective imagination than the future costs of leaving those arrangements in place. It’s a curious but consistent wrinkle of human psychology that this happens even if those future costs soar right off the scale of frightfulness and lethality; if the people who would have to pay the immediate costs don’t want to do so, in fact, they will reliably and cheerfully pursue policies that lead straight to their own total bankruptcy or violent extermination, and never let themselves notice where they’re headed.
Speculative bubbles are a great setting in which to watch eras of pretense in full flower. In the late phases of a bubble, when it’s clear to anyone who has two spare neurons to rub together that the boom du jour is cobbled together of equal parts delusion and chicanery, the people who are most likely to lose their shirts in the crash are the first to insist at the top of their lungs that the bubble isn’t a bubble and their investments are guaranteed to keep on increasing in value forever. Those of my readers who got the chance to watch some of their acquaintances go broke in the real estate bust of 2008-9, as I did, will have heard this sort of self-deception at full roar; those who missed the opportunity can make up for the omission by checking out the ongoing torrent of claims that the soon-to-be-late fracking bubble is really a massive energy revolution that will make America wealthy and strong again.
The history of revolutions offers another helpful glimpse at eras of pretense. France in the decades before 1789, to cite a conveniently well-documented example, was full of people who had every reason to realize that the current state of affairs was hopelessly unsustainable and would have to change. The things about French politics and economics that had to change, though, were precisely those things that the French monarchy and aristocracy were unwilling to change, because any such reforms would have cost them privileges they’d had since time out of mind and were unwilling to relinquish.
Louis XV, who finished up his long and troubled reign a supreme realist, is said to have muttered “Après moi, le déluge”—“Once I’m gone, this sucker’s going down” may not be a literal translation, but it catches the flavor of the utterance—but that degree of clarity was rare in his generation, and all but absent in the increasingly feckless generations that followed. Thus the courtiers and aristocrats of the Old Regime amused themselves at the nation’s expense, dabbled in avant-garde thought, and kept their eyes tightly closed to the consequences of their evasions of looming reality, while the last opportunities to excuse themselves from a one-way trip to visit the guillotine and spare France the cataclysms of the Terror and the Napoleonic wars slipped silently away.
That’s the bitter irony of eras of pretense. Under most circumstances, they’re the last period when it would be possible to do anything constructive on the large scale about the crisis looming immediately ahead, but the mass evasion of reality that frames the collective thinking of the time stands squarely in the way of any such constructive action. In the era of pretense before a speculative bust, people who could have quietly cashed in their positions and pocketed their gains double down on their investments, and guarantee that they’ll be ruined once the market stops being liquid. In the era of pretense before a revolution, in the same way, those people and classes that have the most to lose reliably take exactly those actions that ensure that they will in fact lose everything. If history has a sense of humor, this is one of the places that it appears in its most savage form.
The same points are true, in turn, of the eras of pretense that precede the downfall of a civilization. In a good many cases, where too few original sources survive, the age of pretense has to be inferred from archeological remains. We don’t know what motives inspired the ancient Mayans to build their biggest pyramids in the years immediately before the Terminal Classic period toppled over into a savage political and demographic collapse, but it’s hard to imagine any such project being set in motion without the usual evasions of an era of pretense being involved. Where detailed records of dead civilizations survive, though, the sort of rhetorical handwaving common to bubbles before the bust and decaying regimes on the brink of revolution shows up with knobs on. Thus the panegyrics of the Roman imperial court waxed ever more lyrical and bombastic about Rome’s invincibility and her civilizing mission to the nations as the Empire stumbled deeper into its terminal crisis, echoing any number of other court poets in any number of civilizations in their final hours.
For that matter, a glance through classical Rome’s literary remains turns up the remarkable fact that those of her essayists and philosophers who expressed worries about her survival wrote, almost without exception, during the Republic and the early Empire; the closer the fall of Rome actually came, the more certainty Roman authors expressed that the Empire was eternal and the latest round of troubles was just one more temporary bump on the road to peace and prosperity. It took the outsider’s vision of Augustine of Hippo to proclaim that Rome really was falling—and even that could only be heard once the Visigoths sacked Rome and the era of pretense gave way to the age of impact.
The present case is simply one more example to add to an already lengthy list. In the last years of the nineteenth century, it was common for politicians, pundits, and mass media in the United States, the British empire, and other industrial nations to discuss the possibility that the advanced civilization of the time might be headed for the common fate of nations in due time. The intellectual history of the twentieth century is, among other things, a chronicle of how that discussion was shoved to the margins of our collective discourse, just as the ecological history of the same century is among other things a chronicle of how the worries of the previous era became the realities of the one we’re in today. The closer we’ve moved toward the era of impact, that is, the more unacceptable it has become for anyone in public life to point out that the problems of the age are not just superficial.
Listen to the pablum that passes for political discussion in Washington DC or the mainstream US media these days, or the even more vacuous noises being made by party flacks as the country stumbles wearily toward yet another presidential election. That the American dream of upward mobility has become an American nightmare of accelerating impoverishment outside the narrowing circle of the kleptocratic rich, that corruption and casual disregard for the rule of law are commonplace in political institutions from local to Federal levels, that our medical industry charges more than any other nation’s and still provides the worst health care in the industrial world, that our schools no longer teach anything but contempt for learning, that the national infrastructure and built environment are plunging toward Third World conditions at an ever-quickening pace, that a brutal and feckless foreign policy embraced by both major parties is alienating our allies while forcing our enemies to set aside their mutual rivalries and make common cause against us: these are among the issues that matter, but they’re not the issues you’ll hear discussed as the latest gaggle of carefully airbrushed candidates go through their carefully scripted elect-me routines on their way to the 2016 election.
If history teaches anything, though, it’s that eras of pretense eventually give way to eras of impact. That doesn’t mean that the pretense will go away—long after Alaric the Visigoth sacked Rome, for example, there were still plenty of rhetors trotting out the same tired clichés about Roman invincibility—but it does mean that a significant number of people will stop finding the pretense relevant to their own lives. How that happens in other historical examples, and how it might happen in our own time, will be the theme of next week’s post.

The Whisper of the Shutoff Valve

Wed, 2015-05-06 18:35
Last week’s post on the impending decline and fall of the internet fielded a great many responses. That was no surprise, to be sure; nor was I startled in the least to find that many of them rejected the thesis of the post with some heat. Contemporary pop culture’s strident insistence that technological progress is a clock that never runs backwards made such counterclaims inevitable.
Still, it’s always educational to watch the arguments fielded to prop up the increasingly shaky edifice of the modern mythology of progress, and the last week was no exception. A response I found particularly interesting from that standpoint appeared on one of the many online venues where Archdruid Report posts appear. One of the commenters insisted that my post should be rejected out of hand as mere doom and gloom; after all, he pointed out, it was ridiculous for me to suggest that fifty years from now, a majority of the population of the United States might be without reliable electricity or running water.
I’ve made the same prediction here and elsewhere a good many times. Each time, most of my readers or listeners seem to have taken it as a piece of sheer rhetorical hyperbole. The electrical grid and the assorted systems that send potable water flowing out of faucets are so basic to the rituals of everyday life in today’s America that their continued presence is taken for granted.  At most, it’s conceivable that individuals might choose not to connect to them; there’s a certain amount of talk about off-grid living here and there in the alternative media, for example.  That people who want these things might not have access to them, though, is pretty much unthinkable.
Meanwhile, in Detroit and Baltimore, tens of thousands of residents are in the process of losing their access to water and electricity.
The situation in both cities is much the same, and there’s every reason to think that identical headlines will shortly appear in reference to other cities around the nation. Not that many decades ago, Detroit and Baltimore were important industrial centers with thriving economies. Along with more than a hundred other cities in America’s Rust Belt, they were thrown under the bus with the first wave of industrial offshoring in the 1970s.  The situation for both cities has only gotten worse since that time, as the United States completed its long transition from a manufacturing economy producing goods and services to a bubble economy that mostly produces unpayable IOUs.
These days, the middle-class families whose tax payments propped up the expansive urban systems of an earlier day have long since moved out of town. Most of the remaining residents are poor, and the ongoing redistribution of wealth in America toward the very rich and away from everyone else has driven down the income of the urban poor to the point that many of them can no longer afford to pay their water and power bills. City utilities in Detroit and Baltimore have been sufficiently sensitive to political pressures that large-scale utility shutoffs have been delayed, but shifts in the political climate in both cities are bringing the delays to an end; water bills have increased steadily, more and more people have been unable to pay them, and the result is as predictable as it is brutal.
The debate over the Detroit and Baltimore shutoffs has followed the usual pattern, as one side wallows in bash-the-poor rhetoric while the other side insists plaintively that access to utilities is a human right. Neither side seems to be interested in talking about the broader context in which these disputes take shape. There are two aspects to that broader context, and it’s a tossup which is the more threatening.
The first aspect is the failure of the US economy to recover in any meaningful sense from the financial crisis of 2008. Now of course politicians from Obama on down have gone overtime grandstanding about the alleged recovery we’re in. I invite any of my readers who bought into that rhetoric to try the following simple experiment. Go to your favorite internet search engine and look up how much the fracking industry has added to the US gross domestic product each year from 2009 to 2014. Now subtract that figure from the US gross domestic product for each of those years, and see how much growth there’s actually been in the rest of the economy since the real estate bubble imploded.
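Here’s a minimal sketch of that subtraction for readers who’d rather script it than do it on paper. The growth and fracking figures below are placeholders meant to show the shape of the calculation, not looked-up data; looking up the real numbers is the whole point of the exercise:
```python
# Sketch of the experiment described above: subtract the fracking
# boom's contribution from reported GDP growth, year by year.
# Every number here is a placeholder; substitute whatever figures
# your own research turns up.

reported_growth = {2010: 2.6, 2011: 1.6, 2012: 2.2, 2013: 1.8, 2014: 2.5}
fracking_share  = {2010: 0.4, 2011: 0.5, 2012: 0.7, 2013: 0.8, 2014: 0.9}

for year in sorted(reported_growth):
    rest = reported_growth[year] - fracking_share[year]
    print(f"{year}: reported {reported_growth[year]:.1f}% growth, "
          f"{rest:.1f}% once fracking is subtracted")
```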
What you’ll find, if you take the time to do that, is that the rest of the US economy has been flat on its back gasping for air for the last five years. What makes this even more problematic, as I’ve noted in several previous posts here, is that the great fracking boom about which we’ve heard so much for the last five years was never actually the game-changing energy revolution its promoters claimed; it was simply another installment in the series of speculative bubbles that has largely replaced constructive economic activity in this country over the last two decades or so.
What’s more, it’s not the only bubble currently being blown, and it may not even be the largest. We’ve also got a second tech-stock bubble, with money-losing internet corporations racking up absurd valuations in the stock market while they burn through millions of dollars of venture capital; we’ve got a student loan bubble, in which billions of dollars of loans that will never be paid back have been bundled, packaged, and sold to investors just like all those no-doc mortgages were a decade ago; car loans are getting the same treatment; the real estate market is fizzing again in many urban areas as investors pile into another round of lavishly marketed property investments—well, I could go on for some time. It’s entirely possible that if all the bubble activity were to be subtracted from the last five years or so of GDP, the result would show an economy in freefall.
Certainly that’s the impression that emerges if you take the time to check out those economic statistics that aren’t being systematically jiggered by the US government for PR purposes. The number of long-term unemployed in America is at an all-time high; roads, bridges, and other basic infrastructure are falling to pieces; measurements of US public health—generally considered a good proxy for the real economic condition of the population—are well below those of other industrial countries, heading toward Third World levels; abandoned shopping malls litter the landscape while major retailers announce more than 6000 store closures. These are not things you see in an era of economic expansion, or even one of relative stability; they’re markers of decline.
The utility shutoffs in Detroit and Baltimore are further symptoms of the same broad process of economic unraveling. It’s true, as pundits in the media have been insisting since the story broke, that utilities get shut off for nonpayment of bills all the time. It’s equally true that shutting off the water supply of 20,000 or 30,000 people all at once is pretty much unprecedented. Both cities, please note, have had very large populations of poor people for many decades now.  Those who like to blame a “culture of poverty” for the tangled relationship between US governments and the American poor, and of course that trope has been rehashed by some of the pundits just mentioned, haven’t yet gotten around to explaining how the culture of poverty all at once inspired tens of thousands of people who had been paying their utility bills to stop doing so.
There are plenty of good reasons, after all, why poor people who used to pay their bills can’t do so any more. Standard business models in the United States used to take it for granted that the best way to run the staffing side of any company, large or small, was to have as many full-time positions as possible and to use raises and other practical incentives to encourage employees who were good at their jobs to stay with the company. That approach has become increasingly unfashionable in today’s America, partly due to perverse regulatory incentives that penalize employers for offering full-time positions, and partly due to the emergence of attitudes in corner offices that treat employees as just another commodity. (I doubt it’s any kind of accident that most corporations nowadays refer to their employment offices as “human resource departments.” What do you do with a resource? You exploit it.)
These days, most of the jobs available to the poor are part-time, pay very little, and include nasty little clawbacks in the form of requirements that employees pay out of pocket for uniforms, equipment, and other things that employers used to provide as a matter of course. Meanwhile housing prices and rents are rising well above their post-2008 dip, and a great many other necessities are becoming more costly—inflation may be under control, or so the official statistics say, but anyone who’s been shopping at the same grocery store for the last eight years knows perfectly well that prices kept on rising anyway.
So you’ve got falling incomes running up against rising costs for food, rent, and utilities, among other things. In the resulting collision, something’s got to give, and for tens of thousands of poor Detroiters and Baltimoreans, what gave first was the ability to keep current on their water bills. Expect to see the same story playing out across the country as more people on the bottom of the income pyramid find themselves in the same situation. What you won’t hear in the media, though it’s visible enough if you know where to look and are willing to do so, is that people above the bottom of the income pyramid are also losing ground, being forced down toward economic nonpersonhood. From the middle classes down, everyone’s losing ground.
That process doesn’t extend any higher up the ladder than the middle class, to be sure. It’s been pointed out repeatedly that over the last four decades or so, the distribution of wealth in America has skewed further and further out of balance, with the top 20% of incomes taking a larger and larger share at the expense of everybody else. That’s an important factor in bringing about the collision just described. Some thinkers on the radical fringes of American society, which is the only place in the US where you can talk about such things these days, have argued that the raw greed of the well-to-do is the sole reason why so many people lower down the ladder are being pushed further down still.
Scapegoating rhetoric of that sort is always comforting, because it holds out the promise—theoretically, if not practically—that something can be done about the situation. If only the thieving rich could be lined up against a convenient brick wall and removed from the equation in the time-honored fashion, the logic goes, people in Detroit and Baltimore could afford to pay their water bills!  I suspect we’ll hear such claims increasingly often as the years pass and more and more Americans find their access to familiar comforts and necessities slipping away.  Simple answers are always popular in such times, not least when the people being scapegoated go as far out of their way to make themselves good targets for such exercises as the American rich have done in recent decades.
John Kenneth Galbraith’s equation of the current US political and economic elite with the French aristocracy on the eve of revolution rings even more true than it did when he wrote it back in 1992, in the pages of The Culture of Contentment. The unthinking extravagances, the casual dismissal of the last shreds of noblesse oblige, the obsessive pursuit of personal advantages and private feuds without the least thought of the potential consequences, the bland inability to recognize that the power, privilege, wealth, and sheer survival of the aristocracy depended on the system the aristocrats themselves were destabilizing by their actions—it’s all there, complete with sprawling overpriced mansions that could just about double for Versailles. The urban mobs that played so large a role back in 1789 are warming up for their performances as I write these words; the only thing left to complete the picture is a few tumbrils and a guillotine, and those will doubtless arrive on cue.
The senility of the current US elite, as noted in a previous post here, is a massive political fact in today’s America. Still, it’s not the only factor in play here. Previous generations of wealthy Americans recognized without too much difficulty that their power, prosperity, and survival depended on the willingness of the rest of the population to put up with their antics. Several times already in America’s history, elite groups have allied with populist forces to push through reforms that sharply weakened the power of the wealthy elite, because they recognized that the alternative was a social explosion even more destructive to the system on which elite power depends.
I suppose it’s possible that the people currently occupying the upper ranks of the political and economic pyramid in today’s America are just that much more stupid than their equivalents in the Jacksonian, Progressive, and New Deal eras. Still, there’s at least one other explanation to hand, and it’s the second of the two threatening contextual issues mentioned earlier.
Until the nineteenth century, fresh running water piped into homes for everyday use was purely an affectation of the very rich in a few very wealthy and technologically adept societies. Sewer pipes to take dirty water and human wastes out of the house belonged in the same category. This wasn’t because nobody knew how plumbing works—the Romans had competent plumbers, for example, and water faucets and flush toilets were to be found in Roman mansions of the imperial age. The reason those same things weren’t found in every Roman house was economic, not technical.
Behind that economic issue lay an ecological reality.  White’s Law, one of the foundational principles of human ecology, states that economic development is a function of energy per capita. For a society before the industrial age, the Roman Empire had an impressive amount of energy per capita to expend; control over the agricultural economy of the Mediterranean basin, modest inputs from sunlight, water and wind, and a thriving slave industry fed by the expansion of Roman military power all fed into the capacity of Roman society to develop itself economically and technically. That’s why rich Romans had running water and iced drinks in summer, while their equivalents in ancient Greece a few centuries earlier had to make do without either one.
Fossil fuels gave industrial civilization a supply of energy many orders of magnitude greater than any previous human civilization has had—a supply so vast that the difference remains huge even after the immense expansion of population that followed the industrial revolution. There was, however, a catch—or, more precisely, two catches. To begin with, fossil fuels are finite, nonrenewable resources; no matter how much handwaving is employed in the attempt to obscure this point—and whatever else might be in short supply these days, that sort of handwaving is not—every barrel of oil, ton of coal, or cubic foot of natural gas that’s burnt takes the world one step closer to the point at which there will be no economically extractable reserves of oil, coal, or natural gas at all.
That’s catch #1. Catch #2 is subtler, and considerably more dangerous. Oil, coal, and natural gas don’t leap out of the ground on command. They have to be extracted and processed, and this takes energy. Companies in the fossil fuel industries have always targeted the deposits that cost less to extract and process, for obvious economic reasons. What this means, though, is that over time, a larger and larger fraction of the energy yield of oil, coal, and natural gas has to be put right back into extracting and processing oil, coal, and natural gas—and this leaves less and less for all other uses.
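That second catch can be put in simple numerical terms. In the sketch below, EROEI (energy returned on energy invested) is the ratio of energy yielded to energy spent getting it, and the surplus fraction is what’s left over for the rest of the economy; the EROEI values are illustrative assumptions, not measurements:
```python
# As EROEI declines, the share of gross energy output left over
# for everything other than energy extraction shrinks. The EROEI
# figures below are illustrative assumptions, not measured values.

def surplus_fraction(eroei: float) -> float:
    """Fraction of gross energy output not consumed by extraction."""
    return 1.0 - 1.0 / eroei

sources = [
    ("early conventional oil", 50.0),
    ("late conventional oil", 20.0),
    ("tight (fracked) oil", 5.0),
]

for label, eroei in sources:
    print(f"{label:>23}: EROEI {eroei:>4.0f} -> "
          f"{surplus_fraction(eroei):.0%} of gross output is surplus")
```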
That’s the vise that’s tightening around the American economy these days. The great fracking boom, to the extent that it wasn’t simply one more speculative gimmick aimed at the pocketbooks of chumps, was an attempt to make up for the ongoing decline of America’s conventional oilfields by going after oil that was far more expensive to extract. The fact that none of the companies at the heart of the fracking boom ever turned a profit, even when oil brought more than $100 a barrel, gives some sense of just how costly shale oil is to get out of the ground. The financial cost of extraction, though, is a proxy for the energy cost of extraction—the amount of energy, and of the products of energy, that had to be thrown into the task of getting a little extra oil out of marginal source rock.
Energy needed to extract energy, again, can’t be used for any other purpose. It doesn’t contribute to the energy surplus that makes economic development possible. As the energy industry itself takes a bigger bite out of each year’s energy production, every other economic activity loses part of the fuel that makes it run. That, in turn, is the core reason why the American economy is on the ropes, America’s infrastructure is falling to bits—and Americans in Detroit and Baltimore are facing a transition to Third World conditions, without electricity or running water.
I suspect, for what it’s worth, that the shutoff notices being mailed to tens of thousands of poor families in those two cities are a good working model for the way that industrial civilization itself will wind down. It won’t be sudden; for decades to come, there will still be people who have access to what Americans today consider the ordinary necessities and comforts of everyday life; there will just be fewer of them each year. Outside that narrowing circle, the number of economic nonpersons will grow steadily, one shutoff notice at a time.
As I’ve pointed out in previous posts, the line of fracture between the senile elite and what Arnold Toynbee called the internal proletariat—the people who live within a failing civilization’s borders but receive essentially none of its benefits—eventually opens into a chasm that swallows what’s left of the civilization. Sometimes the tectonic processes that pull the chasm open are hard to miss, but there are times when they’re a good deal more difficult to sense in action, and this is one of these latter times. Listen to the whisper of the shutoff valve, and you’ll hear tens of thousands of Americans being cut off from basic services the rest of us, for the time being, still take for granted.

The Death of the Internet: A Pre-Mortem

Wed, 2015-04-29 17:25
The mythic role assigned to progress in today’s popular culture has any number of odd effects, but one of the strangest is the blindness to the downside that clamps down on the collective imagination of our time once people become convinced that something or other is the wave of the future. It doesn’t matter in the least how numerous or how obvious the warning signs are, or how many times the same tawdry drama has been enacted. Once some shiny new gimmick gets accepted as the next glorious step in the invincible march of progress, most people lose the ability to imagine that the wave of the future might just do what waves generally do: that is to say, crest, break, and flow back out to sea, leaving debris scattered on the beach in its wake.
It so happens that I grew up in the middle of just such a temporary wave of the future, in the south Seattle suburbs in the 1960s, where every third breadwinner worked for Boeing. The wave in question was the supersonic transport, SST for short: a jetliner that would fly faster than sound, cutting hours off long flights. The inevitability of the SST was an article of faith locally, and not just because Boeing was building one; an Anglo-French consortium was in the lead with the Concorde, and the Soviets were working on the Tu-144, but the Boeing 2707 was expected to be the biggest and baddest of them all, a 300-seat swing-wing plane that was going to make commercial supersonic flight an everyday reality.
Long before the 2707 had even the most ghostly sort of reality, you could buy model kits of the plane, complete with Pan Am decals, at every hobby store in the greater Seattle area. For that matter, take Interstate 5 south from downtown Seattle past the sprawling Boeing plant just outside of town, and you’d see the image of the 2707 on the wall of one of the huge assembly buildings, a big delta-winged shape in white and gold winging its way through the imagined air toward the gleaming future in which so many people believed back then.
There was, as it happened, a small problem with the 2707, a problem it shared with all the other SST projects; it made no economic sense at all. It was, to be precise, what an earlier post here called  a subsidy dumpster: that is, a project that was technically feasible but economically impractical, and existed mostly as a way to pump government subsidies into Boeing’s coffers. Come 1971, the well ran dry: faced with gloomy numbers from the economists, worried calculations from environmental scientists, and a public not exactly enthusiastic about dozens of sonic booms a day rattling plates and cracking windows around major airports, Congress cut the project’s funding.
That happened right when the US economy generally, and the notoriously cyclical airplane industry in particular, were hitting downturns. Boeing was Seattle’s biggest employer in those days, and when it laid off employees en masse, the result was a local depression of legendary severity. You heard a lot of people in those days insisting that the US had missed out on the next aviation boom, and Congress would have to hang its head in shame once Concordes and Tu-144s were hauling passengers all over the globe. Of course that’s not what happened; the Tu-144 flew a handful of commercial flights and then was grounded for safety reasons, and the Concorde lingered on, a technical triumph but an economic white elephant, until the last plane retired from service in 2003.
All this has been on my mind of late as I’ve considered the future of the internet. The comparison may seem far-fetched, but then that’s what supporters of the SST would have said if anyone had compared the Boeing 2707 to, say, the zeppelin, another wave of the future that turned out to make too little economic sense to matter. Granted, the internet isn’t a subsidy dumpster, and it’s also much more complex than the SST; if anything, it might be compared to the entire system of commercial air travel, which we still have with us for the moment. Nonetheless, a strong case can be made that the internet, like the SST, doesn’t actually make economic sense; it’s being propped up by a set of financial gimmicks with a distinct resemblance to smoke and mirrors; and when those go away—and they will—much of what makes the internet so central a part of pop culture will go away as well.
It’s probably necessary to repeat here that the reasons for this are economic, not technical. Every time I’ve discussed the hard economic realities that make the internet’s lifespan in the deindustrial age  roughly that of a snowball in Beelzebub’s back yard, I’ve gotten a flurry of responses fixating on purely  technical issues. Those issues are beside the point.  No doubt it would be possible to make something like the internet technically feasible in a society on the far side of the Long Descent, but that doesn’t matter; what matters is that the internet has to cover its operating costs, and it also has to compete with other ways of doing the things that the internet currently does.
It’s a source of wry amusement to me that so many people seem to have forgotten that the internet doesn’t actually do very much that’s new. Long before the internet, people were reading the news, publishing essays and stories, navigating through unfamiliar neighborhoods, sharing photos of kittens with their friends, ordering products from faraway stores for home delivery, looking at pictures of people with their clothes off, sending anonymous hate-filled messages to unsuspecting recipients, and doing pretty much everything else that they do on the internet today. For the moment, doing these things on the internet is cheaper and more convenient than the alternatives, and that’s what makes the internet so popular. If that changes—if the internet becomes more costly and less convenient than other options—its current popularity is unlikely to last.
Let’s start by looking at the costs. Every time I’ve mentioned the future of the internet on this blog, I’ve gotten comments and emails from readers who think that the price of their monthly internet service is a reasonable measure of the cost of the internet as a whole. For a useful corrective to this delusion, talk to people who work in data centers. You’ll hear about trucks pulling up to the loading dock every single day to offload pallet after pallet of brand new hard drives and other components, to replace those that will burn out that same day. You’ll hear about power bills that would easily cover the electricity costs of a small city. You’ll hear about many other costs as well. Data centers are not cheap to run, there are many thousands of them, and they’re only one part of the vast infrastructure we call the internet: by many measures, the most gargantuan technological project in the history of our species.
Your monthly fee for internet service covers only a small portion of what the internet costs. Where does the rest come from? That depends on which part of the net we’re discussing. The basic structure is paid for by internet service providers (ISPs), who recoup part of the costs from your monthly fee, part from the much larger fees paid by big users, and part by advertising. Content providers use some mix of advertising, pay-to-play service fees, sales of goods and services, packaging and selling your personal data to advertisers and government agencies, and new money from investors and loans to meet their costs. The ISPs routinely make a modest profit on the deal, but many of the content providers do not. Amazon may be the biggest retailer on the planet, for example, and its cash flow has soared in recent years, but its expenses have risen just as fast, and it rarely makes a profit. Many other content provider firms, including fish as big as Twitter, rack up big losses year after year.
How do they stay in business? A combination of vast amounts of investment money and ultracheap debt. That’s very common in the early decades of a new industry, though it’s been made a good deal easier by the Fed’s policy of next-to-zero interest rates. Investors who dream of buying stock in the next Microsoft provide venture capital for internet startups, banks provide lines of credit for existing firms, the stock and bond markets snap up paper of various kinds churned out by internet businesses, and all that money goes to pay the bills. It’s a reasonable gamble for the investors; they know perfectly well that a great many of the firms they’re funding will go belly up within a few years, but the few that don’t will either be bought up at inflated prices by one of the big dogs of the online world, or will figure out how to make money and then become big dogs themselves.
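A back-of-envelope calculation shows how that arrangement works in practice. The company and every figure below are invented for illustration; nothing here refers to any actual firm’s books:

```python
# Hypothetical example: how long an unprofitable content provider can keep
# the lights on with investor cash. All figures are invented.

def quarters_of_runway(cash_on_hand, quarterly_revenue, quarterly_expenses):
    """Quarters until the cash runs out, absent new funding or a profit."""
    burn = quarterly_expenses - quarterly_revenue
    if burn <= 0:
        return float("inf")  # actually profitable: no runway limit
    return cash_on_hand / burn

# Say $500M raised, $80M a quarter coming in, $120M a quarter going out:
print(quarters_of_runway(500, 80, 120))  # 12.5 quarters, about three years
```

Three years is plenty of time to get bought out or to land another round of funding, so long as the investors stay optimistic and the cheap credit keeps flowing.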
Notice, though, that this process has an unexpected benefit for ordinary internet users: a great many services are available for free, because venture-capital investors and lines of credit are footing the bill for the time being. Boosting the number of page views and clickthroughs is far more important for the future of an internet company these days than making a profit, and so the usual business plan is to provide plenty of free goodies to the public without worrying about the financial end of things. That’s very convenient just now for internet users, but it fosters the illusion that the internet costs nothing.
As mentioned earlier, this sort of thing is very common in the early decades of a new industry. As the industry matures, markets become saturated, startups become considerably riskier, and venture capital heads for greener pastures.  Once this happens, the companies that dominate the industry have to stay in business the old-fashioned way, by earning a profit, and that means charging as much as the market will bear, monetizing services that are currently free, and cutting service to the lowest level that customers will tolerate. That’s business as usual, and it means the end of most of the noncommercial content that gives the internet so much of its current role in popular culture.
All other things being equal, in other words, the internet can be expected to follow the usual trajectory of a maturing industry, becoming more expensive, less convenient, and more tightly focused on making a quick buck with each passing year. Governments have already begun to tax internet sales, removing one of the core “stealth subsidies” that boosted the internet at the expense of other retail sectors, and taxation of the internet will only increase as cash-starved officials contemplate the tidal waves of money sloshing back and forth online. None of these changes will kill the internet, but they’ll slap limits on the more utopian fantasies currently burbling about the web, and provide major incentives for individuals and businesses to back away from the internet and do things in the real world instead.
Then there’s the increasingly murky world of online crime, espionage, and warfare, which promises to push very hard in the same direction in the years ahead.  I think most people are starting to realize that on the internet, there’s no such thing as secure data, and the costs of conducting business online these days include a growing risk of having your credit cards stolen, your bank accounts looted, your identity borrowed for any number of dubious purposes, and the files on your computer encrypted without your knowledge, so that you can be forced to pay a ransom for their release—this latter, or so I’ve read, is the latest hot new trend in internet crime.
Online crime is one of the few fields of criminal endeavor in which raw cleverness is all you need to make out, as the saying goes, like a bandit. In the years ahead, as a result, the internet may look less like an information superhighway and more like one of those grim inner city streets where not even the muggers go alone. Trends in online espionage and warfare are harder to track, but either or both could become a serious burden on the internet as well.
Online crime, espionage, and warfare aren’t going to kill the internet, any more than the ordinary maturing of the industry will. Rather, they’ll lead to a future in which costs of being online are very often greater than the benefits, and the internet is by and large endured rather than enjoyed. They’ll also help drive the inevitable rebound away from the net. That’s one of those things that always happens and always blindsides the cheerleaders of the latest technology: a few decades into its lifespan, people start to realize that they liked the old technology better, thank you very much, and go back to it. The rebound away from the internet has already begun, and will only become more visible as time goes on, making a great many claims about the future of the internet look as absurd as those 1950s articles insisting that in the future, every restaurant would inevitably be a drive-in.
To be sure, the resurgence of live theater in the wake of the golden age of movie theaters didn’t end cinema, and the revival of bicycling in the aftermath of the automobile didn’t make cars go away. In the same way, the renewal of interest in offline practices and technologies isn’t going to make the internet go away. It’s simply going to accelerate the shift of avant-garde culture away from an increasingly bleak, bland, unsafe, and corporate- and government-controlled internet and into alternative venues. That won’t kill the internet, though once again it will put a stone marked R.I.P. atop the grave of a lot of the utopian fantasies that have clustered around today’s net culture.
All other things being equal, in fact, there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
But all other things aren’t equal.
Radio and television, like most of the other familiar technologies that define life in a modern industrial society, were born and grew to maturity in an expanding economy. The internet, by contrast, was born during the last great blowoff of the petroleum age—the last decades of the twentieth century, during which the world’s industrial nations took the oil reserves that might have cushioned the transition to sustainability, and blew them instead on one last orgy of over-the-top conspicuous consumption—and it’s coming to maturity in the early years of an age of economic contraction and ecological blowback.
The rising prices, falling service quality, and relentless monetization of a maturing industry, together with the increasing burden of online crime and the inevitable rebound away from internet culture, will thus be hitting the internet in a time when the global economy no longer has the slack it once did, and the immense costs of running the internet in anything like its present form will have to be drawn from a pool of real wealth that has many other demands on it. What’s more, quite a few of those other demands will be far more urgent than the need to provide consumers with a convenient way to send pictures of kittens to their friends. That stark reality will add to the pressure to monetize internet services, and provide incentives to those who choose to send their kitten pictures by other means.
It’s crucial to remember here, as noted above, that the internet is simply a cheaper and more convenient way of doing things that people were doing long before the first website went live, and a big part of the reason why it’s cheaper and more convenient right now is that internet users are being subsidized by the investors and venture capitalists who are funding the internet industry. That’s not the only subsidy on which the internet depends, though. Along with the rest of industrial society, it’s also subsidized by half a billion years of concentrated solar energy in the form of fossil fuels.  As those deplete, the vast inputs of energy, labor, raw materials, industrial products, and other forms of wealth that sustain the internet will become increasingly expensive to provide, and ways of distributing kitten pictures that don’t require the same inputs will prosper in the resulting competition.
There are also crucial issues of scale. Most pre-internet communications and information technologies scale down extremely well. A community of relatively modest size can have its own public library, its own small press, its own newspaper, and its own radio station running local programming, and could conceivably keep all of these functioning and useful even if the rest of humanity suddenly vanished from the map. Internet technology doesn’t have that advantage. It’s orders of magnitude more complex and expensive than a radio transmitter, not to mention such venerable technologies as printing presses and card catalogs; what’s more, on the scale of a small community, the benefits of using internet technology instead of simpler equivalents wouldn’t come close to justifying the vast additional cost.
Now of course the world of the future isn’t going to consist of a single community surrounded by desolate wasteland. That’s one of the reasons why the demise of the internet won’t happen all at once. Telecommunications companies serving some of the more impoverished parts of rural America are already letting their networks in those areas degrade, since income from customers doesn’t cover the costs of maintenance.  To my mind, that’s a harbinger of the internet’s future—a future of uneven decline punctuated by local and regional breakdowns, some of which will be fixed for a while.
That said, it’s quite possible that there will still be an internet of some sort fifty years from now. It will connect government agencies, military units, defense contractors, and the handful of universities that survive the approaching implosion of the academic industry here in the US, and it may provide email and a few other services to the very rich, but it will otherwise have a lot more in common with the original ARPANET than with the 24/7 virtual cosmos imagined by today’s more gullible netheads.
Unless you’re one of the very rich or an employee of one of the institutions just named, furthermore, you won’t have access to the internet of 2065.  You might be able to hack into it, if you have the necessary skills and are willing to risk a long stint in a labor camp, but unless you’re a criminal or a spy working for the insurgencies flaring in the South or the mountain West, there’s not much point to the stunt. If you’re like most Americans in 2065, you live in Third World conditions without regular access to electricity or running water, and you’ve got other ways to buy things, find out what’s going on in the world, find out how to get to the next town and, yes, look at pictures of people with their clothes off. What’s more, in a deindustrializing world, those other ways of doing things will be cheaper, more resilient, and more useful than reliance on the baroque intricacies of a vast computer net.
Exactly when the last vestiges of the internet will sputter to silence is a harder question to answer. Long before that happens, though, it will have lost its current role as one of the poster children of the myth of perpetual progress, and turned back into what it really was all the time: a preposterously complex way to do things most people have always done by much simpler means, which only seemed to make sense during that very brief interval of human history when fossil fuels were abundant and cheap.
***In other news, I’m pleased to announce that the third anthology of deindustrial SF stories from this blog’s “Space Bats” contest, After Oil 3: The Years of Rebirth, is now available in print and e-book formats. Those of my readers who’ve turned the pages of the two previous After Oil anthologies already know that this one has a dozen eminently readable and thought-provoking stories about the world on the far side of the Petroleum Age; the rest of you—why, you’re in for a treat. Those who are interested in contributing to the next After Oil anthology will find the details here.

A Field Guide to Negative Progress

Wed, 2015-04-22 17:23
I've commented before in these posts that writing is always partly a social activity. What Mortimer Adler used to call the Great Conversation, the dance of ideas down the corridors of the centuries, shapes every word in a writer’s toolkit; you can hardly write a page in English without drawing on a shade of meaning that Geoffrey Chaucer, say, or William Shakespeare, or Jane Austen first put into the language. That said, there’s also a more immediate sense in which any writer who interacts with his or her readers is part of a social activity, and one of the benefits came my way just after last week’s post.
That post began with a discussion of the increasingly surreal quality of America’s collective life these days, and one of my readers—tip of the archdruidical hat to Anton Mett—had a fine example to offer. He’d listened to an economic report on the media, and the talking heads were going on and on about the US economy’s current condition of, ahem, “negative growth.” Negative growth? Why yes, that’s the opposite of growth, and it’s apparently quite a common bit of jargon in economics just now.
Of course the English language, as used by the authors named earlier among many others, has no shortage of perfectly clear words for the opposite of growth. “Decline” comes to mind; so does “decrease,” and so does “contraction.” Would it have been so very hard for the talking heads in that program, or their many equivalents in our economic life generally, to draw in a deep breath and actually come right out and say “The US economy has contracted,” or “GDP has decreased,” or even “we’re currently in a state of economic decline”? Come on, economists, you can do it!
But of course they can’t. Economists in general are supposed to provide, shall we say, negative clarity when discussing certain aspects of contemporary American economic life, and talking heads in the media are even more subject to this rule than most of their peers. Among the things about which they’re supposed to be negatively clear, two are particularly relevant here; the first is that economic contraction happens, and the second is that letting too much of the national wealth end up in too few hands is a very effective way to cause economic contraction. The logic here is uncomfortably straightforward—an economy that depends on consumer expenditures only prospers if consumers have plenty of money to spend—but talking about that equation would cast an unwelcome light on the culture of mindless kleptocracy entrenched these days at the upper end of the US socioeconomic ladder. So we get to witness the mass production of negative clarity about one of the main causes of negative growth.
It’s entrancing to think of other uses for this convenient mode of putting things. I can readily see it finding a role in health care—“I’m sorry, ma’am,” the doctor says, “but your husband is negatively alive;” in sports—“Well, Joe, unless the Orioles can cut down that negative lead of theirs, they’re likely headed for a negative win;” and in the news—“The situation in Yemen is shaping up to be yet another negative triumph for US foreign policy.” For that matter, it’s time to update one of the more useful proverbs of recent years: what do you call an economist who makes a prediction? Negatively right.
Come to think of it, we might as well borrow the same turn of phrase for the subject of last week’s post, the deliberate adoption of older, simpler, more independent technologies in place of today’s newer, more complex, and more interconnected ones. I’ve been talking about that project so far under the negatively mealy-mouthed label “intentional technological regress,” but hey, why not be cool and adopt the latest fashion? For this week, at least, we’ll therefore redefine our terms a bit, and describe the same thing as “negative progress.” Since negative growth sounds like just another kind of growth, negative progress ought to pass for another kind of progress, right?
With this in mind, I’d like to talk about some of the reasons that individuals, families, organizations, and communities, as they wend their way through today’s cafeteria of technological choices, might want to consider loading up their plates with a good hearty helping of negative progress.
Let’s start by returning to one of the central points raised here in earlier posts, the relationship between progress and the production of externalities. By and large, the more recent a technology is, the more of its costs aren’t paid by the makers or the users of the technology, but are pushed off onto someone else. As I pointed out in a post two months ago, this isn’t accidental; quite the contrary, it’s hardwired into the relationship between progress and market economics, and bids fair to play a central role in the unraveling of the entire project of industrial civilization.
The same process of increasing externalities, though, has another face when seen from the point of view of the individual user of any given technology. When you externalize any cost of a technology, you become dependent on whoever or whatever picks up the cost you’re not paying. What’s more, you become dependent on the system that does the externalizing, and on whoever controls that system. Those dependencies aren’t always obvious, but they impose costs of their own, some financial and some less tangible. What’s more, unlike the externalized costs, a great many of these secondary costs land directly on the user of the technology.
It’s interesting, and may not be entirely accidental, that there’s no commonly used term for the entire structure of externalities and dependencies that stand behind any technology. Such a term is necessary here, so for the present purpose,  we’ll call the structure just named the technology’s externality system. Given that turn of phrase, we can restate the point about progress made above: by and large, the more recent a technology is, the larger the externality system on which it depends.
An example will be useful here, so let’s compare the respective externality systems of a bicycle and an automobile. Like most externality systems, these divide up more or less naturally into three categories: manufacture, maintenance, and use. Everything that goes into fabricating steel parts, for instance, all the way back to the iron ore in the mine, is an externality of manufacture; everything that goes into making lubricating oil, all the way back to drilling the oil well, is an externality of maintenance; everything that goes into building roads suitable for bikes and cars is an externality of use.
Both externality systems are complex, and include a great many things that aren’t obvious at first glance. The point I want to make here, though, is that the car’s externality system is far and away the more complex of the two. In fact, the bike’s externality system is a subset of the car’s, and this reflects the specific historical order in which the two technologies were developed. When the technologies that were needed for a bicycle’s externality system came into use, the first bicycles appeared; when all the additional technologies needed for a car’s externality system were added onto that foundation, the first cars followed. That sort of incremental addition of externality-generating technologies is far and away the most common way that technology progresses.
We can thus restate the pattern just analyzed in a way that brings out some of its less visible and more troublesome aspects: by and large, each new generation of technology imposes more dependencies on its users than the generation it replaces. Again, a comparison between bicycles and automobiles will help make that clear. If you want to ride a bike, you’ve committed yourself to dependence on all the technical, economic, and social systems that go into manufacturing, maintaining, and using the bike; you can’t own, maintain, and ride a bike without the steel mills that produce the frame, the chemical plants that produce the oil you squirt on the gears, the gravel pits that provide raw material for roads and bike paths, and so on.
On the other hand, you’re not dependent on a galaxy of other systems that provide the externality system for your neighbor who drives. You don’t depend on the immense network of pipelines, tanker trucks, and gas stations that provide him with fuel; you don’t depend on the interstate highway system or the immense infrastructure that supports it; if you did the sensible thing and bought a bike that was made by a local craftsperson, your dependence on vast multinational corporations and all of their infrastructure, from sweatshop labor in Third World countries to financial shenanigans on Wall Street, is considerably smaller than that of your driving neighbor. Every dependency you have, your neighbor also has, but not vice versa.
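Since the point is easy to miss, a schematic sketch may help. The dependency lists below are deliberately simplified stand-ins of my own devising for the far longer real ones; what matters is the subset relation, not the inventory:

```python
# Schematic only: a handful of stand-ins for the much longer real lists.

bicycle_deps = {
    "steel mills", "rubber plantations", "lubricant refining",
    "gravel pits and road building",
}

car_deps = bicycle_deps | {
    "oil wells, pipelines, and gas stations",
    "interstate highways and their upkeep",
    "auto finance and insurance",
}

# Every dependency the cyclist has, the driver has as well...
assert bicycle_deps < car_deps  # ...but not the other way around.
print(car_deps - bicycle_deps)  # the driver's extra dependencies
```

The strict-subset test passes precisely because, as noted above, the car’s externality system was built by piling new dependencies on top of the bicycle’s.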
Whether or not these dependencies matter is a complex thing. Obviously there’s a personal equation—some people like to be independent, others are fine with being just one more cog in the megamachine—but there’s also a historical factor to consider. In an age of economic expansion, the benefits of dependency very often outweigh the costs; standards of living are rising, opportunities abound, and it’s easy to offset the costs of any given dependency. In a stable economy, one that’s neither growing nor contracting, the benefits and costs of any given dependency need to be weighed carefully on a case by case basis, as one dependency may be worth accepting while another costs more than it’s worth.
On the other hand, in an age of contraction and decline—or, shall we say, negative expansion?—most dependencies are problematic, and some are lethal. In a contracting economy, as everyone scrambles to hold onto as much as possible of the lifestyles of a more prosperous age, your profit is by definition someone else’s loss, and dependency is just another weapon in the Hobbesian war of all against all. By many measures, the US economy has been contracting since before the bursting of the housing bubble in 2008; by some—in particular, the median and modal standards of living—it’s been contracting since the 1970s, and the unmistakable hissing sound as air leaks out of the fracking bubble just now should be considered fair warning that another round of contraction is on its way.
With that in mind, it’s time to talk about the downsides of dependency.
First of all, dependency is expensive. In the struggle for shares of a shrinking pie in a contracting economy, turning any available dependency into a cash cow is an obvious strategy, and one that’s already very much in play. Consider the conversion of freeways into toll roads, an increasingly popular strategy in large parts of the United States. Consider, for that matter, the soaring price of health care in the US, which hasn’t been accompanied by any noticeable increase in quality of care or treatment outcomes. In the dog-eat-dog world of economic contraction, commuters and sick people are just two of many captive populations whose dependencies make them vulnerable to exploitation. As the spiral of decline continues, it’s safe to assume that any dependency that can be exploited will be exploited, and the more dependencies you have, the more likely you are to be squeezed dry.
The same principle applies to power as well as money; thus, whoever owns the systems on which you depend, owns you. In the United States, again, laws meant to protect employees from abusive behavior on the part of employers are increasingly ignored; as the number of the permanently unemployed keeps climbing year after year, employers know that those who still have jobs are desperate to keep them, and will put up with almost anything in order to keep that paycheck coming in. The old adage about the inadvisability of trying to fight City Hall has its roots in this same phenomenon; no matter what rights you have on paper, you’re not likely to get far with them when the other side can stop picking up your garbage and then fine you for creating a public nuisance, or engage in some other equally creative use of their official prerogatives. As decline accelerates, expect to see dependencies increasingly used as levers for exerting various kinds of economic, political, and social power at your expense.
Finally, and crucially, if you’re dependent on a failing system, when the system goes down, so do you. That’s not just an issue for the future; it’s a huge if still largely unmentioned reality of life in today’s America, and in most other corners of the industrial world as well. Most of today’s permanently unemployed got that way because the job on which they depended for their livelihood got offshored or automated out of existence; much of the rising tide of poverty across the United States is a direct result of the collapse of political and social systems that once countered the free market’s innate tendency to drive the gap between rich and poor to Dickensian extremes. For that matter, how many people who never learned how to read a road map are already finding themselves in random places far from help because something went wrong with their GPS units?
It’s very popular among those who recognize the problem with being shackled to a collapsing system to insist that it’s a problem for the future, not the present. They grant that dependency is going to be a losing bet someday, but everything’s fine for now, so why not enjoy the latest technological gimmickry while it’s here? Of course that presupposes that you enjoy the latest technological gimmickry, which isn’t necessarily a safe bet, and it also ignores the first two difficulties with dependency outlined above, which are very much present and accounted for right now. We’ll let both those issues pass for the moment, though, because there’s another factor that needs to be included in the calculation.
A practical example, again, will be useful here. In my experience, it takes around five years of hard work, study, and learning from your mistakes to become a competent vegetable gardener. If you’re transitioning from buying all your vegetables at the grocery store to growing them in your backyard, in other words, you need to start gardening about five years before your last trip to the grocery store. The skill and hard work that goes into growing vegetables is one of many things that most people in the world’s industrial nations externalize, and those things don’t just pop back to you when you leave the produce section of the store for the last time. There’s a learning curve that has to be undergone.
Not that long ago, there used to be a subset of preppers who grasped the fact that a stash of cartridges and canned wieners in a locked box at their favorite deer camp cabin wasn’t going to get them through the downfall of industrial civilization, but hadn’t factored in the learning curve. Businesses targeting the prepper market thus used to sell these garden-in-a-box kits, which had seed packets for vegetables, a few tools, and a little manual on how to grow a garden. It’s a good thing that Y2K, 2012, and all those other dates when doom was supposed to arrive turned out to be wrong, because I met a fair number of people who thought that having one of those kits would save them even though they last grew a plant from seed in fourth grade. If the apocalypse had actually arrived, survivors a few years later would have gotten used to a landscape scattered with empty garden-in-a-box kits, overgrown garden patches, and the skeletal remains of preppers who starved to death because the learning curve lasted just that much longer than they did.
The same principle applies to every other set of skills that has been externalized by people in today’s industrial society, and will be coming back home to roost as economic contraction starts to cut into the viability of our externality systems. You can adopt them now, when you have time to get through the learning curve while there’s still an industrial society around to make up for the mistakes and failures that are inseparable from learning, or you can try to adopt them later, when those same inevitable mistakes and failures could very well land you in a world of hurt. You can also adopt them now, when your dependencies haven’t yet been used to empty your wallet and control your behavior, or you can try to adopt them later, when a much larger fraction of the resources and autonomy you might have used for the purpose will have been extracted from you by way of those same dependencies.
This is a point I’ve made in previous posts here, but it applies with particular force to negative progress—that is, to the deliberate adoption of older, simpler, more independent technologies in place of the latest, dependency-laden offerings from the corporate machine. As decline—or, shall we say, negative growth—becomes an inescapable fact of life in postprogress America, decreasing your dependence on sprawling externality systems is going to be an essential tactic.
Those who become early adopters of the retro future, to use an edgy term from last week’s post, will have at least two, and potentially three, significant advantages. The first, as already noted, is that they’ll be much further along the learning curve by the time rising costs, increasing instabilities, and cascading systems failures either put the complex technosystems out of reach or push the relationship between costs and benefits well over into losing-proposition territory. The second is that as more people catch onto the advantages of older, simpler, more sustainable technologies, surviving examples will become harder to find and more expensive to buy; in this case as in many others, collapsing first ahead of the rush is, among other things, the more affordable option.
The third advantage? Depending on exactly which old technologies you happen to adopt, and whether or not you have any talent for basement-workshop manufacture and the like, you may find yourself on the way to a viable new career as most other people will be losing their jobs—and their shirts. As the global economy comes unraveled and people in the United States lose their current access to shoddy imports from Third World sweatshops, there will be a demand for a wide range of tools and simple technologies that still make sense in a deindustrializing world. Those who already know how to use such technologies will be prepared to teach others how to use them; those who know how to repair, recondition, or manufacture those technologies will be prepared to barter, or to use whatever form of currency happens to replace today’s mostly hallucinatory forms of money, to good advantage.
My guess, for what it’s worth, is that salvage trades will be among the few growth industries in the 21st century, and the crafts involved in turning scrap metal and antique machinery into tools and machines that people need for their homes and workplaces will be an important part of that economic sector. To understand how that will work, though, it’s probably going to be necessary to get a clearer sense of the way that today’s complex technostructures are likely to come apart. Next week, with that in mind, we’ll spend some time thinking about the unthinkable—the impending death of the internet.

The Retro Future

Wed, 2015-04-15 18:16
Is it just me, or has the United States taken yet another great leap forward into the surreal over the last few days? Glancing through the news, I find another round of articles babbling about how fracking has guaranteed America a gaudy future as a petroleum and natural gas exporter. Somehow none of these articles get around to mentioning that the United States is a major net importer of both commodities, that most of the big-name firms in the fracking industry have been losing money at a rate of billions a year since the boom began, and that the pileup of bad loans to fracking firms is pushing the US banking industry into a significant credit crunch, but that’s just par for the course nowadays.
Then there’s the current tempest in the media’s teapot, Hillary Clinton’s presidential run. I’ve come to think of Clinton as the Khloe Kardashian of American politics, since she owed her original fame to the mere fact that she’s related to someone else who once caught the public eye. Since then she’s cycled through various roles because, basically, that’s what Famous People do, and the US presidency is just the next reality-TV gig on her bucket list. I grant that there’s a certain wry amusement to be gained from watching this child of privilege, with the help of her multimillionaire friends, posturing as a champion of the downtrodden, but I trust that none of my readers are under the illusion that this rhetoric will amount to anything more than all that chatter about hope and change eight years ago.
Let us please be real: whoever mumbles the oath of office up there on the podium in 2017, whether it’s Clinton or the interchangeably Bozoesque figures currently piling one by one out of the GOP’s clown car to contend with her, we can count on more of the same: more futile wars, more giveaways to the rich at everyone else’s expense, more erosion of civil liberties, more of all the other things Obama’s cheerleaders insisted back in 2008 he would stop as soon as he got into office.  As Arnold Toynbee pointed out a good many years ago, one of the hallmarks of a nation in decline is that the dominant elite sinks into senility, becoming so heavily invested in failed policies and so insulated from the results of its own actions that nothing short of total disaster will break its deathgrip on the body politic.
While we wait for the disaster in question, though, those of us who aren’t part of the dominant elite and aren’t bamboozled by the spectacle du jour might reasonably consider what we might do about it all. By that, of course, I don’t mean that it’s still possible to save industrial civilization in general, and the United States in particular, from the consequences of their history. That possibility went whistling down the wind a long time ago. Back in 2005, the Hirsch Report showed that any attempt to deal with the impending collision with the hard ecological limits of a finite planet had to get under way at least twenty years before the peak of global conventional petroleum production, if there was to be any chance of avoiding massive disruptions. As it happens, 2005 also marked the peak of conventional petroleum production worldwide, which may give you some sense of the scale of the current mess.
Consider, though, what happened in the wake of that announcement. Instead of dealing with the hard realities of our predicament, the industrial world panicked and ran the other way, with the United States well in the lead. Strident claims that ethanol—er, solar—um, biodiesel—okay, wind—well, fracking, then—would provide a cornucopia of cheap energy to replace the world’s rapidly depleting reserves of oil, coal, and natural gas took the place of a serious energy policy, while conservation, the one thing that might have made a difference, was as welcome as garlic aioli at a convention of vampires.
That stunningly self-defeating response had a straightforward cause, which was that everyone except a few of us on the fringes treated the whole matter as though the issue was how the privileged classes of the industrial world could maintain their current lifestyles on some other resource base.  Since that question has no meaningful answer, questions that could have been answered—for example, how do we get through the impending mess with at least some of the achievements of the last three centuries intact?—never got asked at all. At this point, as a result, ten more years have been wasted trying to come up with answers to the wrong question, and most of the  doors that were still open in 2005 have been slammed shut by events since that time.
Fortunately, there are still a few possibilities for constructive action open even this late in the game. More fortunate still, the ones that will likely matter most don’t require Hillary Clinton, or any other member of America’s serenely clueless ruling elite, to do something useful for a change. They depend, rather, on personal action, beginning with individuals, families, and local communities and spiraling outward from there to shape the future on wider and wider scales.
I’ve talked about two of these possibilities at some length in posts here. The first can be summed up simply enough in a cheery sentence:  “Collapse now and avoid the rush!”  In an age of economic contraction—and behind the current facade of hallucinatory paper wealth, we’re already in such an age—nothing is quite so deadly as the attempt to prop up extravagant lifestyles that the real economy of goods and services will no longer support. Those who thrive in such times are those who downshift ahead of the economy, take the resources that would otherwise be wasted on attempts to sustain the unsustainable, and apply them to the costs of transition to less absurd ways of living. The acronym L.E.S.S.—“Less Energy, Stuff, and Stimulation”—provides a good first approximation of the direction in which such efforts at controlled collapse might usefully move.
The point of this project isn’t limited to its advantages on the personal scale, though these are fairly substantial. It’s been demonstrated over and over again that personal example is far more effective than verbal rhetoric at laying the groundwork for collective change. A great deal of what keeps so many people pinned in the increasingly unsatisfying and unproductive lifestyles sold to them by the media is simply that they can’t imagine a better alternative. Those people who collapse ahead of the rush and demonstrate that it’s entirely possible to have a humane and decent life on a small fraction of the usual American resource footprint are already functioning as early adopters; with every month that passes, I hear from more people—especially young people in their teens and twenties—who are joining them, and helping to build a bridgehead to a world on the far side of the impending crisis.
The second possibility is considerably more complex, and resists summing up so neatly. In a series of posts here  in 2010 and 2011, and then in my book Green Wizardry, I sketched out the toolkit of concepts and approaches that were central to the appropriate technology movement back in the 1970s, where I had my original education in the subjects central to this blog. I argued then, and still believe now, that by whatever combination of genius and sheer dumb luck, the pioneers of that movement managed to stumble across a set of approaches to the work of sustainability that are better suited to the needs of our time than anything that’s been proposed since then.
Among the most important features of what I’ve called the “green wizardry” of appropriate tech is the fact that those who want to put it to work don’t have to wait for the Hillary Clintons of the world to lift a finger. Millions of dollars in government grants and investment funds aren’t necessary, or even particularly useful. From its roots in the Sixties counterculture, the appropriate tech scene inherited a focus on do-it-yourself projects that could be done with hand tools, hard work, and not much money. In an age of economic contraction, that makes even more sense than it did back in the day, and the ability to keep yourself and others warm, dry, fed, and provided with many of the other needs of life without potentially lethal dependencies on today’s baroque technostructures has much to recommend it.
Nor, it has to be said, is appropriate tech limited to those who can afford a farm in the country; many of the most ingenious and useful appropriate tech projects were developed by and for people living in ordinary homes and apartments, with a small backyard or no soil at all available for gardening. The most important feature of appropriate tech, though, is that the core elements of its toolkit—intensive organic gardening and small-scale animal husbandry, homescale solar thermal technologies, energy conservation, and the like—are all things that will still make sense long after the current age of fossil fuel extraction has gone the way of the dinosaurs. Getting these techniques into as many hands as possible now is thus not just a matter of cushioning the impacts of the impending era of crisis; it’s also a way to start building the sustainable world of the future right now.
Those two strategies, collapsing ahead of the rush and exploring the green wizardry of appropriate technology, have been core themes of this blog for quite a while now. There’s a third project, though, that I’ve so far explored here only in a more abstract context, and it’s time to talk about how it can be applied to some of the most critical needs of our time.
In the early days of this blog, I pointed out that technological progress has a feature that’s not always grasped by its critics, much less by those who’ve turned faith in progress into the established religion of our time. Very few new technologies actually meet human needs that weren’t already being met, and so the arrival of a new technology generally leads to the abandonment of an older technology that did the same thing. The difficulty here is that new technologies nowadays are inevitably more dependent on global technostructures, and the increasingly brittle and destructive economic systems that support them, than the technologies they replace. New technologies look more efficient than old ones because more of the work is being done somewhere else, and can therefore be ignored—for now.
This is the basis for what I’ve called the externality trap. As technologies get more complex, that complexity allows more of their costs to be externalized—that is to say, pushed onto someone other than the makers or users of the technology. The pressures of a market economy guarantee that those economic actors who externalize more of their costs will prosper at the expense of those who externalize less. The costs thus externalized, though, don’t go away; they get passed from hand to hand like hot potatoes and finally pile up in the whole systems—the economy, the society, the biosphere itself—that have no voice in economic decisions, but are essential to the prosperity and survival of every economic actor, and sooner or later those whole systems will break down under the burden.  Unlimited technological progress in a market economy thus guarantees the economic, social, and/or environmental destruction of the society that fosters it.
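The dynamic is simple enough to put in a toy simulation. The model below is my own illustration, and all the parameters are arbitrary; what matters is the shape of the outcome, not the numbers. Firms that externalize more of their costs outcompete their rivals and get imitated, while the dumped costs pile up in the commons until it gives way:

```python
# Toy simulation of the externality trap. All parameters are arbitrary;
# the point is the dynamic, not the numbers.
import random

random.seed(1)
firms = [{"ext": random.random(), "wealth": 0.0} for _ in range(20)]
commons_burden, carrying_capacity = 0.0, 500.0

for step in range(1000):
    for firm in firms:
        cost = 1.0
        # Externalizing part of the cost fattens the firm's own margin...
        firm["wealth"] += 2.0 - cost * (1 - firm["ext"])
        # ...while the externalized share lands on the whole system.
        commons_burden += cost * firm["ext"]
    # Market selection: the least profitable firm imitates the most profitable.
    firms.sort(key=lambda firm: firm["wealth"])
    firms[0]["ext"] = firms[-1]["ext"]
    if commons_burden > carrying_capacity:
        print(f"whole-system breakdown at step {step}")
        break
```

Run it with whatever seed you like; the average externalization rate ratchets upward, and the burden on the commons climbs faster and faster until the threshold is crossed. The only way out of the trap is to stop rewarding the externalizers, which is exactly what a market economy, left to itself, has no way to do.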
The externality trap isn’t just a theoretical possibility. It’s an everyday reality, especially but not only in the United States and other industrial societies. There are plenty of forces driving the rising spiral of economic, social, and environmental disruption that’s shaking the industrial world right down to its foundations, but among the most important is precisely the unacknowledged impact of externalized costs on the whole systems that support the industrial economy. It’s fashionable these days to insist that increasing technological complexity and integration will somehow tame that rising spiral of crisis, but the externality trap suggests that exactly the opposite is the case—that the more complex and integrated technologies become, the more externalities they will generate. It’s precisely because technological complexity makes it easy to ignore externalized costs that progress becomes its own nemesis.
Yes, I know, suggesting that progress isn’t infallibly beneficent is heresy, and suggesting that progress will necessarily terminate itself with extreme prejudice is heresy twice over. I can’t help that; it so happens that in most declining civilizations, ours included, the things that most need to be said are the things that, by and large, nobody wants to hear. That being the case, I might as well make it three for three and point out that the externality trap is a problem rather than a predicament. The difference, as longtime readers know, is that problems can be solved, while predicaments can only be faced. We don’t have to keep loading an ever-increasing burden of externalized costs on the whole systems that support us—which is to say, we don’t have to keep increasing the complexity and integration of the technologies that we use in our daily lives. We can stop adding to the burden; we can even go the other way.
Now of course suggesting that, even thinking it, is heresy on the grand scale. I’m reminded of a bit of technofluff in the Canadian media a week or so back that claimed to present a radically pessimistic view of the next ten years. Of course it had as much in common with actual pessimism as lite beer has with a pint of good brown ale; the worst thing the author, one Douglas Coupland, is apparently able to imagine is that industrial society will keep on doing what it’s doing now—though the fact that more of what’s happening now apparently counts as radical pessimism these days is an interesting point, and one that deserves further discussion.
The detail of this particular Dystopia Lite that deserves attention here, though, is Coupland’s dogmatic insistence that “you can never go backward to a lessened state of connectedness.” That’s a common bit of rhetoric out of the mouths of tech geeks these days, to be sure, but it isn’t even remotely true. I know quite a few people who used to be active on social media and have dropped the habit. I know others who used to have allegedly smart phones and went back to ordinary cell phones, or even to a plain land line, because they found that the costs of excess connectedness outweighed the benefits. Technological downshifting is already a rising trend, and there are very good reasons for that fact.
Most people find out at some point in adolescence that there really is such a thing as drinking too much beer. I think a lot of people are slowly realizing that the same thing is true of connectedness, and of the other prominent features of today’s fashionable technologies. One of the data points that gives me confidence in that analysis is the way that people like Coupland angrily dismiss the possibility. Part of his display of soi-disant pessimism is the insistence that within a decade, people who don’t adopt the latest technologies will be dismissed as passive-aggressive control freaks. Now of course that label could be turned the other way just as easily, but the point I want to make here is that nobody gets that bent out of shape about behaviors that are mere theoretical possibilities. Clearly, Coupland and his geek friends are already contending with people who aren’t interested in conforming to the technosphere.
It’s not just geek technologies that are coming in for that kind of rejection, either. These days, in the town where I live, teenagers whose older siblings used to go hotdogging around in cars ten years ago are doing the same thing on bicycles today. Granted, I live in a down-at-the-heels old mill town in the north central Appalachians, but there’s more to it than that. For a lot of these kids, the costs of owning a car outweigh the benefits so drastically that cars aren’t cool any more. One consequence of that shift in cultural fashion is that these same kids aren’t contributing anything like so much to the buildup of carbon dioxide in the atmosphere, or to the other externalized costs generated by car ownership.
I’ve written here already about deliberate technological regression as a matter of public policy. Over the last few months, though, it’s become increasingly clear to me that deliberate technological regression as a matter of personal choice is also worth pursuing. Partly this is because the deathgrip of failed policies on the political and economic order of the industrial world, as mentioned earlier, is tight enough that any significant change these days has to start down here at the grassroots level, with individuals, families, and communities, if it’s going to get anywhere at all; partly, it’s because technological regression, like anything else that flies in the face of the media stereotypes of our time, needs the support of personal example in order to get a foothold; partly, it’s because older technologies, being less vulnerable to the impacts of whole-system disruptions, will still be there meeting human needs when the grid goes down, the economy freezes up, or something really does break the internet, and many of them will still be viable when the fossil fuel age is a matter for the history books.
Still, there’s another aspect, and it’s one that the essay by Douglas Coupland mentioned above managed to hit squarely: the high-tech utopia ballyhooed by the first generation or so of internet junkies has turned out in practice to be a good deal less idyllic, and in fact a good deal more dystopian, than its promoters claimed. All the wonderful things we were supposedly going to be able to do turned out in practice to consist of staring at little pictures on glass screens and pushing buttons, and these are not exactly the most interesting activities in the world, you know. The people who are dropping out of social media and ditching their allegedly smart phones for a less connected lifestyle have noticed this.
What’s more, a great many more people—the kids hotdogging on bikes here in Cumberland are among them—are weighing the costs and benefits of complex technologies with cold eyes, and deciding that an older, simpler technology less dependent on global technosystems is not just more practical, but also, and importantly, more fun. True believers in the transhumanist cyberfuture will doubtless object to that last point, but the deathgrip of failed ideas on societies in decline isn’t limited to the senile elites mentioned toward the beginning of this post; it can also afflict the fashionable intellectuals of the day, and make them proclaim the imminent arrival of the future’s rising waters when the tide’s already turned and is flowing back out to sea.
I’d like to suggest, in fact, that it’s entirely possible that we could be heading toward a future in which people will roll their eyes when they think of Twitter, texting, 24/7 connectivity, and the rest of today’s overblown technofetishism—like, dude, all that stuff is so twenty-teens! Meanwhile, those of us who adopt the technologies and habits of earlier eras, whether that adoption is motivated by mere boredom with little glass screens or by some more serious set of motives, may actually be on the cutting edge: the early adopters of the Retro Future. We’ll talk about that more in the weeks ahead.

The Burden of Denial

Wed, 2015-04-08 16:29
It occurred to me the other day that quite a few of the odder features of contemporary American culture make perfect sense if you assume that everybody knows exactly what’s wrong and what’s coming as our society rushes, pedal to the metal, toward its face-first collision with the brick wall of the future. It’s not that they don’t get it; they get it all too clearly, and they just wish that those of us on the fringes would quit reminding them of the imminent impact, so they can spend whatever time they’ve got left in as close to a state of blissful indifference as they can possibly manage.  
I grant that this realization probably had a lot to do with the context in which it came to me. I was sitting in a restaurant, as it happens, with a vanload of fellow Freemasons. We’d carpooled down to Baltimore, some of us to receive one of the higher degrees of Masonry and the rest to help with the ritual work, and we stopped for dinner on the way back home. I’ll spare you the name of the place we went; it was one of those currently fashionable beer-and-burger joints where the waitresses have all been outfitted with skirts almost long enough to cover their underwear, bare midriffs, and the sort of push-up bras that made them look uncomfortably like inflatable dolls—an impression that their too obviously scripted jiggle-and-smile routines did nothing to dispel.
Still, that wasn’t the thing that made the restaurant memorable. It was the fact that every wall in the place had television screens on it. By this I don’t mean that there was one screen per wall; I mean that they were lined up side by side right next to each other, covering the upper part of every single wall in the place, so that you couldn’t raise your eyes above head level without looking at one. They were all over the interior partitions of the place, too. There must have been forty of them in one not too large restaurant, each one blaring something different into the thick air, while loud syrupy music spattered down on us from speakers on the ceiling and the waitresses smiled mirthlessly and went through their routines. My burger and fries were tolerably good, and two tall glasses of Guinness will do much to ameliorate even so charmless a situation; still, I was glad to get back on the road.
The thing I’d point out is that all this is quite recent. Not that many years ago, it was tolerably rare to see a TV screen in an American restaurant, and even those bars that had a television on the premises for the sake of football season generally had the grace to leave the thing off the rest of the time. Within the last decade, I’ve watched televisions sprout in restaurants and pubs I used to enjoy, for all the world like buboes on the body of a plague victim: first one screen, then several, then one on each wall, then metastasizing across the remaining space. Meanwhile, along the same lines, people who used to go to coffee shops and the like to read the papers, talk with other patrons, or do anything else you care to name are now sitting in the same coffee shops in total silence, hunched over their allegedly smart phones like so many scowling gargoyles on the walls of a medieval cathedral.
Yes, there were people in the restaurant crouched in the gargoyle pose over their allegedly smart phones, too, and that probably also had something to do with my realization that evening.  It so happens that the evening before my Baltimore trip, I’d recorded a podcast interview with Chris Martenson on his Peak Prosperity show, and he’d described to me a curious response he’d been fielding from people who attended his talks on the end of the industrial age and the unwelcome consequences thereof. He called it “the iPhone moment”—the point at which any number of people in the audience pulled that particular technological toy out of their jacket pockets and waved it at him, insisting that its mere existence somehow disproved everything he was saying.
You’ve got to admit, as modern superstitions go, this one is pretty spectacular. Let’s take a moment to look at it rationally. Do iPhones produce energy? Nope. Will they refill our rapidly depleting oil and gas wells, restock the ravaged oceans with fish, or restore the vanishing topsoil from the world’s fields? Of course not. Will they suck carbon dioxide from the sky, get rid of the vast mats of floating plastic that clog the seas, or do something about the steadily increasing stockpiles of nuclear waste that are going to sicken and kill people for the next quarter of a million years unless the waste gets put someplace safe—if there is anywhere safe to put it at all? Not a chance. As a response to any of the predicaments that are driving the crisis of our age, iPhones are at best irrelevant. Since they consume energy and resources, and the sprawling technosystems that make them function consume energy and resources at a rate orders of magnitude greater, they’re part of the problem, not any sort of a solution.
Now of course the people waving their iPhones at Chris Martenson aren’t thinking about any of these things. A good case could be made that they’re not actually thinking at all. Their reasoning, if you want to call it that, seems to be that the existence of iPhones proves that progress is still happening, and this in turn somehow proves that progress will inevitably bail us out from the impacts of every one of the predicaments we face. To call this magical thinking is an insult to honest sorcerers; rather, it’s another example of the arbitrary linkage of verbal noises to emotional reactions that all too often passes for thinking in today’s America. Readers of classic science fiction may find all this weirdly reminiscent of a scene from some edgily updated version of H.G. Wells’ The Island of Doctor Moreau: “Not to doubt Progress: that is the Law. Are we not Men?”
Seen from a certain perspective, though, there’s a definite if unmentionable logic to “the iPhone moment,” and it has much in common with the metastatic spread of television screens across pubs and restaurants in recent years. These allegedly smart phones don’t do anything to fix the rising spiral of problems besetting industrial civilization, but they make it easier for people to distract themselves from those problems for a little while longer. That, I’d like to suggest, is also what’s driving the metastasis of television screens in the places that people used to go to enjoy a meal, a beer, or a cup of coffee and each other’s company. These days, the latter’s too risky; somebody might mention a friend who lost his job and can’t get another one, a spouse who gets sicker with each overpriced prescription the medical industry pushes on her, a kid who didn’t come back from Afghanistan, or the like, and then it’s right back to the reality that everyone’s trying to avoid. It’s much easier to sit there in silence staring at little colored pictures on a glass screen, from which all such troubles have been excluded.
Of course that habit has its own downsides. To begin with, those who are busy staring at the screens have to know, on some level, that sooner or later it’s going to be their turn to lose their jobs, or have their health permanently wrecked by the side effects their doctors didn’t get around to telling them about, or have their kids fail to come back from whatever America’s war du jour happens to be just then, or the like. That’s why so many people these days put so much effort into insisting as loudly as possible that the poor and vulnerable are to blame for their plight. The people who say this know perfectly well that it’s not true, but repeating such claims over and over again is the only defense they’ve got against the bitter awareness that their jobs, their health, and their lives or those of the people they care about could all too easily be next on the chopping block.
What makes this all the more difficult for most Americans to face is that none of these events are happening in a vacuum.  They’re part of a broader process, the decline and fall of modern industrial society in general and the United States of America in particular. Outside the narrowing circles of the well-to-do, standards of living for most Americans have been declining since the 1970s, along with standards of education, public health, and most of the other things that make for a prosperous and stable society. Today, a nation that once put human bootprints on the Moon can’t afford to maintain its roads and bridges or keep its cities from falling into ruin. Hiding from that reality in an imaginary world projected onto glass screens may be comforting in the short term; the mere fact that realities don’t go away just because they’re ignored does nothing to make this choice any less tempting.
What’s more, the world into which that broader process of decline is bringing us is not one in which staring at little colored pictures on a glass screen will count for much. Quite the contrary, it promises to be a world in which raw survival, among other things, will depend on having achieved at least a basic mastery of one or more of a very different range of skills. There’s no particular mystery about those latter skills; they were, in point of fact, the standard set of basic human survival skills for thousands of years before those glass screens were invented, and they’ll still be in common use when the last of the glass screens has weathered away into sand; but they have to be learned and practiced before they’re needed, and there may not be all that much time left to learn and practice them before hard necessity comes knocking at the door.
I think a great many people who claim that everything’s fine are perfectly aware of all this. They know what the score is; it’s doing something about it that’s the difficulty, because taking meaningful action at this very late stage of the game runs headlong into at least two massive obstacles. One of them is practical in nature, the other psychological, and human nature being what it is, the psychological dimension is far and away the more difficult of the two.
Let’s deal with the practicalities first. The non-negotiable foundation of any meaningful response to the crisis of our time, as I’ve pointed out more than once here, can be summed up conveniently with the acronym L.E.S.S.—that is, Less Energy, Stuff, and Stimulation. We are all going to have much less of these things at our disposal in the future.  Using less of them now frees up time, money, and other resources that can be used to get ready for the inevitable transformations. It also makes for decreased dependence on systems and resources that in many cases are already beginning to fail, and in any case will not be there indefinitely in a future of hard limits and inevitable scarcities.
On the other hand, using L.E.S.S. flies in the face of two powerful forces in contemporary culture. The first is the ongoing barrage of advertising meant to convince people that they can’t possibly be happy without the latest time-, energy-, and resource-wasting trinket that corporate interests want to push on them. The second is the stark shivering terror that seizes most Americans at the thought that anybody might think that they’re poorer than they actually are. Americans like to think of themselves as proud individualists, but like so many elements of the American self-image, that’s an absurd fiction; these days, as a rule, Americans are meek conformists who shudder with horror at the thought that they might be caught straying in the least particular from whatever other people expect of them.
That’s what lies behind the horrified response that comes up the moment someone suggests that using L.E.S.S. might be a meaningful part of our response to the crises of our age. When people go around insisting that not buying into the latest overhyped and overpriced lump of technogarbage is tantamount to going back to the caves—and yes, I field such claims quite regularly—you can tell that what’s going on in their minds has nothing to do with the realities of the situation and everything to do with stark unreasoning fear. Point out that a mere thirty years ago, people got along just fine without email and the internet, and you’re likely to get an even more frantic and abusive reaction, precisely because your listener knows you’re right and can’t deal with the implications.
This is where we get into the psychological dimension. What James Howard Kunstler has usefully termed the psychology of previous investment is a massive cultural force in today’s America. The predicaments we face today are in very large part the product of a long series of really bad decisions that were made over the last four decades or so. Most Americans, even those who had little to do with making those decisions, enthusiastically applauded them, and treated those who didn’t with no small amount of abuse and contempt. Admitting just how misguided those decisions turned out to be thus requires a willingness to eat crow that isn’t exactly common among Americans these days. Thus there’s a strong temptation to double down on the bad decisions, wave those iPhones in the air, and put a few more television screens on the walls to keep the cognitive dissonance at bay for a little while longer.
That temptation isn’t an abstract thing. It rises out of the raw emotional anguish woven throughout America’s attempt to avoid looking at the future it’s made for itself. The intensity of that anguish can be measured most precisely, I think, in one small but telling point: the number of people whose final response to the lengthening shadow of the future is, “I hope I’ll be dead before it happens.”
Think about those words for a moment. It used to be absolutely standard, and not only in America, for people of every social class below the very rich to work hard, save money, and do without so that their children could have a better life than they had. That parents could say to their own children, “I got mine, Jack; too bad your lives are going to suck,” belonged in the pages of lurid dime novels, not in everyday life. Yet that’s exactly what the words “I hope I’ll be dead before it happens” imply.  The destiny that’s overtaking the industrial world isn’t something imposed from outside; it’s not an act of God or nature or callous fate; rather, it’s unfolding with mathematical exactness from the behavior of those who benefit from the existing order of things.  It could be ameliorated significantly if those same beneficiaries were to let go of the absurd extravagance that characterizes what passes for a normal life in the modern industrial world these days—it’s just that the act of letting go involves an emotional price that few people are willing to pay.
Thus I don’t think that anyone says “I hope I’ll be dead before it happens” lightly. I don’t think the people who are consigning their own children and grandchildren to a ghastly future, and placing their last scrap of hope on the prospect that they themselves won’t live to see that future arrive, are making that choice out of heartlessness or malice. The frantic concentration on glass screens, the bizarre attempts to banish unwelcome realities by waving iPhones in their faces, and the other weird behavior patterns that surround American society’s nonresponse to its impending future, are signs of the enormous strain that so many Americans these days are under as they try to keep pretending that nothing is wrong in the teeth of the facts.
Denying a reality that’s staring you in the face is an immensely stressful process, and the stress gets worse as the number of things that have to be excluded from awareness mounts up. These days, that list is getting increasingly long. Look away from the pictures on the glass screens, and the United States is visibly a nation in rapid decline: its cities collapsing, its infrastructure succumbing to decades of malign neglect, its politics mired in corruption and permanent gridlock, its society frayed to breaking, and the natural systems that support its existence passing one tipping point after another and lurching through chaotic transitions.
Oklahoma has passed California as the most seismically active state in the Union as countless gallons of fracking fluid pumped into deep disposal wells remind us that nothing ever really “goes away.” It’s no wonder that so many shrill voices these days are insisting that nothing is wrong, or that it’s all the fault of some scapegoat or other, or that Jesus or the Space Brothers or somebody will bail us out any day now, or that we’re all going to be wiped out shortly by some colorful Hollywood cataclysm that, please note, is never our fault.
There is, of course, another option.
Over the years since this blog first began to attract an audience, I’ve spoken to quite a few people who broke themselves out of that trap, or were popped out of it willy-nilly by some moment of experience just that little bit too forceful to yield to the exclusionary pressure; many of them have talked about how the initial burst of terror—no, no, you can’t say that, you can’t think that!—gave way to an immense feeling of release and freedom, as the burden of keeping up the pretense dropped away and left them able to face the world in front of them at last.
I suspect, for what it’s worth, that a great many more people are going to be passing through that transformative experience in the years immediately ahead. A majority? Almost certainly not; to judge by historical precedents, the worse things get, the more effort will go into the pretense that nothing is wrong at all, and the majority will cling like grim death to that pretense until it drags them under. That said, a substantial minority might make a different choice: to let go of the burden of denial soon enough to matter, to let themselves plunge through those moments of terror and freedom, and to haul themselves up, shaken but alive, onto the unfamiliar shores of the future.
When they get there, there will be plenty of work for them to do. I’ve discussed some of the options in previous posts on this blog, but there’s at least one that hasn’t gotten a detailed examination yet, and it’s one that I’ve come to think may be of crucial importance in the decades ahead. We’ll talk about that next week.

Atlantis Won't Sink, Experts Agree

Wed, 2015-04-01 17:32
If you’re like most Atlanteans these days, you’ve heard all sorts of unnerving claims about the future of our continent. Some people are even saying that recent earth tremors are harbingers of a cataclysm that will plunge Atlantis to the bottom of the sea. Those old prophecies from the sacred scrolls of the Sun Temple have had the dust blown off them again, adding to the stew of rumors.
So is there anything to it? Should you be worried about the future of Atlantis?
Not according to the experts. I visited some of the most widely respected hierarchs here in the City of the Golden Gates yesterday to ask them about the rumors, and they assured me that there’s no reason to take the latest round of alarmist claims at all seriously.
***
My first stop was the temple complex of black orichalcum just outside the Palace of the Ten Kings, where Nacil Buper, Grand Priestess of the Temple of Night, took time out of her busy schedule to meet with me. I asked her what she thought about the rumors of imminent catastrophe. “Complete and utter nonsense,” she replied briskly. “There are always people who want to insist that the end is nigh, and they can always find something to use to justify that sort of thing. Remember a few years ago, when everyone was running around insisting that the end of the Forty-First Grand Cycle of Time was going to bring the destruction of the world? This is more of the same silliness.”
Just at that moment, the floor shook beneath us, and I asked her about the earth tremors, pointing out that those seem to be more frequent than they were just a few years back.
“Atlantis has always had earthquakes,” the Grand Priestess reminded me, gesturing with her scepter of human bone.  “There are natural cycles affecting their frequency, and there’s no proof that they’re more frequent because of anything human beings are doing. In fact, I’m far from convinced that they’re any more frequent than they used to be. There are serious questions about whether the priests of the Sun Temple have been fiddling with their data, you know.”
“And the claim from those old prophecies that offering human sacrifices to Mu-Elortep, Lord of Evil, might have something to do with it?” I asked. 
“That’s the most outrageous kind of nonsense,” the Grand Priestess replied. “Atlanteans have been worshipping the Lord of Evil for more than a century and a half. It’s one of the foundations of our society and our way of life, and we should be increasing the number of offerings to Mu-Elortep as rapidly as we can, not listening to crazies from the fringe who insist that there’s something wrong with slaughtering people for the greater glory of the Lord of Evil. We can’t do without Mu-Elortep, not if we’re going to restore Atlantis to full prosperity and its rightful place in the world order, and if that means sacrifices have to be made—and it does—then sacrifices need to be made.”
She leaned forward confidentially, and her necklace of infant’s skulls rattled. “You know as well as I do that all this is just another attempt by the Priests of the Sun to dodge their responsibility for their own bad policies. Nobody would care in the least about all these crazy rumors of imminent doom if the Sun Priest Erogla hadn’t made such a fuss about the old prophecies in the scrolls of the Sun Temple a few years back. The Sun Temple’s the real problem we face. Fortunately, though, we of the Temple of Night have a majority in the Council of the Ten Kings now. We’re working on legislation right now to eradicate poverty in Atlantis by offering up the poor to Mu-Elortep in one grand bonfire. Once that’s done, I’m convinced, Atlantis will be on the road to a full recovery.”
***
After my conversation with the Grand Priestess, I went uphill to the foot of the Sacred Mountain, where the Sun Temple rises above the golden-roofed palaces of the Patricians of Atlantis. I had made an appointment to see Tarc Omed, the Hierophant of the Priests of the Sun; he met me in his private chamber, and had his servants pour us purple wine from Valusia as we talked.
“I know the kind of thing you must have heard from the Temple of Night,” the Hierophant said wearily. “It’s all our fault the economy’s in trouble. Everything’s our fault. That’s how they avoid responsibility for the consequences of the policies they’ve been pursuing for decades now.”
I asked him what he thought of Nacil Buper’s claim that offering up the poor as human sacrifices would solve all the problems Atlantis faces these days.
“Look,” he said, “everybody knows that we’ve got to wean ourselves off making human sacrifices to the Lord of Evil one of these days. There’s no way we can keep that up indefinitely, and it’s already causing measurable problems. That’s why we’re proposing increased funding for more sustainable forms of worship directed toward other deities, so we can move step by step to a society that doesn’t have to engage in human sacrifice or deal with Mu-Elortep at all.”
And the ground tremors? Do they have anything to do with the sacrifices?
“That’s a good question. It’s hard to say whether any particular burst of tremors is being caused by the prophesied curse, you know, but that’s no reason for complacency.”
A tremor shook the room, and we both steadied our golden goblets of wine on the table. “Doesn’t that lend support to the rumors that Atlantis might sink soon?” I asked.
Tarc Omed looked weary again, and leaned back in his great chair of gold and ivory. “We have to be realistic,” he said. “Right now, Atlantean society depends on human sacrifice, and transitioning away from that isn’t something we can do overnight. We need to get those more sustainable forms of worship up and running first, and that can’t be done without negotiated compromises and the support of as many stakeholders as possible. Alarmism doesn’t further that.”
I thought of one of the things Nacil Buper had said. “But aren’t the prophecies of doom we’re discussing right there in the sacred scrolls of the Sun Temple?”
“We don’t consider that relevant just now,” the Hierophant told me firmly. “What matters right at the moment is to build a coalition strong enough to take back a majority in the Council of the Ten Kings, stop the Temple of Night’s crazy plan to sacrifice all of the poor to Mu-Elortep, and make sure that human sacrifices are conducted in as painless and sanitary a fashion as possible and increased only at the rate that’s really necessary, while we work toward phasing out human sacrifice altogether. Of course we can’t continue on our current path, but I have faith that Atlanteans can and will work together to stop any sort of worst-case scenario from happening.”
***
From the Temple of the Sun I walked out of the patrician district, into one of the working class neighborhoods overlooking the Old Harbor. The ground shook beneath my feet a couple of times as I went. People working in the taverns and shops looked up at the Sacred Mountain each time, and then went back to their labor. It made me feel good to know that their confidence was shared by both the hierarchs I’d just interviewed.
I decided to do some person-in-the-street interviews for the sake of local color, and stepped into one of the taverns. Introducing myself to the patrons as a reporter, I asked what they thought about the rumors of disaster and the ongoing earth tremors.
“Oh, I’m sure the Priests of the Sun will think of something,” one patron said. I wrote that down on my wax tablet.
“Yeah,” agreed another. “How long have these prophecies been around? And Atlantis is still above water, isn’t it? I’m not worried.”
“I used to believe that stuff back in the day,” said a third patron. “You know, you buy into all kinds of silly things when you’re young and gullible, then grow out of it once it’s time to settle down and deal with the real world.  I sure did.”
That got nods and murmurs of approval all around. “I honestly think a lot of the people who are spreading these rumors actually want Atlantis to sink,” the third patron went on. “All this obsessing about those old prophecies and how awful human sacrifice is—I mean, can we get real, please?”
“You can say that again,” said the second patron. “I bet they do want Atlantis to sink. I bet they’re actually Lemurian sympathizers.”
The third patron turned to look at him.  “You know, that would make a lot of sense—”
Just then another tremor, a really strong one, shook the tavern. The whole room went dead silent for a moment. As the tremor died down, everybody started talking loudly all at once. I said my goodbyes and headed for the door.
As I stopped outside to put my wax tablet into the scribe’s case on my belt, one of the other patrons—a woman who hadn’t said anything—came through the door after me. “If you’re looking for a different point of view,” she told me, “you ought to go down to the Sea Temple. They’ll give you an earful.”
I thanked her, and started downhill toward the Old Harbor.
***
I’d never been to the Sea Temple before; I don’t think most Atlanteans ever go there, though it’s been right there next to the Old Harbor since time out of mind. When I got there, the big doors facing the harbor were wide open, but the place seemed empty; the only sounds were the flapping of the big blue banners above the temple and the cries of sea birds up overhead.
As another tremor rattled the city, I walked in through the open doors. I didn’t see anyone at first, but after a few moments a woman in the blue robes of a Sea Priestess came out of the sanctuary further inside and hurried toward me. She had a basket of scrolls in her arms.
I introduced myself, explained that I was a journalist, and asked if she minded answering some questions.
“Not if you don’t mind walking with me to the harbor,” she said. “I’m in a bit of a hurry.”
“Sure,” I told her. “So what do you think about all these scary rumors? Do you really think Atlantis could end up underwater?”
We left the temple and started across the plaza outside, toward the harbor. “Have you read the prophecies of Emor Fobulc?” she asked me.
“Can’t say I have.”
“They predicted everything that’s happened: the rise of the cult of Mu-Elortep, the sacrifices, the earth tremors, and now the Sign.”
“The what?”
“When’s the last time you looked at the top of the Sacred Mountain?”
I stopped and looked right then. There was a plume of smoke rising from the great rounded peak. After a moment, I hurried to catch up to her.
“That’s the Sign,” she told me. “It means that the fires of Under-Earth have awakened and Atlantis will soon be destroyed.”
“Seriously?”
“Seriously.”
I thought about it for a moment as we walked, and the ground shook beneath our feet. “There could be plenty of other explanations for that smoke, you know.”
The priestess looked at me for a long moment. “No doubt,” she said dryly. 
By then we were near the edge of the quay, and half a dozen people came hurrying down the gangplank from a ship that was tied up there, an old-fashioned sailing vessel with a single mast and the prow carved to look like a swan. One of them, a younger priestess, bowed, took the basket of scrolls, and hurried back on board the ship. Another, who was dressed like a mariner, bowed too, and said to the priestess I’d spoken with, “Is there anything else, Great Lady?”
“Nothing,” she said. “We should go.” She turned to me. “You may come with us if you wish.”
“I need to get this story back to the pressroom before things shut down this afternoon,” I told her. “Are you going to be coming back within two hours or so?”
I got another of her long silent looks. “No,” she said. “We’ll be much longer than that.”
“Sorry, then—I hate to turn down a cruise, but work is work.”
She didn’t have anything to say to that, and the others more or less bundled her up the gangplank onto the ship. A couple of sailors untied the cables holding the ship against the quay and then climbed on board before it drifted away. A few minutes later the ship was pulling out into the Old Harbor; I could hear the oarsmen belowdecks singing one of their chanteys while the sailors climbed aloft and got the sail unfurled and set to the breeze.
After a few more minutes, I turned and started back up the hill toward the middle of town. As I climbed the slope, I could see more and more of the City of the Golden Gates around me in the afternoon sun: the Palace of the Ten Kings with the Temple of Night beside it, the Sun Temple and the golden roofs of the patricians’ palaces higher up the slope. The ground was shaking pretty much nonstop, but I barely noticed it, I’d gotten so used to the tremors.
The view got better as I climbed. Below, the Old Harbor spread out to one side and the New Harbor to the other. Next to the New Harbor was the charnel ground of Elah-Slio, where smoke was rising from the altars and long lines of victims were being driven forward with whips to be offered up as sacrifices to Mu-Elortep; off the other way, beyond the Old Harbor, I spotted twenty or so sails in the middle distance, heading away from Atlantis, and the ship with the priestess on it hurrying to join them.
That’s when it occurred to me that the Sea Priestess couldn’t have been serious when she said that Atlantis would soon be destroyed. Surely, if the prophecies were true, the Sea Priestesses would have had more important things to do than go on some kind of long vacation cruise. I laughed at how gullible I’d been there for a moment, and kept climbing the hill into the sunlight.
Above the Sacred Mountain, the cloud of smoke had gotten much bigger, and it looked as though some kind of red glow was reflecting off the bottom of it. I wondered what that meant, but figured I’d find out from the news soon enough. It certainly made me feel good to know that there was no reason whatever to worry about the far-fetched notion that Atlantis might end up at the bottom of the sea.

(Note: due to a date-linked transtemporal anomaly, this week’s planned Archdruid Report post got switched with a passage from the Swenyliad, an Atlantean chronicle dating from 9613 BCE. We apologize for any inconvenience.)

Planet of the Space Bats

Wed, 2015-03-25 17:16
As my regular readers know, I’ve been talking for quite a while now here about the speculative bubble that’s built up around the fracking phenomenon, and the catastrophic bust that’s guaranteed to follow so vast and delusional a boom. Over the past six months or so, I’ve noted the arrival of one warning sign after another of the impending crash. As the saying has it, though, it’s not over ‘til the fat lady sings, so I’ve been listening for the first notes of the metaphorical aria that, in the best Wagnerian style, will rise above the orchestral score as the fracking industry’s surrogate Valhalla finally bursts into flames and goes crashing down into the Rhine.
 
I think I just heard those first high notes, though, in an improbable place: the email inbox of the Ancient Order of Druids in America (AODA), the Druid order I head.
I have no idea how many of my readers know the first thing about my unpaid day job as chief executive—the official title is Grand Archdruid—of one of the two dozen or so Druid orders in the western world. Most of what goes into that job, and the admittedly eccentric minority religious tradition behind it, has no relevance to the present subject. Still, I think most people know that Druids revere the natural world, and take ecology seriously even when that requires scrapping some of the absurd extravagances that pass for a normal lifestyle these days. Thus a Druid order is arguably the last place that would come to mind if you wanted to sell stock in a fracking company.
Nonetheless, that’s what happened. The bemused AODA office staff the other day fielded a solicitation from a stock firm trying to get Druids to invest their assets in the fracking industry.
Does that sound like a desperation move to you, dear reader? It certainly does to me—and there’s good reason to think that it probably sounds that way to the people who are trying to sell shares in fracking firms to one final round of clueless chumps, too. A recent piece in the Wall Street Journal (available outside the paywall here) noted that American banks have suddenly found themselves stuck with tens of millions of dollars’ worth of loans to fracking firms which they hoped to package up and sell to investors—but suddenly nobody’s buying. Bankruptcies and mass layoffs are becoming an everyday occurrence in the fracking industry, and the price of oil continues to lurch down as producers maximize production for the sake of immediate cash flow.
Why, though, isn’t the drop in the price of oil being met by an upsurge in consumption that drives the price back up, as the accepted rules of economics would predict? That’s the cream of the jest. Here in America, and to a lesser extent elsewhere in the industrial world, four decades of enthusiastically bipartisan policies that benefited the rich at everyone else’s expense managed to prove Henry Ford’s famous argument: if you don’t pay your own employees enough that they can afford to buy your products, sooner or later, you’re going to go broke.
By driving down wages and forcing an ever larger fraction of the US population into permanent unemployment and poverty, the movers and shakers of America’s political class have managed to trigger a classic crisis of overproduction, in which goods go begging for buyers because too few people can afford to buy them at any price that will pay for their production. It’s not just oil that’s affected, either: scores of other commodities are plunging in price as the global economy tips over into depression. There’s a specter haunting the industrial world; it’s the ghost of Karl Marx, laughing with mordant glee as the soi-disant masters of the universe, having crushed his misbegotten Soviet stepchildren, go all out to make his prophecy of capitalism’s self-immolation look remarkably prescient.
The soaring price of crude oil in the wake of the 2005 global peak of conventional oil production should have served notice to the industrial world that, to adapt the title of Richard Heinberg’s excellent 2003 summary of the situation, the party was over: the long era in which energy supplies had increased year over year was giving way to an unwelcome new reality in which decreasing energy supplies and increasing environmental blowback were the defining themes. As my readers doubtless noticed, though, the only people willing to grasp that were out here on the fringes where archdruids lurk. Closer to the mainstream of our collective thinking, most people scrunched shut their eyes, plugged their ears with their fingers, and shouted “La, la, la, I can’t hear you” at the top of their lungs, in a desperate attempt to keep reality from getting a word in edgewise.
For the last five years or so, any attempt to talk about the impending twilight of the age of oil thus ran headfirst into a flurry of pro-fracking propaganda. Fatuous twaddle about America’s inevitable future as the world’s new energy superpower took the place of serious discussions of the predicament into which we’ve backed ourselves—and not for the first time, either. That’s what makes the attempt to get Druids to invest their life savings in fracking so funny, in a bleak sort of way: it’s an attempt to do for the fracking boom what the fracking boom attempted to do for industrial civilization as a whole—to pretend, in the teeth of the facts, that the unsustainable can be sustained for just a little while longer.
A few months back, I decided to celebrate this sort of thinking by way of the grand old Druid custom of satire. The Great Squirrel Case Challenge of 2015 solicited mock proposals for solving the world’s energy problems that were even nuttier than the ones in the mainstream media. That was no small challenge—a detail some of my readers pointed up by forwarding any number of clueless stories from the mainstream media loudly praising energy boondoggles of one kind or another.
I’m delighted to say, though, that the response was even better than I’d hoped for. The contest fielded more than thirty entries, ranging from the merely very good to the sidesplittingly funny. There were two winners, one chosen by the members of the Green Wizards forum, one chosen by me; in both cases, it was no easy choice, and if I had enough author’s copies of my new book After Progress, I’d probably just up and give prizes to all the entries, they were that good. Still, it’s my honor to announce the winners:
My choice for best squirrel case—drumroll, please—goes to Steve Morgan, for his fine gosh-wow sales prospectus for, ahem, Shares of Hydrocarbons Imported from Titan. The Green Wizards forum choice—drumroll again—goes to Jason Heppenstall for his hilarious parody of a sycophantic media story, King Solomon’s Miners. Please join me in congratulating them. (Steve and Jason, drop me a comment with your mailing addresses, marked not for posting, and I’ll get your prizes on the way.)
Their hard-won triumph probably won’t last long. In the months and years ahead, I expect to see claims even more ludicrous being taken oh-so-seriously by the mainstream media, because the alternative is to face up to just how badly we’ve bungled the opportunities of the last four decades or so and just how rough a road we have ahead of us as a result. What gave the fracking bubble whatever plausibility it ever had, after all, was the way it fed on one of the faith-based credos at the heart of contemporary popular culture: the insistence, as pervasive as it is irrational, that the universe is somehow obligated to hand us abundant new energy sources to replace the ones we’ve already used so profligately. Lacking that blind faith, it would have been obvious to everyone—as it was to those of us in the peak oil community—that the fracking industry was scraping the bottom of the barrel and pretending that this proved the barrel was full.
Read the morning news with eyes freed from the deathgrip of the conventional wisdom and it’s brutally obvious that that’s what happened, and that the decline and fall of our civilization is well under way. Here in the US, a quarter of the country is in the fourth year of record drought, with snowpack on California’s Sierra Nevada mountains about 9% of normal; the Gulf Stream is slowing to a crawl due to the rapid melting of the Greenland ice sheets; permanent joblessness and grinding poverty have become pervasive in this country; the national infrastructure is coming apart after decades of malign neglect—well, I could go on; if you want to know what life is like in a falling civilization, go look out the window.
In the mainstream media, on the occasions when such things are mentioned at all, they’re treated as disconnected factoids irrelevant to the big picture. Most people haven’t yet grasped that these things are the big picture—that while we’re daydreaming about an assortment of shiny futures that look more or less like the present with more toys, climate change, resource depletion, collapsing infrastructure, economic contraction, and the implosion of political and cultural institutions are creating the future we’re going to inhabit. Too many of us suffer from a weird inability to imagine a future that isn’t simply a continuation of the present, even when such a future stands knocking at our own front doors.
So vast a failure of imagination can’t be overcome by the simple expedient of pointing out the ways that it’s already failed to explain the world in which we live. That said, there are other ways to break the grip of the conventional wisdom, and I’m pleased to say that one of those other ways seems to be making modest but definite headway just now.
Longtime readers here will remember that in 2011, this blog launched a contest for short stories about the kind of future we can actually expect—a future in which no deus ex machina saves industrial civilization from the exhaustion of its resource base, the deterioration of the natural systems that support it, and the normal process of decline and fall. That contest resulted in an anthology, After Oil: SF Stories of a Post-Petroleum Future, which found a surprisingly large audience. On the strength of its success, I ran a second contest in 2014, which resulted in two more volumes—After Oil 2: The Years of Crisis, which is now available, and After Oil 3: The Years of Rebirth, which is in preparation. Demand for the original volume has remained steady, and the second is selling well; after a conversation with the publisher, I’m pleased to announce that we’re going to do it again, with a slight twist.
The basic rules are mostly the same as before:
Stories should be between 2500 and 7500 words in length;
They should be entirely the work of their author or authors, and should not borrow characters or setting from someone else’s work;
They should be in English, with correct spelling, grammar and punctuation;
They should be stories—narratives with a plot and characters—and not simply a guided tour of some corner of the future as the author imagines it;
They should be set in our future, not in an alternate history or on some other planet;
They should be works of realistic fiction or science fiction, not magical or supernatural fantasy—that is, the setting and story should follow the laws of nature as those are presently understood;
They should take place in settings subject to thermodynamic, ecological, and economic limits to growth; and as before,
They must not rely on “alien space bats”—that is, dei ex machina inserted to allow humanity to dodge the consequences of the limits to growth. (Aspiring authors might want to read the whole “Alien Space Bats” post for a more detailed explanation of what I mean here; reading the stories from one or both of the published After Oil volumes might also be a good plan.)
This time, though, I’m adding an additional rule:
Stories submitted for this contest must be set at least one thousand years in the future—that is, after March 25, 3015 in our calendar.
That’s partly a reflection of a common pattern in entries for the two previous contests, and partly something deeper. The common pattern? A great many authors submitted stories that were set during or immediately after the collapse of industrial civilization; there’s certainly room for those, enough so that the entire second volume is basically devoted to them, but tales of surviving decline and fall are only a small fraction of the galaxy of potential stories that would fit within the rules listed above.  I’d like to encourage entrants to consider telling something different, at least this time.
The deeper dimension? That’s a reflection of the blindness of the imagination discussed earlier in this post, the inability of so many people to think of a future that isn’t simply a prolongation of the present. Stories set in the immediate aftermath of our civilization don’t necessarily challenge that, and I think it’s high time to start talking about futures that are genuinely other—neither utopia nor oblivion, but different, radically different, from the linear extrapolations from the present that fill so many people’s imaginations these days, and have an embarrassingly large role even in science fiction.
You have to read SF from more than a few decades back to grasp just how tight the grip of a single linear vision of the future has become on what used to be a much more freewheeling literature of ideas. In book after book, and even more in film after film, technologies that are obviously derived from ours, ideologies that are indistinguishable from ours, political and economic arrangements that could pass for ours, and attitudes and ideas that belong to this or that side of today’s cultural struggles get projected onto the future as though they’re the only imaginable options. This takes place even when there’s very good reason to think that the linear continuation of current trends isn’t an option at all—for example, the endlessly regurgitated, done-to-death trope of interstellar travel.
Let us please be real: we aren’t going to the stars—not in our lifetimes, not in the lifetime of industrial civilization, not in the lifetime of our species. There are equally good thermodynamic and economic reasons to believe that many of the other standard tropes of contemporary science fiction are just as unreachable—that, for example, limitless energy from gimmicks of the dilithium-crystal variety, artificial intelligences capable of human or superhuman thought, and the like belong to fantasy, not to the kind of science fiction that has any likelihood of becoming science fact. Any of my readers who want to insist that human beings can create anything they can imagine, by the way, are welcome to claim that, just as soon as they provide me with a working perpetual motion machine.
It’s surprisingly common to see people insist that the absence of the particular set of doodads common to today’s science fiction would condemn our descendants to a future of endless boredom. This attitude shows a bizarre stunting of the imagination—not least because stories about interstellar travel normally end up landing the protagonists in a world closely modeled on some past or present corner of the Earth. If our genus lasts as long as the average genus of vertebrate megafauna, we’ve got maybe ten million years ahead of us, or roughly two thousand times as long as all of recorded human history to date: more than enough time for human beings to come up with a dazzling assortment of creative, unexpected, radically different societies, technologies, and ways of facing the universe and themselves.
That’s what I’d like to see in submissions to this year’s Space Bats challenge—yes, it’ll be an annual thing from here on out, as long as the market for such stories remains lively. A thousand years from now, industrial civilization will be as far in the past as the Roman Empire was at the time of the Renaissance, and new human societies will have arisen to pass their own judgment on the relics of our age. Ten thousand years from now, or ten million? Those are also options. Fling yourself into the far future, far enough that today’s crises are matters for the history books, or tales out of ancient myth, or forgotten as completely as the crises and achievements of the Neanderthal people are today, and tell a story about human beings (or, potentially, post-human beings) confronting the challenges of their own time in their own way. Do it with verve and a good readable style, and your story may be one of the ones chosen to appear in the pages of After Oil 4: The Future’s Distant Shores.
The mechanics are pretty much the same as before. Write your story and post it to the internet—if you don’t have a blog, you can get one for free from Blogspot or Wordpress. Post a link to it in the comments to The Archdruid Report. You can write more than one story, but please let me know which one you want entered in the competition—there will be only one entry accepted per author this time. Stories must be written and posted online, and a link posted to this blog, by August 30, 2015 to be eligible for inclusion in the anthology.

The View From Outside

Wed, 2015-03-18 17:25
Recently I’ve been reacquainting myself with the stories of Clark Ashton Smith. Though he’s largely forgotten today, Smith was one of the leading lights of Weird Tales magazine during its 1930s golden age, ranking with H.P. Lovecraft and Robert Howard as a craftsman of fantasy fiction. Like Lovecraft, Howard, and most of the other authors in the Weird Tales stable, Smith was an outsider; he spent his life in a small town in rural California; he was roundly ignored by the literary scene of his day, and returned the favor with gusto. With the twilight of the pulps, Smith’s work was consigned to the dustbin of literary history. It was revived briefly during the fantasy boom of the 1970s, only to sink from sight again when the fantasy genre drowned in a swamp of faux-medieval clichés thereafter.
There’s no shortage of reasons to give Smith another look today, starting with his mastery of image and atmosphere and the wry humor that shaped the best of his mature work. Still, that’s a theme for another time, and possibly another forum. The theme that’s relevant to this blog is woven into one of Smith’s classic stories, The Dark Age. First published in 1938, it’s among the earliest science fiction stories I know of that revolve around an organized attempt to preserve modern science through a future age of barbarism.
The story’s worth reading in its own right, so I won’t hand out spoilers here. Still, I don’t think it will give away anything crucial to mention that one of the mainsprings of the story is the inability of the story’s scientists to find or make common ground with the neo-barbarian hill tribes around them. That aspect of the story has been much on my mind of late. Despite the rockets and rayguns that provide so much of its local color, science fiction is always about the present, which it displays in an unfamiliar light by showing a view from outside, from the distant perspective of an imaginary future.
That’s certainly true of Smith’s tale, which drew much of its force at the time of its composition from the widening chasm between the sciences and the rest of human culture that C.P. Snow discussed two decades later in his famous work “The Two Cultures.” That chasm has opened up a good deal further since Smith’s time, and its impact on the future deserves discussion here, not least because it’s starting to come into sight even through the myopic lenses of today’s popular culture.
I’m thinking here, for example, of a recent blog post by Scott Adams, the creator of the “Dilbert” comic strip. There’s a certain poetic justice in seeing popular culture’s acknowledged expert on organizational failure skewer one of contemporary science’s more embarrassing habits, but there’s more to the spectacle than a Dilbertesque joke. As Adams points out, there’s an extreme mismatch between the way that science works and the way that scientists expect their claims to be received by the general public. Within the community of researchers, the conclusions of the moment are, at least in theory, open to constant challenge—but only from within the scientific community.
The general public is not invited to take part in those challenges. Quite the contrary, it’s supposed to treat the latest authoritative pronouncement as truth pure and simple, even when that contradicts the authoritative pronouncements of six months before. Now of course there are reasons why scientists might not want to field a constant stream of suggestions and challenges from people who don’t have training in relevant disciplines, but the fact remains that expecting people to blindly accept whatever scientists say about nutrition, when scientific opinion on that subject has been whirling around like a weathercock for decades now, is not a strategy with a long shelf life. Sooner or later people start asking why they should take the latest authoritative pronouncement seriously, when so many others landed in the trash can of discarded opinions a few years further on.
There’s another, darker reason why such questions are increasingly common just now. I’m thinking here of the recent revelation that the British scientists tasked by the government with making dietary recommendations have been taking payola of various kinds from the sugar industry.  That’s hardly a new thing these days. Especially but not only in those branches of science concerned with medicine, pharmacology, and nutrition, the prostitution of the scientific process by business interests has become an open scandal. When a scientist gets behind a podium and makes a statement about the safety or efficacy of a drug, a medical treatment, or what have you, the first question asked by an ever-increasing number of people outside the scientific community these days is “Who’s paying him?”
It would be bad enough if that question was being asked because of scurrilous rumors or hostile propaganda. Unfortunately, it’s being asked because there’s nothing particularly unusual about the behavior of the British scientists mentioned above. These days, in any field where science comes into contact with serious money, scientific studies are increasingly just another dimension of marketing. From influential researchers being paid to put their names on dubious studies to give them unearned credibility to the systematic concealment of “outlying” data that doesn’t support the claims made for this or that lucrative product, the corruption of science is an ongoing reality, and one that existing safeguards within the scientific community are not effectively countering.
Scientists have by and large treated the collapse in scientific ethics as an internal matter. That’s a lethal mistake, because the view that matters here is the view from outside. What looks to insiders like a manageable problem that will sort itself out in time looks from outside the laboratory and the faculty lounge like institutionalized corruption on the part of a self-proclaimed elite whose members cover for each other and are accountable to no one. It doesn’t matter, by the way, how inaccurate that view is in specific cases, how many honest men and women are laboring at lab benches, or how overwhelming the pressure to monetize research that’s brought to bear on scientists by university administrations and corporate sponsors: none of that finds its way into the view from outside, and in the long run, the view from outside is the one that counts.
The corruption of science by self-interest is an old story, and unfortunately it’s most intense in those fields where science impacts the lives of nonscientists most directly:  yes, those would be medicine, pharmacology, and nutrition. I mentioned in an earlier blog post here a friend whose lifelong asthma, which landed her in the hospital repeatedly and nearly killed her twice, was cured at once by removing a common allergen from her diet. Mentioning this to her physician led to the discovery that he’d known about the allergy issue all along, but as he explained, “We prefer to medicate for that.” Understandably so, as a patient who’s cured of an ailment is a good deal less lucrative for the doctor than one who has to keep on receiving regular treatments and prescriptions—but as a result of that interaction among others, the friend in question has lost most of what respect she once had for mainstream medicine, and is now learning herbalism to meet her health care needs.
It’s an increasingly common story these days, and I could add plenty of other accounts here. The point I want to make, though, is that it’s painfully obvious that the physician who preferred to medicate never thought about the view from outside. I have no way of knowing what combination of external pressures and personal failings led him to conceal a less costly cure from my friend, and keep her on expensive and ineffective drugs with a gallery of noxious side effects instead, but from outside the walls of the office, it certainly looked like a callous betrayal of whatever ethics the medical profession might still have left—and again, the view from outside is the one that counts.
It counts because institutional science only has the authority and prestige it possesses today because enough of those outside the scientific community accept its claim to speak the truth about nature. Not that many years ago, all things considered, scientists didn’t have the authority or the prestige, and no law of nature or of society guarantees that they’ll keep either one indefinitely. Every doctor who would rather medicate than cure, every researcher who treats conflicts of interest as just another detail of business as usual, every scientist who insists in angry tones that nobody without a Ph.D. in this or that discipline is entitled to ask why this week’s pronouncement should be taken any more seriously than the one it just disproved—and let’s not even talk about the increasing, and increasingly public, problem of overt scientific fraud in the pharmaceutical field among others—is hastening the day when modern science is taken no more seriously by the general public than, say, academic philosophy is today.
That day may not be all that far away. That’s the message that should be read, and is far too rarely read, in the accelerating emergence of countercultures that reject the authority of science in one field or another. As a recent and thoughtful essay in Slate pointed out, that crisis of authority is what gives credibility to such movements as climate denialism and the anti-vaxxer movement (the growing number of parents who refuse to have their children vaccinated). A good many people these days, when the official voices of the scientific community say this or that, respond by asking “Why should we believe you?”—and too many of them don’t get a straightforward answer that addresses their concerns.
A bit of personal experience from a different field may be relevant here. Back in the late 1980s and early 1990s, when I lived in Seattle, I put a fair amount of time into collecting local folklore concerning ghosts and other paranormal phenomena. I wasn’t doing this out of any particular belief, or for that matter any particular unbelief; I was seeking a sense of the mythic terrain of the Puget Sound region, the landscapes of belief and imagination that emerged from the experiences of people on the land, with an eye toward the fiction-writing career I then hoped to launch. While I was doing this research, when something paranormal was reported anywhere in the region, I generally got to hear about it fairly quickly, and in the process I got to watch a remarkable sequence of events that repeated itself like a broken record in more cases than I can count.
Whether the phenomenon that was witnessed was an unusual light in the sky, a seven-foot-tall hairy biped in the woods, a visit from a relative who happened to be dead at the time, or what have you, two things followed promptly once the witness went public. The first was the arrival of a self-proclaimed skeptic, usually a member of CSICOP (the Committee for Scientific Investigation of Claims of the Paranormal), who treated the witness with scorn and condescension, made dogmatic claims about what must have happened, and responded to any disagreement with bullying and verbal abuse. The other thing that followed was the arrival of an investigator from one of the local paranormal-research organizations, who was invariably friendly and supportive, listened closely to the account of the witness, and took the incident seriously. I’ll let you guess which of the proposed explanations the witness usually ended up embracing, not to mention which organization he or she often joined.
The same process on a larger and far more dangerous scale is shaping attitudes toward science across a wide and growing sector of American society. Notice that unlike climate denialism, the anti-vaxxer movement isn’t powered by billions of dollars in funding, but it’s getting increasing traction anyway. The reason is as simple as it is painful: parents are asking physicians and scientists, “How do I know this substance you want to put into my child is safe?”—and the answers they’re getting are not providing them with the reassurance they need.
It’s probably necessary here to point out that I’m no fan of the anti-vaxxer movement. Since epidemic diseases are likely to play a massive role in the future ahead of us, I’ve looked into anti-vaxxer arguments with some care, and they don’t convince me at all. It’s clear from the evidence that vaccines do far more often than not provide protection against dangerous diseases; while some children are harmed by the side effects of vaccination, that’s true of every medical procedure, and the toll from side effects is orders of magnitude smaller than the annual burden of deaths from these same diseases in the pre-vaccination era.
Nor does the anti-vaxxer claim that vaccines cause autism hold water. (I have Asperger’s syndrome, so the subject is of some personal interest to me.)  The epidemiology of autism spectrum disorders simply doesn’t support that claim; to my educated-layperson’s eyes, at least, it matches that of an autoimmune disease instead, complete with the rapid increase in prevalence in recent years. The hypothesis I’d be investigating now, if I’d gone into biomedical science rather than the history of ideas, is that autism spectrum disorders are sequelae of an autoimmune disease that strikes in infancy or early childhood, and causes damage to any of a variety of regions in the central nervous system—thus the baffling diversity of neurological deficits found in those of us on the autism spectrum.
Whether that’s true or not will have to be left to trained researchers. The point that I want to make here is that I don’t share the beliefs that drive the anti-vaxxer movement. Similarly, I’m sufficiently familiar with the laws of thermodynamics and the chemistry of the atmosphere to know that when the climate denialists insist that dumping billions of tons of carbon dioxide into the atmosphere can’t change its capacity to retain heat, they’re smoking their shorts.  I’ve retained enough of a childhood interest in paleontology, and studied enough of biology and genetics since then, to be able to follow the debates between evolutionary biology and so-called “creation science,” and I’m solidly on Darwin’s side of the bleachers. I could go on; I have my doubts about a few corners of contemporary scientific theory, but then so do plenty of scientists.
That is to say, I don’t agree with the anti-vaxxers, the climate denialists, the creationists, or their equivalents, but I think I understand why they’ve rejected the authority of science, and it’s not because they’re ignorant cretins, much as the proponents and propagandists of science would like to claim that they are. It’s because they’ve seen far too much of the view from outside. Parents who encounter a medical industry that would rather medicate than heal are more likely to listen to anti-vaxxers; Americans who watch climate change activists demand that the rest of the world cut its carbon footprint, while the activists themselves get to keep cozy middle-class lifestyles, are more likely to believe that global warming is a politically motivated hoax; Christians who see atheists using evolution as a stalking horse for their ideology are more likely to turn to creation science—and all three, and others, are not going to listen to scientists who insist that they’re wrong, until and unless the scientists stop and take a good hard look at how they and their proclamations look when viewed from outside.
I’m far from sure that anybody in the scientific community is willing to take that hard look. It’s possible; these days, even committed atheists are starting to notice that whenever Richard Dawkins opens his mouth, twenty people who were considering atheism decide to give God a second chance. The arrogant bullying that used to be standard practice among the self-proclaimed skeptics and “angry atheists” has taken on a sullen and defensive tone recently, as though it’s started to sink in that yelling abuse at people who disagree with you might not be the best way to win their hearts and minds. Still, for that same act of reflection to get any traction in the scientific community, a great many people in that community are going to have to rethink the way they deal with the public, especially when science, technology, and medicine cause harm. That, in turn, is only going to happen if enough of today’s scientists remember the importance of the view from outside.
In the light of the other issues I’ve tried to discuss over the years in this blog, that view has another dimension, and it’s a considerably harsher one. Among the outsiders whose opinion of contemporary science matters most are some that haven’t been born yet: our descendants, who will inhabit a world shaped by science and the technologies that have resulted from scientific research. It’s still popular to insist that their world will be a Star Trek fantasy of limitlessness splashed across the galaxy, but I think most people are starting to realize just how unlikely that future actually is.
Instead, the most likely futures for our descendants are those in which the burdens left behind by today’s science and technology are much more significant than the benefits.  Those most likely futures will be battered by unstable climate and rising oceans due to anthropogenic climate change, stripped of most of the world's topsoil, natural resources, and ecosystems, strewn with the radioactive and chemical trash that our era produced in such abundance and couldn’t be bothered to store safely—and most of today’s advanced technologies will have long since rusted into uselessness, because the cheap abundant energy and other nonrenewable resources that were needed to keep them running all got used up in our time.
People living in such a future aren’t likely to remember that a modest number of scientists signed petitions and wrote position papers protesting some of these things. They’re even less likely to recall the utopian daydreams of perpetual progress and limitless abundance that encouraged so many other people in the scientific community to tell themselves that these things didn’t really matter—and if by chance they do remember those daydreams, their reaction to them won’t be pretty. That science today, like every other human institution in every age, combines high ideals and petty motives in the usual proportions will not matter to them in the least.
Unless something changes sharply very soon, their view from outside may well see modern science—all of it, from the first gray dawn of the scientific revolution straight through to the flamelit midnight when the last laboratory was sacked and burned by a furious mob—as a wicked dabbling in accursed powers that eventually brought down just retribution upon a corrupt and arrogant age. So long as the proponents and propagandists of science ignore the view from outside, and blind themselves to the ways that their own defense of science is feeding the forces that are rising against it, the bleak conclusion of the Clark Ashton Smith story cited at the beginning of this post may yet turn out to be far more prophetic than the comfortable fantasies of perpetual scientific advancement cherished by so many people today.
********
On a less bleak but not wholly unrelated subject, I’m pleased to announce that my forthcoming book After Progress is rolling off the printing press as I write this. There were a few production delays, and so it’ll be next month before orders from the publisher start being shipped; the upside to this is that the book can still be purchased for 20% off the cover price. I’m pretty sure that this book will offend people straight across the spectrum of acceptable opinion in today’s industrial society, so get your copy now, pop some popcorn, and get ready to enjoy the show.

The Prosthetic Imagination

Wed, 2015-03-11 19:16
Two news stories and an op-ed piece in the media in recent days provide a useful introduction to the theme of this week’s post here on The Archdruid Report. The first news story followed the announcement that the official unemployment rate here in the United States dropped to 5.5% last month. This was immediately hailed by pundits and politicians as proof that the recession we weren’t in is over at last, and the happy days that never went away are finally here again.
This jubilation makes perfect sense so long as you don’t happen to know that the official unemployment rate in the United States doesn’t actually count everyone who’s out of work. It includes only those who are jobless and have actively looked for work in the recent past; people who have given up the search, in many cases because their unemployment benefits ran out long ago, simply vanish from the figure. Right now there are a huge number of Americans in exactly that position: out of benefits, unable to find work, and uncounted as unemployed by the measure the US government prefers these days.  As far as officialdom is concerned, they are nonpersons in very nearly an Orwellian sense, their existence erased to preserve a politically expedient fiction of prosperity.
How many of these economic nonpersons are there in the United States today? That figure’s not easy to find amid the billowing statistical smokescreens. Still, it’s worth noting that 92,898,000 Americans of working age are not currently in the work force—that is, more than 37 per cent of the working age population. If you spend time around people who don’t belong to this nation’s privileged classes, you already know that a lot of those people would gladly take jobs if there were jobs to be had, but again, that’s not something that makes it through the murk.
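For those who want to check that percentage for themselves, here’s the back-of-the-envelope arithmetic in a few lines of Python; the working-age population figure is my own round assumption, not an official number.

```python
# Back-of-the-envelope check of the share of working-age Americans
# outside the work force. The 92,898,000 figure comes from the text;
# the 250 million working-age population is an assumed round number.
not_in_labor_force = 92_898_000
working_age_population = 250_000_000  # assumption, not an official figure

share = not_in_labor_force / working_age_population
print(f"Not in the labor force: {share:.1%}")  # roughly 37.2%
```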
We could spend quite a bit of time talking about the galaxy of ways in which economic statistics are finessed and/or fabricated these days, but the points already raised are enough for the present purpose. Let’s move on. The op-ed piece comes from erstwhile environmentalist Stewart Brand, whose long journey from editing CoEvolution Quarterly to channeling Bjorn Lomborg is as perfect a microcosm of the moral collapse of 20th century American environmentalism as you could hope to find. Brand’s latest piece claims that despite all evidence to the contrary—and of course there’s quite a bit of that these days—the environment is doing just fine: the economy has decoupled from resource use in recent decades, at least here in America, and so we can continue to wallow in high-tech consumer goodies without worrying about what we’re doing to the planet.
There’s a savage irony in the fact that in 1975, when his magazine was the go-to place to read about the latest ideas in systems theory and environmental science, Brand could have pointed out the gaping flaw in that argument in a Sausalito minute. Increasing prosperity in the United States has “decoupled” from resource use for two reasons: first, only a narrowing circle of privileged Americans get to see any of the paper prosperity we’re discussing—the standard of living for most people in this country has been contracting steadily for four decades—and second, the majority of consumer goods used in the United States are produced overseas, and so the resource use and environmental devastation involved in manufacturing the goodies we consume so freely takes place somewhere else.
That is to say, what Brand likes to call decoupling is our old friend, the mass production of ecological externalities. Brand can boast about prosperity without environmental cost because the great majority of the costs are being carried by somebody else, somewhere else, and so don’t find their way into his calculations.  The poor American neighborhoods where people struggle to get by without jobs are as absent from his vision of the world as they are from the official statistics; the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor that prop up his high-tech lifestyle are overseas, so they don’t show up on US statistics either. As far as Brand is concerned, that means they don’t count.
We could talk more about the process by which a man who first became famous for pressuring NASA into releasing a photo of the whole earth is now insisting that the only view that matters is the one from his living room window, but let’s go on. The other news item is the simplest and, in a bleak sort of way, the funniest of the lot.  According to recent reports, state government officials in Florida are being forbidden from using the phrase “climate change” when discussing the effects of, whisper it, climate change.
This is all the more mordantly funny because Florida is on the front lines of climate change right now.  Even the very modest increases in sea level we’ve seen so far, driven by thermal expansion and the first rounds of Greenland and Antarctic meltwater, are sending seawater rushing out of the storm sewers into the streets of low-lying parts of coastal Florida towns whenever the tide is high and an onshore wind blows hard enough. As climate change accelerates—and despite denialist handwaving, it does seem to be doing that just now—a lot of expensive waterfront property in Florida is going to end up underwater in more than a financial sense.  The state government’s response to this clear and present danger? Prevent state officials from talking about it.
We could look at a range of other examples of this same kind, but these three will do for now. What I want to discuss now is what’s going on here, and what it implies.
Let’s begin with the obvious. In all three of the cases I’ve cited, an uncomfortable reality is being dismissed by manipulating abstractions. An abstraction called “the unemployment rate” has been defined so that the politicians and bureaucrats who cite it don’t have to deal with just how many Americans these days can’t get paid employment; an abstraction called “decoupling” and a range of equally abstract (and cherrypicked) measures of environmental health are being deployed so that Brand and his readers don’t have to confront the soaring ecological costs of computer technology in particular and industrial society in general; an abstraction called “climate change,” finally, is being banned from use by state officials because it does too good a job of connecting certain dots that, for political reasons, Florida politicians don’t want people to connect.
To a very real extent, this sort of thing is pervasive in human interaction, and has been since the hoots and grunts of hominin vocalization first linked up with a few crude generalizations in the dazzled mind of an eccentric australopithecine. Human beings everywhere use abstract categories and the words that denote them as handles by which to grab hold of unruly bundles of experience. We do it far more often, and far more automatically, than most of us ever notice.  It’s only under special circumstances—waking up at night in an unfamiliar room, for example, and finding that the vague somethings around us take a noticeable amount of time to coalesce into ordinary furniture—that the mind’s role in assembling the fragmentary data of sensation into the objects of our experience comes to light.
When you look at a tree, for example, it’s common sense to think that the tree is sitting out there, and your eyes and mind are just passively receiving a picture of it—but then it’s common sense to think that the sun revolves around the earth. In fact, as philosophers and researchers into the psychophysics of sensation both showed a long time ago, what happens is that you get a flurry of fragmentary sense data—green, brown, line, shape, high contrast, low contrast—and your mind constructs a tree out of it, using its own tree-concept (as well as a flurry of related concepts such as “leaf,” “branch,” “bark,” and so on) as a template. You do that with everything you see, and the reason you don’t notice it is that it was the very first thing you learned how to do, as a newborn infant, and you’ve practiced it so often you don’t have to think about it any more.
You do the same thing with every representation of a sensory object. Let’s take visual art for an example.  Back in the 1870s and 1880s, when the Impressionists first started displaying their paintings, it took many people a real effort to learn how to look at them, and a great many never managed the trick at all. Among those who did, though, it was quite common to hear comments about how this or that painting had taught them to see a landscape, or what have you, in a completely different way. That wasn’t just hyperbole:  the Impressionists had learned how to look at things in a way that brought out features of their subjects that other people in late 19th century Europe and America had never gotten around to noticing, and highlighted those things in their paintings so forcefully that the viewer had to notice them.
The relation between words and the things they denote is thus much more complex, and much more subjective, than most people ever quite get around to realizing. That’s challenging enough when we’re talking about objects of immediate experience, where the concept in the observer’s mind has the job of fitting fragmentary sense data into a pattern that can be verified by other forms of sense data—in the example of the tree, by walking up to it and confirming by touch that the trunk is in fact where the sense of sight said it was. It gets far more difficult when the raw material that’s being assembled by the mind consists of concepts rather than sensory data: when, let’s say, you move away from your neighbor Joe, who can’t find a job and is about to lose his house, start thinking about all the people in town who are in a similar predicament, and end up dealing with abstract concepts such as unemployment, poverty, the distribution of wealth, and so on.
Difficult or not, we all do this, all the time. There’s a common notion that dealing in abstractions is the hallmark of the intellectual, but that puts things almost exactly backwards; it’s the ordinary unreflective person who thinks in abstractions most of the time, while the thinker’s task is to work back from the abstract category to the raw sensory data on which it’s based. That’s what the Impressionists did:  staring at a snowbank as Monet did, until he could see the rainbow play of colors behind the surface impression of featureless white, and then painting the colors into the representation of the snowbank so that the viewer was shaken out of the trance of abstraction (“snow” = “white”) and saw the colors too—first in the painting, and then when looking at actual snow.
Human thinking, and human culture, thus dance constantly between the concrete and the abstract, or to use a slightly different terminology, between immediate experience and a galaxy of forms that reflect experience back in mediated form. It’s a delicate balance: too far into the immediate and experience disintegrates into fragmentary sensation; too far from the immediate and experience vanishes into an echo chamber of abstractions mediating one another. The most successful and enduring creations of human culture have tended to be those that maintain the balance. Representational painting is one of those; another is literature. Read the following passage closely:
“Eastward the Barrow-downs rose, ridge behind ridge into the morning, and vanished out of eyesight into a guess: it was no more than a guess of blue and a remote white glimmer blending with the hem of the sky, but it spoke to them, out of memory and old tales, of the high and distant mountains.”
By the time you finished reading it, you likely had a very clear sense of what Frodo Baggins and his friends were seeing as they looked off to the east from the hilltop behind Tom Bombadil’s house. So did I, as I copied the sentence, and so do most people who read that passage—but no two people see the same image, because the image each of us sees is compounded out of bits of our own remembered experiences. For me, the image that comes to mind has always drawn heavily on the view eastwards from the suburban Seattle neighborhoods where I grew up, across the rumpled landscape to the stark white-topped rampart of the Cascade Mountains. I know for a fact that that wasn’t the view that Tolkien himself had in mind when he penned that sentence; I suspect he was thinking of the view across the West Midlands toward the Welsh mountains, which I’ve never seen; and I wonder what it must be like for someone to read that passage whose concept of ridges and mountains draws on childhood memories of the Urals, the Andes, or Australia’s Great Dividing Range instead.
That’s one of the ways that literature takes the reader through the mediation of words back around to immediate experience. If I ever do have the chance to stand on a hill in the West Midlands and look off toward the Welsh mountains, Tolkien’s words are going to be there with me, pointing me toward certain aspects of the view I might not otherwise have noticed, just as they did in my childhood. It’s the same trick the Impressionists managed with a different medium: stretching the possibilities of experience by representing (literally re-presenting) the immediate in a mediated form.
Now think about what happens when that same process is hijacked, using modern technology, for the purpose of behavioral control.
That’s what advertising does, and more generally what the mass media do. Think about the fast food company that markets its product under the slogan “I’m loving it,” complete with all those images of people sighing with post-orgasmic bliss as they ingest some artificially flavored and colored gobbet of processed pseudofood. Are they loving it? Of course not; they’re hack actors being paid to go through the motions of loving it, so that the imagery can be drummed into your brain and drown out your own recollection of the experience of not loving it. The goal of the operation is to keep you away from immediate experience, so that a deliberately distorted mediation can be put in its place.
You can do that with literature and painting, by the way. You can do it with any form of mediation, but it’s a great deal more effective with modern visual media, because those latter short-circuit the journey back to immediate experience. You see the person leaning back with the sigh of bliss after he takes a bite of pasty bland bun and tasteless gray mystery-meat patty, and you see it over and over and over again. If you’re like most Americans, and spend four or five hours a day staring blankly at little colored images on a glass screen, a very large fraction of your total experience of the world consists of this sort of thing: distorted imitations of immediate experience, intended to get you to think about the world in ways that immediate experience won’t justify.
The externalization of the human mind and imagination via the modern mass media has no shortage of problematic features, but the one I want to talk about here is the way that it feeds into the behavior discussed at the beginning of this post: the habit, pervasive in modern industrial societies just now, of responding to serious crises by manipulating abstractions to make them invisible. That kind of thing is commonplace in civilizations on their way out history’s exit door, for reasons I’ve discussed in an earlier sequence of posts here, but modern visual media make it an even greater problem in the present instance. These latter function as a prosthetic for the imagination, a device for replacing the normal image-making functions of the human mind with electromechanical equivalents. What’s more, you don’t control the prosthetic imagination; governments and corporations control it, and use it to shape your thoughts and behavior in ways that aren’t necessarily in your best interests.
The impact of the prosthetic imagination on the crisis of our time is almost impossible to overstate. I wonder, for example, how many of my readers have noticed just how pervasive references to science fiction movies and TV shows have become in discussions of the future of technology. My favorite example just now is the replicator, a convenient gimmick from the Star Trek universe: you walk up to it and order something, and the replicator pops it into being out of nothing.
It’s hard to think of a better metaphor for the way that people in the privileged classes of today’s industrial societies like to think of the consumer economy. It’s also hard to think of anything that’s further removed from the realities of the consumer economy. The replicator is the ultimate wet dream of externalization: it has no supply chains, no factories, no smokestacks, no toxic wastes, just whatever product you want any time you happen to want it. That’s exactly the kind of thinking that lies behind Stewart Brand’s fantasy of “decoupling”—and it’s probably no accident that more often than not, when I’ve had conversations with people who think that 3-D printers are the solution to everything, they bring Star Trek replicators into the discussion.
3-D printers are not replicators. Their supply chains and manufacturing costs include the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor mentioned earlier, and the social impacts of their widespread adoption would include another wave of mass technological unemployment—remember, it’s only in the highly mediated world of current economic propaganda that people who lose their jobs due to automation automatically get new jobs in some other field; in the immediate world, that’s become increasingly uncommon. As long as people look at 3-D printers through minds full of little pictures of Star Trek replicators, though, those externalized ecological and social costs are going to be invisible to them.
That, in turn, defines the problem with the externalization of the human mind and imagination: no matter how frantically you manipulate abstractions, the immediate world is still what it is, and it can still clobber you. Externalizing a cost doesn’t make it go away; it just guarantees that you won’t see it in time to do anything but suffer the head-on impact.

Peak Meaninglessness

Wed, 2015-03-04 19:00
Last week’s discussion of externalities—costs of doing business that get dumped onto the economy, the community, or the environment, so that those doing the dumping can make a bigger profit—is, I’m glad to say, not the first time this issue has been raised recently.  The long silence that closed around such things three decades ago is finally cracking; they’re being mentioned again, and not just by archdruids.  One of my readers—tip of the archdruidical hat to Jay McInerney—noted an article in Grist a while back that pointed out the awkward fact that none of the twenty biggest industries in today’s world could break even, much less make a profit, if they had to pay for the damage they do to the environment.
Now of course the conventional wisdom these days interprets that statement to mean that it’s unfair to make those industries pay for the costs they impose on the rest of us—after all, they have a God-given right to profit at everyone else’s expense, right?  That’s certainly the attitude of fracking firms in North Dakota, who recently proposed that  they ought to be exempted from the state’s rules on dumping radioactive waste, because following the rules would cost them too much money. That the costs externalized by the fracking industry will sooner or later be paid by others, as radionuclides in fracking waste work their way up the food chain and start producing cancer clusters, is of course not something anyone in the industry or the media is interested in discussing.
Watch this sort of thing, and you can see the chasm opening up under the foundations of industrial society. Externalized costs don’t just go away; one way or another, they’re going to be paid, and costs that don’t appear on a company’s balance sheet still affect the economy. That’s the argument of The Limits to Growth, still the most accurate (and thus inevitably the most reviled) of the studies that tried unavailingly to turn industrial society away from its suicidal path: on a finite planet, once an inflection point is passed, the costs of economic growth rise faster than growth does, and sooner or later force the global economy to its knees.
The tricks of accounting that let corporations pretend that their externalized costs vanish into thin air don’t change that bleak prognosis. Quite the contrary, the pretense that externalities don’t matter just makes it harder for a society in crisis to recognize the actual source of its troubles. I’ve come to think that that’s the unmentioned context behind a dispute currently roiling those unhallowed regions where economists lurk in the shrubbery: the debate over secular stagnation.
Secular stagnation? That’s the concept, unmentionable until recently, that the global economy could stumble into a rut of slow, no, or negative growth, and stay there for years. There are still plenty of economists who insist that this can’t happen, which is rather funny, really, when you consider that this has basically been the state of the global economy since 2009. (My back-of-the-envelope calculations suggest, in fact, that if you subtract the hallucinatory paper wealth manufactured by derivatives and similar forms of financial gamesmanship from the world’s GDP, the production of nonfinancial goods and services worldwide has actually been declining since before the 2008 housing crash.)
Even among those who admit that what’s happening can indeed happen, there’s no consensus as to how or why such a thing could occur.  On the off chance that any mainstream economists are lurking in the shrubbery in the even more unhallowed regions where archdruids utter unspeakable heresies, and green wizards clink mugs of homebrewed beer together and bay at the moon, I have a suggestion to offer: the most important cause of secular stagnation is the increasing impact of externalities on the economy. The dishonest macroeconomic bookkeeping that leads economists to think that externalized costs go away because they’re not entered into anyone’s ledger books doesn’t actually make them disappear; instead, they become an unrecognized burden on the economy as a whole, an unfelt headwind blowing with hurricane force in the face of economic growth.
Thus there’s a profound irony in the insistence by North Dakota fracking firms that they ought to be allowed to externalize even more of their costs in order to maintain their profit margin. If I’m right, the buildup of externalized costs is what’s causing the ongoing slowdown in economic activity worldwide that’s driving down commodity prices, forcing interest rates in many countries to zero or below, and resurrecting the specter of deflationary depression. The fracking firms in question thus want to respond to the collapse in oil prices—a result of secular stagnation—by doing even more of what’s causing secular stagnation. To say that this isn’t likely to end well is to understate the case considerably.
In the real world, of course, mainstream economists don’t listen to suggestions from archdruids, and fracking firms, like every other business concern these days, can be expected to put their short-term cash flow ahead of the survival of their industry, or for that matter of industrial civilization as a whole. Thus I propose to step aside from the subject of economic externalities for a moment—though I’ll be returning to it at intervals as we proceed with this sequence of posts—in order to discuss a subtler and less crassly financial form of the same phenomenon.
That form came in for discussion in the same post two weeks ago that brought the issue of externalities into this blog’s ongoing conversation. Quite a few readers commented about the many ways in which things labeled “more advanced,” “more progressive,” and the like were actually less satisfactory and less effective at meeting human needs than the allegedly more primitive technologies they replaced. Some of those comments focused, and quite sensibly, on the concrete examples, but others pondered the ways that today’s technology fails systematically at meeting certain human needs, and reflected on the underlying causes for that failure. One of my readers—tip of the archdruidical hat here to Ruben—gave an elegant frame for that discussion by suggesting that the peak of technological complexity in our time may also be described as peak meaninglessness.
I’d like to take the time to unpack that phrase. In the most general sense, technologies can be divided into two broad classes, which we can respectively call tools and prosthetics. The difference is a matter of function. A tool expands human potential, giving people the ability to do things they couldn’t otherwise do. A prosthetic, on the other hand, replaces human potential, doing something that under normal circumstances, people can do just as well for themselves.  Most discussions of technology these days focus on tools, but the vast majority of technologies that shape the lives of people in a modern industrial society are not tools but prosthetics.
Prosthetics have a definite value, to be sure. Consider an artificial limb, the sort of thing on which the concept of technology-as-prosthetic is modeled. If you’ve lost a leg in an accident, say, an artificial leg is well worth having; it replaces a part of ordinary human potential that you don’t happen to have any more, and enables you to do things that other people can do with their own leg. Imagine, though, that some clever marketer were to convince people to have their legs cut off so that they could be fitted for artificial legs. Imagine, furthermore, that the advertising for artificial legs became so pervasive, and so successful, that nearly everybody became convinced that human legs were hopelessly old-fashioned and ugly, and rushed out to get their legs amputated so they could walk around on artificial legs.
Then, of course, the manufacturers of artificial arms got into the same sort of marketing, followed by the makers of sex toys. Before long you’d have a society in which most people were gelded quadruple amputees fitted with artificial limbs and rubber genitals, who spent all their time talking about the wonderful things they could do with their prostheses. Only in the darkest hours of the night, when the TV was turned off, might some of them wonder why it was that a certain hard-to-define numbness had crept into all their interactions with other people and the rest of the world.
In a very real sense, that’s the way modern industrial society has reshaped and deformed human life for its more privileged inmates. Take any human activity, however humble or profound, and some clever marketer has found a way to insert a piece of technology in between the person and the activity. You can’t simply bake bread—a simple, homely, pleasant activity that people have done themselves for thousands of years using their hands and a few simple handmade tools; no, you have to have a bread machine, into which you dump a prepackaged mix and some liquid, push a button, and stand there being bored while it does the work for you, if you don’t farm out the task entirely to a bakery and get the half-stale industrially extruded product that passes for bread these days.
Now of course the bread machine manufacturers and the bakeries pitch their products to the clueless masses by insisting that nobody has time to bake their own bread any more. Ivan Illich pointed out in Energy and Equity a long time ago the logical fallacy here, which is that using a bread machine or buying from a bakery is only faster if you don’t count the time you have to spend earning the money needed to pay for it, power it, provide it with overpriced prepackaged mixes, repair it, clean it, etc., etc., etc. Illich’s discussion focused on automobiles; he pointed out that if you take the distance traveled by the average American auto in a year, and divide that by the total amount of time spent earning the money to pay for the auto, fuel, maintenance, insurance, etc., plus all the other time eaten up by tending to the auto in various ways, the average American car goes about 3.5 miles an hour: about the same pace, that is, that an ordinary human being can walk.
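Illich’s arithmetic is easy to reproduce. Here’s a minimal sketch; the inputs are hypothetical figures chosen to echo the 3.5 miles an hour quoted above, not Illich’s own data, so adjust them to taste.

```python
# Illich-style "effective speed" of a car: distance traveled divided
# by ALL the time the car consumes, including the hours spent earning
# the money that pays for it. Every input below is hypothetical.
miles_per_year = 7_000   # distance actually traveled
hours_driving = 350      # time behind the wheel
hours_earning = 1_500    # work hours that pay for purchase, fuel,
                         # insurance, repairs, parking, and taxes
hours_tending = 150      # washing, maintenance, waiting at the shop

total_hours = hours_driving + hours_earning + hours_tending
print(f"Effective speed: {miles_per_year / total_hours:.1f} mph")  # 3.5
```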
If this seems somehow reminiscent of last week’s discussion of externalities, dear reader, it should. The claim that technology saves time and labor only seems to make sense if you ignore a whole series of externalities—in this case, the time you have to put into earning the money to pay for the technology and into coping with whatever requirements, maintenance needs, and side effects the technology has. Have you ever noticed that the more “time-saving technologies” you bring into your life, the less free time you have? This is why—and it’s also why the average medieval peasant worked shorter hours, had more days off, and kept a larger fraction of the value of his labor than you do.
Something else is being externalized by prosthetic technology, though, and it’s that additional factor that gives Ruben’s phrase “peak meaninglessness” its punch. What are you doing, really, when you use a bread machine? You’re not baking bread; the machine is doing that. You’re dumping a prepackaged mix and some water into a machine, closing the lid, pushing a button, and going away to do something else. Fair enough—but what is this “something else” that you’re doing? In today’s industrial societies, odds are you’re going to go use another piece of prosthetic technology, which means that once again, you’re not actually doing anything. A machine is doing something for you. You can push that button and walk away, but again, what are you going to do with your time? Use another machine?
The machines that industrial society uses to give this infinite regress somewhere to stop—televisions, video games, and computers hooked up to the internet—simply take the same process to its ultimate extreme. Whatever you think you’re doing when you’re sitting in front of one of these things, what you’re actually doing is staring at little colored pictures on a glass screen and pushing some buttons. All things considered, this is a profoundly boring activity, which is why the little colored pictures jump around all the time; that’s to keep your nervous system so far off balance that you don’t notice just how tedious it is to spend hours at a time staring at little colored pictures on a screen.
I can’t help but laugh when people insist that the internet is an information-rich environment. It’s quite the opposite, actually: all you get from it is the very narrow trickle of verbal, visual, and auditory information that can squeeze through the digital bottleneck and turn into little colored pictures on a glass screen. The best way to experience this is to engage in a media fast—a period in which you deliberately cut yourself off from all electronic media for a week or more, preferably in a quiet natural environment. If you do that, you’ll find that it can take two or three days, or even more, before your numbed and dazzled nervous system recovers far enough that you can begin to tap in to the ocean of sensory information and sensual delight that surrounds you at every moment. It’s only then, furthermore, that you can start to think your own thoughts and dream your own dreams, instead of just rehashing whatever the little colored pictures tell you.
A movement of radical French philosophers back in the 1960s, the Situationists, argued that modern industrial society is basically a scheme to convince people to hand over their own human capabilities to the industrial machine, so that imitations of those capabilities can be sold back to them at premium prices. It was a useful analysis then, and it’s even more useful now, when the gap between realities and representations has become even more drastic than it was back then. These days, as often as not, what gets sold to people isn’t even an imitation of some human capability, but an abstract representation of it, an arbitrary marker with only the most symbolic connection to what it represents.
This is one of the reasons why I think it’s deeply mistaken to claim that Americans are materialistic. Americans are arguably the least materialistic people in the world; no actual materialist—no one who had the least appreciation for actual physical matter and its sensory and sensuous qualities—could stand the vile plastic tackiness of America’s built environment and consumer economy for a fraction of a second.  Americans don’t care in the least about matter; they’re happy to buy even the most ugly, uncomfortable, shoddily made and absurdly overpriced consumer products you care to imagine, so long as they’ve been convinced that having those products symbolizes some abstract quality they want, such as happiness, freedom, sexual pleasure, or what have you.
Then they wonder, in the darkest hours of the night, why all the things that are supposed to make them happy and satisfied somehow never manage to do anything of the kind. Of course there’s a reason for that, too, which is that happy and satisfied people don’t keep on frantically buying products in a quest for happiness and satisfaction. Still, the little colored pictures keep showing them images of people who are happy and satisfied because they guzzle the right brand of tasteless fizzy sugar water, and pay for the right brand of shoddily made half-disposable clothing, and keep watching the little colored pictures: that last above all else. “Tune in tomorrow” is the most important product that every media outlet sells, and they push it every minute of every day on every stop and key.
That is to say, between my fantasy of voluntary amputees eagerly handing over the cash for the latest models of prosthetic limbs, and the reality of life in a modern industrial society, the difference is simply in the less permanent nature of the alterations imposed on people here and now.  It’s easier to talk people into amputating their imaginations than it is to convince them to amputate their limbs, but it’s also a good deal easier to reverse the surgery.
What gives this even more importance than it would otherwise have, in turn, is that all this is happening in a society that’s hopelessly out of touch with the realities that support its existence, and that relies on bookkeeping tricks of the sort discussed toward the beginning of this essay to maintain the fantasy that it’s headed somewhere other than history’s well-used compost bin. The externalization of the mind and the imagination plays just as important a role in maintaining that fantasy as the externalization of costs—and the cold mechanical heart of the externalization of the mind and imagination is mediation, the insertion of technological prosthetics into the space between the individual and the world. We’ll talk more about that in next week’s post.
****************

In other news, I’m delighted to report the publication of a new book of mine that may be of particular interest to readers of this blog: Collapse Now and Avoid the Rush: The Best of the Archdruid Report, which is just out from Founders House Publishing. As the title suggests, it’s an anthology of twenty-five of the most popular weekly posts from this blog, including such favorites as "Knowing Only One Story," "An Elegy for the Age of Space," "The Next Ten Billion Years," and "The Time of the Seedbearers," as well as the title essay and many more. These are the one-of-a-kind essays that haven’t appeared in my books; if you’re looking for something to hand to the spouse or friend or twelve-year-old kid who wants to know why you keep visiting this site every Wednesday night, or simply want this blog’s best essays in a more permanent form, this is the book. It’s available in print and e-book formats and can be ordered here.

The Externality Trap, or, How Progress Commits Suicide

Wed, 2015-02-25 19:14
I've commented more than once in these essays about the cooperative dimension of writing:  the way that even the most solitary of writers inevitably takes part in what Mortimer Adler used to call the Great Conversation, the flow of ideas and insights across the centuries that’s responsible for most of what we call culture. Sometimes that conversation takes place second- or third-hand—for example, when ideas from two old books collide in an author’s mind and give rise to a third book, which will eventually carry the fusion to someone else further down the stream of time—but sometimes it’s far more direct.
Last week’s post here brought an example of the latter kind. My attempt to cut through the ambiguities surrounding that slippery word “progress” sparked a lively discussion on the comments page of my blog about just exactly what counted as progress, what factors made one change “progressive” while another was denied that label. In the midst of it all, one of my readers—tip of the archdruidical hat to Jonathan—proposed an unexpected definition:  what makes a change qualify as progress, he suggested, is that it increases the externalization of costs. 
I’ve been thinking about that definition since Jonathan proposed it, and it seems to me that it points up a crucial and mostly unrecognized dimension of the crisis of our time. To make sense of it, though, it’s going to be necessary to delve briefly into economic jargon.
Economists use the term “externalities” to refer to the costs of an economic activity that aren’t paid by either party in an exchange, but are pushed off onto somebody else. You won’t hear a lot of talk about externalities these days; in many circles, it’s considered impolite to mention them, but they’re a pervasive presence in contemporary life, and play a very large role in some of the most intractable problems of our age. Some of those problems were discussed by Garrett Hardin in his famous essay on the tragedy of the commons, and more recently by Elinor Ostrom in her studies of how that tragedy can be avoided; still, I’m not sure how often it’s recognized that the phenomena they discussed apply not just to commons systems, but to societies as a whole—especially to societies like ours.
An example may be useful here. Let’s imagine a blivet factory, which turns out three-prong, two-slot blivets in pallet loads for customers. The blivet-making process, like manufacturing of every other kind, produces waste as well as blivets, and we’ll assume for the sake of the example that blivet waste is moderately toxic and causes health problems in people who ingest it. The blivet factory produces one barrel of blivet waste for every pallet load of blivets it ships. The cheapest option for dealing with the waste, and thus the option that economists favor, is to dump it into the river that flows past the factory.
Notice what happens as a result of this choice. The blivet manufacturer has maximized his own benefit from the manufacturing process, by avoiding the expense of finding some other way to deal with all those barrels of blivet waste. His customers also benefit, because blivets cost less than they would if the cost of waste disposal was factored into the price. On the other hand, the costs of dealing with the blivet waste don’t vanish like so much twinkle dust; they are imposed on the people downstream who get their drinking water from the river, or from aquifers that receive water from the river, and who suffer from health problems because there’s blivet waste in their water. The blivet manufacturer is externalizing the cost of waste disposal; his increased profits are being paid for at a remove by the increased health care costs of everyone downstream.
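A toy balance sheet makes the arithmetic explicit. Every figure below is invented for illustration; the point is the pattern, not the numbers.

```python
# Private profit vs. social cost for the blivet factory. One barrel
# of waste per pallet of blivets; all dollar figures are invented.
pallets = 1_000
revenue_per_pallet = 500
production_cost_per_pallet = 400
disposal_cost_per_barrel = 150    # cost of treating the waste properly
downstream_cost_per_barrel = 300  # health costs if it's dumped instead

profit_if_dumped = pallets * (revenue_per_pallet - production_cost_per_pallet)
profit_if_treated = profit_if_dumped - pallets * disposal_cost_per_barrel
cost_pushed_downstream = pallets * downstream_cost_per_barrel

print(f"Profit, waste dumped in the river: {profit_if_dumped}")        # 100000
print(f"Profit, waste treated properly:    {profit_if_treated}")       # -50000
print(f"Costs borne by people downstream:  {cost_pushed_downstream}")  # 300000
```

On these invented numbers, dumping turns a money-losing operation into a profitable one precisely by handing everyone downstream three times the factory’s profit in costs.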
That’s how externalities work. Back in the days when people actually talked about the downsides of economic growth, there was a lot of discussion of how to handle externalities, and not just on the leftward end of the spectrum.  I recall a thoughtful book titled TANSTAAFL—that’s an acronym, for those who don’t know their Heinlein, for “There Ain’t No Such Thing As A Free Lunch”—which argued, on solid libertarian-conservative grounds, that the environment could best be preserved by making sure that everyone paid full sticker price for the externalities they generated. Today’s crop of pseudoconservatives, of course, turned their back on all this a long time ago, and insist at the top of their lungs on their allegedly God-given right to externalize as many costs as they possibly can.  This is all the more ironic in that most pseudoconservatives claim to worship a God who said some very specific things about “what ye do to the least of these,” but that’s a subject for a different post.
Economic life in the industrial world these days can be described, without too much inaccuracy, as an arrangement set up to allow a privileged minority to externalize nearly all their costs onto the rest of society while pocketing as much of the benefits as possible. That’s come in for a certain amount of discussion in recent years, but I’m not sure how many of the people who’ve participated in those discussions have given any thought to the role that technological progress plays in facilitating the internalization of benefits and the externalization of costs that drive today’s increasingly inegalitarian societies. Here again, an example will be helpful.
Before the invention of blivet-making machinery, let’s say, blivets were made by old-fashioned blivet makers, who hammered them out on iron blivet anvils in shops that were to be found in every town and village. Like other handicrafts, blivet-making was a living rather than a ticket to wealth; blivet makers invested their own time and muscular effort in their craft, and turned out enough in the way of blivets to meet the demand. Notice also the effect on the production of blivet waste. Since blivets were being made one at a time rather than in pallet loads, the total amount of waste was smaller; the conditions of handicraft production also meant that blivet makers and their families were more likely to be exposed to the blivet waste than anyone else, and so had an incentive to invest the extra effort and expense to dispose of it properly. Since blivet makers were ordinary craftspeople rather than millionaires, furthermore, they weren’t as likely to be able to buy exemption from local health laws.
The invention of the mechanical blivet press changed that picture completely. Since one blivet press could do as much work as fifty blivet makers, the income that would have gone to those fifty blivet makers and their families went instead to one factory owner and his stockholders, with as small a share as possible set aside for the wage laborers who operated the blivet press. The factory owner and stockholders had no incentive to pay for the proper disposal of the blivet waste, either—quite the contrary, since having to meet the disposal costs cut into their profits, buying off local governments was much cheaper, and if the harmful effects of blivet waste were known, you can bet that the owner and shareholders all lived well upstream from the factory.
Notice also that a blivet manufacturer who paid a living wage to his workers and covered the costs of proper waste disposal would have to charge a higher price for blivets than one who did neither, and thus would be driven out of business by his more ruthless competitor. Externalities aren’t simply made possible by technological progress, in other words; they’re the inevitable result of technological progress in a market economy, because externalizing the costs of production is in most cases the most effective way to outcompete rival firms, and the firm that succeeds in externalizing the largest share of its costs is the most likely to prosper and survive.
Each further step in the progress of blivet manufacturing, in turn, tightened the same screw another turn. Today, to finish up the metaphor, the entire global supply of blivets is made in a dozen factories in distant Slobbovia, where sweatshop labor under ghastly working conditions and the utter absence of environmental regulations make the business of blivet fabrication more profitable than anywhere else. The blivets are as shoddily made as possible; the entire blivet supply chain, from the open-pit mines worked by slave labor that provide the raw materials, to the big box stores with part-time, poorly paid staff selling blivetronic technology to the masses, is a human and environmental disaster. Every possible cost has been externalized, so that the two multinational corporations that dominate the global blivet industry can maintain their profit margins and pay absurdly high salaries to their CEOs.
That in itself is bad enough, but let’s broaden the focus to include the whole systems in which blivet fabrication takes place: the economy as a whole, society as a whole, and the biosphere as a whole. The impact of technology on blivet fabrication in a market economy has predictable and well-understood consequences for each of these whole systems, which can be summed up precisely in the language we’ve already used. In order to maximize its own profitability and return on shareholder investment, the blivet industry externalizes costs in every available direction. Since nobody else wants to bear those costs, either, most of them end up being passed on to the whole systems just named, because the economy, society, and the biosphere have no voice in today’s economic decisions.
Like the costs of dealing with blivet waste, though, the other externalized costs of blivet manufacture don’t go away just because they’re externalized. As externalities increase, they tend to degrade the whole systems onto which they’re dumped—the economy, society, and the biosphere. This is where the trap closes tight, because blivet manufacturing exists within those whole systems, and can’t be carried out unless all three systems are sufficiently intact to function in their usual way. As those systems degrade, their ability to function degrades also, and eventually one or more of them breaks down—the economy plunges into a depression, the society disintegrates into anarchy or totalitarianism, the biosphere shifts abruptly into a new mode that lacks adequate rainfall for crops—and the manufacture of blivets stops because the whole system that once supported it has stopped doing so.
Notice how this works out from the perspective of someone who’s benefiting from the externalization of costs by the blivet industry—the executives and stockholders in a blivet corporation, let’s say. As far as they’re concerned, until very late in the process, everything is fine and dandy: each new round of technological improvements in blivet fabrication increases their profits, and if each such step in the onward march of progress also means that working class jobs are eliminated or offshored, democratic institutions implode, toxic waste builds up in the food chain, or what have you, hey, that’s not their problem—and after all, that’s just the normal creative destruction of capitalism, right?
That sort of insouciance is easy for at least three reasons. First, the impacts of externalities on whole systems can pop up a very long way from the blivet factories.  Second, in a market economy, everyone else is externalizing their costs as enthusiastically as the blivet industry, and so it’s easy for blivet manufacturers (and everyone else) to insist that whatever’s going wrong is not their fault.  Third, and most crucially, whole systems as stable and enduring as economies, societies, and biospheres can absorb a lot of damage before they tip over into instability. The process of externalization of costs can thus run for a very long time, and become entrenched as a basic economic habit, long before it becomes clear to anyone that continuing along the same route is a recipe for disaster.
Even when externalized costs have begun to take a visible toll on the economy, society, and the biosphere, furthermore, any attempt to reverse course faces nearly insurmountable obstacles. Those who profit from the existing order of things can be counted on to fight tooth and nail for the right to keep externalizing their costs: after all, they have to pay the full price for any reduction in their ability to externalize costs, while the benefits created by not imposing those costs on whole systems are shared among all participants in the economy, society, and the biosphere respectively. Nor is it necessarily easy to trace back the causes of any given whole-system disruption to specific externalities benefiting specific people or industries. It’s rather like loading hanging weights onto a chain; sooner or later, as the amount of weight hung on the chain goes up, the chain is going to break, but the link that breaks may be far from the last weight that pushed things over the edge, and every other weight on the chain made its own contribution to the end result.
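For readers who would like to watch that chain metaphor play out, here is a minimal sketch in Python. Every number in it (the link strengths, the sizes of the weights, the choice of random seed) is invented purely for illustration. Run it a few times and the chain almost always parts near the top, far from wherever the final weight happened to be hung, which is exactly the point: the proximate cause and the breaking point need not coincide.

```python
import random

random.seed(12)  # illustrative; any seed tells the same story

N = 100                                      # links, with 0 at the top
strength = [random.uniform(90, 110) for _ in range(N)]  # invented strengths
weight_at = [0.0] * N                        # weight hung at each position

def broken_link():
    """Link i carries every weight hung at or below position i."""
    load = 0.0
    for i in range(N - 1, -1, -1):           # accumulate load from the bottom up
        load += weight_at[i]
        if load > strength[i]:
            return i
    return None

weights, last_pos = 0, None
while broken_link() is None:
    last_pos = random.randrange(N)           # hang the next weight somewhere
    weight_at[last_pos] += random.uniform(1.0, 5.0)
    weights += 1

print(f"link {broken_link()} parted after {weights} weights; "
      f"the last weight was hung at position {last_pos}")
```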
A society that’s approaching collapse because too many externalized costs have been loaded onto the whole systems that support it thus shows certain highly distinctive symptoms. Things are going wrong with the economy, society, and the biosphere, but nobody seems to be able to figure out why; the measurements economists use to determine prosperity show contradictory results, with those that measure the profitability of individual corporations and industries giving much better readings than those that measure the performance of whole systems; the rich are convinced that everything is fine, while outside the narrowing circles of wealth and privilege, people talk in low voices about the rising spiral of problems that beset them from every side. If this doesn’t sound familiar to you, dear reader, you probably need to get out more.
At this point it may be helpful to sum up the argument I’ve developed here; a brief sketch in code after the list puts the same logic in motion:
a) Every increase in technological complexity tends also to increase the opportunities for externalizing the costs of economic activity;
b) Market forces make the externalization of costs mandatory rather than optional, since economic actors that fail to externalize costs will tend to be outcompeted by those that do;
c) In a market economy, as all economic actors attempt to externalize as many costs as possible, externalized costs will tend to be passed on preferentially and progressively to whole systems such as the economy, society, and the biosphere, which provide necessary support for economic activity but have no voice in economic decisions;
d) Given unlimited increases in technological complexity, there is no necessary limit to the loading of externalized costs onto whole systems short of systemic collapse;
e) Unlimited increases in technological complexity in a market economy thus necessarily lead to the progressive degradation of the whole systems that support economic activity;
f) Technological progress in a market economy is therefore self-terminating, and ends in collapse.
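Since the argument is essentially mechanical, it can be caricatured in a few lines of code. The toy model below is a sketch in Python, with every parameter invented for the purpose; it pits an honest firm that pays its disposal costs against a rival that dumps them on a shared commons. Points (b) through (f) then fall out on their own: the externalizer undersells and eliminates its rival, then keeps loading costs onto the supporting system until that system fails and production stops with it.

```python
# A toy rendering of points (b) through (f); every parameter is invented.

UNIT_COST = 10.0           # real cost of making one pallet of blivets
DISPOSAL_COST = 3.0        # cost of dealing properly with one barrel of waste
COMMONS_CAPACITY = 500.0   # damage the supporting whole system can absorb

class Firm:
    def __init__(self, name, externalizes):
        self.name, self.externalizes, self.alive = name, externalizes, True

    def price(self):
        # The externalizer skips disposal, so its sticker price is lower.
        return UNIT_COST if self.externalizes else UNIT_COST + DISPOSAL_COST

honest = Firm("honest", externalizes=False)
ruthless = Firm("ruthless", externalizes=True)
commons_damage = 0.0

for year in range(1, 10_000):
    live = [f for f in (honest, ruthless) if f.alive]
    cheapest = min(f.price() for f in live)
    for f in live:
        if f.price() > cheapest:       # customers buy on sticker price alone
            f.alive = False
            print(f"year {year}: the {f.name} firm is outcompeted and closes")
    # Each surviving externalizer dumps one barrel of waste on the commons.
    commons_damage += sum(1.0 for f in live if f.alive and f.externalizes)
    if commons_damage > COMMONS_CAPACITY:
        print(f"year {year}: the commons fails; blivet production ends with it")
        break
```

Crude as the model is, it makes the shape of the trap visible: nothing in it rewards restraint, and the only stopping point it contains is the failure of the commons itself.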
Now of course there are plenty of arguments that could be deployed against this modest proposal. For example, it could be argued that progress doesn’t have to generate a rising tide of externalities. The difficulty with this argument is that externalization of costs isn’t an accidental side effect of technology but an essential aspect—it’s not a bug, it’s a feature. Every technology is a means of externalizing some cost that would otherwise be borne by a human body. Even something as simple as a hammer takes the wear and tear that would otherwise affect the heel of your hand, let’s say, and transfers it to something else: directly, to the hammer; indirectly, to the biosphere, by way of the trees that had to be cut down to make the charcoal to smelt the iron, the plants that were shoveled aside to get the ore, and so on.
For reasons that are ultimately thermodynamic in nature, the more complex a technology becomes, the more costs it generates. To outcompete a simpler technology, therefore, each more complex technology has to externalize a significant proportion of its additional costs. In the case of such contemporary hypercomplex technosystems as the internet, the process of externalizing costs has gone so far, through so many tangled interrelationships, that it’s remarkably difficult to figure out exactly who’s paying for how much of the gargantuan inputs needed to keep the thing running. This lack of transparency feeds the illusion that large systems are cheaper than small ones, by making externalities of scale look like economies of scale.
It might be argued instead that a sufficiently stringent regulatory environment, forcing economic actors to absorb all the costs of their activities instead of externalizing them onto others, would be able to stop the degradation of whole systems while still allowing technological progress to continue. The difficulty here is that increased externalization of costs is what makes progress profitable. As just noted, all other things being equal, a complex technology will on average be more expensive in real terms than a simpler technology, for the simple reason that each additional increment of complexity has to be paid for by an investment of energy and other forms of real capital.
Strip complex technologies of the subsidies that transfer some of their costs to the government, the perverse regulations that transfer some of their costs to the rest of the economy, the bad habits of environmental abuse and neglect that transfer some of their costs to the biosphere, and so on, and pretty soon you’re looking at hard economic limits to technological complexity, as people forced to pay the full sticker price for complex technologies maximize their benefits by choosing simpler, more affordable options instead. A regulatory environment sufficiently strict to keep technology from accelerating to collapse would thus bring technological progress to a halt by making it unprofitable.
Notice, however, the flipside of the same argument: a society that chose to stop progressing technologically could maintain itself indefinitely, so long as its technologies weren’t dependent on nonrenewable resources or the like. The costs imposed by a stable technology on the economy, society, and the biosphere would be more or less stable, rather than increasing over time, and it would therefore be much easier to figure out how to balance out the negative effects of those externalities and maintain the whole system in a steady state.  Societies that treated technological progress as an option rather than a requirement, and recognized the downsides to increasing complexity, could also choose to reduce complexity in one area in order to increase it in another, and so on—or they could just raise a monument to the age of progress, and go do something else instead.
The logic suggested here requires a comprehensive rethinking of most of the contemporary world’s notions about technology, progress, and the good society. We’ll begin that discussion in future posts—after, that is, we discuss a second dimension of progress that came out of last week’s discussion.

What Progress Means

Wed, 2015-02-18 18:06
Last week’s post here on The Archdruid Report appears to have hit a nerve. That didn’t come as any sort of a surprise, admittedly. It’s one thing to point out that going back to the simpler and less energy-intensive technologies of earlier eras could help extract us from the corner into which industrial society has been busily painting itself in recent decades; it’s quite another to point out that doing this can also be great fun, more so than anything that comes out of today’s fashionable technologies, and in a good many cases the results include an objectively better quality of life as well.
That’s not one of the canned speeches that opponents of progress are supposed to make. According to the folk mythology of modern industrial culture, since progress always makes things better, the foes of whatever gets labeled as progress are supposed to put on hair shirts and insist that everyone has to suffer virtuously from a lack of progress, for some reason based on sentimental superstition. The Pygmalion effect being what it is, it’s not hard to find opponents of progress who say what they’re expected to say, and thus fulfill their assigned role in contemporary culture, which is to stand there in their hair shirts bravely protesting until the steamroller of progress rolls right over them.
The grip of that particular bit of folk mythology on the collective imagination of our time is tight enough that when somebody brings up some other reason to oppose “progress”—we’ll get into the ambiguities behind that familiar label in a moment—a great many people quite literally can’t absorb what’s actually being said, and respond instead to the canned speeches they expect to hear. Thus I had several people attempt to dispute the comments on last week’s post, castigating my readers with varying degrees of wrath and profanity for thinking that they had to sacrifice the delights of today’s technology and go creeping mournfully back to the unsatisfying lifestyles of an earlier day.
That was all the more ironic in that none of the readers who were commenting on the post were saying anything of the kind. Most of them were enthusiastically talking about how much more durable, practical, repairable, enjoyable, affordable, and user-friendly older technologies are compared to the disposable plastic trash that fills the stores these days. They were discussing how much more fun it is to embrace the delights of outdated technologies than it would be to go creeping mournfully back—or forward, if you prefer—to the unsatisfying lifestyles of the present time. That heresy is far more than the alleged open-mindedness and intellectual diversity of our age are willing to tolerate, so it’s not surprising that some people tried to pretend that nothing of the sort had been said at all. What was surprising to me, and pleasantly so, was the number of readers who were ready to don the party clothes of some earlier time and join in the Butlerian carnival.
There are subtleties to the project of deliberate technological regress that may not be obvious at first glance, though, and it seems sensible to discuss those here before we proceed.  It’s important, to begin with, to remember that when talking heads these days babble about technology in the singular, as a uniform, monolithic thing that progresses according to some relentless internal logic of its own, they’re spouting balderdash.  In the real world, there’s no such monolith; instead, there are technologies in the plural, a great many of them, clustered more or less loosely in technological suites which may or may not have any direct relation to one another.
An example might be useful here. Consider the technologies necessary to build a steel-framed bicycle. The metal parts require the particular suite of technologies we use to smelt ores, combine the resulting metals into useful alloys, and machine and weld those into shapes that fit together to make a bicycle. The tires, inner tubes, brake pads, seat cushion, handlebar grips, and paint require a different suite of technologies drawing on various branches of applied organic chemistry, and a few other suites also have a place: for example, the one that’s needed to make and apply lubricants. The suites that make a bicycle have other uses; if you can build a bicycle, as Orville and Wilbur Wright demonstrated, you can also build an aircraft, and a variety of other interesting machines as well; that said, there are other technologies—say, the ones needed to manufacture medicines, or precision optics, or electronics—that require very different technological suites. You can have everything you need to build a bicycle and still be unable to make a telescope or a radio receiver, and vice versa.
Strictly speaking, therefore, nothing requires the project of deliberate technological regress to move in lockstep to the technologies of a specific past date and stay there. It would be wholly possible to dump certain items of modern technology while keeping others. It would be just as possible to replace one modern technological suite with an older equivalent from one decade, another with an equivalent from a different decade, and so on. Imagine, for example, a future America in which solar water heaters (worked out by 1920) and passive solar architecture (mostly developed in the 1960s and 1970s) were standard household features, canal boats (dating from before 1800) and tall ships (ditto) were the primary means of bulk transport, shortwave radio (developed in the early 20th century) was the standard long-range communications medium, ultralight aircraft (largely developed in the 1980s) were still in use, and engineers crunched numbers using slide rules (perfected around 1880).
There’s no reason why such a pastiche of technologies from different eras couldn’t work. We know this because what passes for modern technology is a pastiche of the same kind, in which (for example) cars whose basic design dates from the 1890s are gussied up with onboard computers invented a century later. Much of modern technology, in fact, is old technology with a new coat of paint and a few electronic gimmicks tacked on, and it’s old technology that originated in many different eras, too. Part of what differentiates modern technology from older equivalents, in other words, is mere fashion. Another part, though, moves into more explosive territory.
In the conversation that followed last week’s post, one of my readers—tip of the archdruid’s hat to Cathy—recounted the story of the one and only class on advertising she took at college. The teacher invited a well-known advertising executive to come in and talk about the business, and one of the points he brought up was the marketing of disposable razors. The old-fashioned steel safety razor, the guy admitted cheerfully, was a much better product: it was more durable, less expensive, and gave a better shave than disposable razors. Unfortunately, it didn’t make the kind of profits for the razor industry that the latter wanted, and so the job of the advertising company was to convince shavers that they really wanted to spend more money on a worse product instead.
I know it may startle some people to hear a luxuriantly bearded archdruid talk about shaving, but I do have a certain amount of experience with the process—though admittedly it’s been a while. The executive was quite correct: an old-fashioned safety razor gives better shaves than a disposable. What’s more, an old-fashioned safety razor combined with a shaving brush, a cake of shaving soap, a mug and a bit of hot water from the teakettle produces a shaving experience that’s vastly better, in every sense, than what you’ll get from squirting cold chemical-laced foam out of a disposable can and then scraping your face with a disposable razor; the older method, furthermore, takes no more time, costs much less on a per-shave basis, and has a drastically smaller ecological footprint to boot.
Notice also the difference in the scale and complexity of the technological suites needed to maintain these two ways of shaving. To shave with a safety razor and shaving soap, you need the metallurgical suite that produces razors and razor blades, the very simple household-chemistry suite that produces soap, the ability to make pottery and brushes, and some way to heat water. To shave with a disposable razor and a can of squirt-on shaving foam, you need fossil fuels for plastic feedstocks, chemical plants to manufacture the plastic and the foam, the whole range of technologies needed to manufacture and fill the pressurized can, and so on—all so that you can count on getting an inferior shave at a higher price, and the razor industry can boost its quarterly profits.
That’s a small and arguably silly example of a vast and far from silly issue. These days, when you see the words “new and improved” on a product, rather more often than not, the only thing that’s been improved is the bottom line of the company that’s trying to sell it to you. When you hear equivalent claims about some technology that’s being marketed to society as a whole, rather than sold to you personally, the same rule applies at least as often. That’s one of the things that drove the enthusiastic conversations on this blog’s comment page last week, as readers came out of hiding to confess that they, too, had stopped using this or that piece of cutting-edge, up-to-date, hypermodern trash, and replaced it with some sturdy, elegant, user-friendly device from an earlier decade which works better and lacks the downsides of the newer item.
What, after all, defines a change as “progress”? There’s a wilderness of ambiguities hidden in that apparently simple word. The popular notion of progress presupposes that there’s an inherent dynamic to history, that things change, or tend to change, or at the very least ought to change, from worse to better over time.  That presupposition then gets flipped around into the even more dubious claim that just because something’s new, it must be better than whatever it replaced. Move from there to specific examples, and all of a sudden it’s necessary to deal with competing claims—if there are two hot new technologies on the market, is option A more progressive than option B, or vice versa? The answer, of course, is that whichever of them manages to elbow the other aside will be retroactively awarded the coveted title of the next step in the march of progress.
That was exactly the process by which the appropriate tech of the 1970s was shoved aside and buried in the memory hole of our culture. In its heyday, appropriate tech was as cutting-edge and progressive as anything you care to name, a rapidly advancing field pushed forward by brilliant young engineers and innovative startups, and it saw itself (and presented itself to the world) as the wave of the future. In the wake of the Reagan-Thatcher counterrevolution of the 1980s, though, it was retroactively stripped of its erstwhile status as an icon of progress and consigned to the dustbin of the past. Technologies that had been lauded in the media as brilliantly innovative in 1978 were thus being condemned in the same media as Luddite throwbacks by 1988. If that abrupt act of redefinition reminds any of my readers of the way history got rewritten in George Orwell’s 1984—“Oceania has never been allied with Eurasia” and the like—well, let’s just say the parallel was noticed at the time, too.
The same process on a much smaller scale can be traced with equal clarity in the replacement of the safety razor and shaving soap with the disposable razor and squirt-can shaving foam. In what sense is the latter, which wastes more resources and generates more trash in the process of giving users a worse shave at a higher price, more progressive than the former? Merely the fact that it’s been awarded that title by advertising and the media. If razor companies could make more money by reintroducing the Roman habit of scraping beard hairs off the face with a chunk of pumice, no doubt that would quickly be proclaimed as the last word in cutting-edge, up-to-date hypermodernity, too.
Behind the mythological image of the relentless and inevitable forward march of technology-in-the-singular in the grand cause of progress, in other words, lies a murky underworld of crass commercial motives and no-holds-barred struggles over which of the available technologies will get the funding and marketing that will define it as the next great step in progress. That’s as true of major technological programs as it is of shaving supplies. Some of my readers are old enough, as I am, to remember when supersonic airliners and undersea habitats were the next great steps in progress, until all of a sudden they weren’t.  We may not be all that far from the point at which space travel and nuclear power will go the way of Sealab and the Concorde.
In today’s industrial societies, we don’t talk about that. It’s practically taboo these days to mention the long, long list of waves of the future that abruptly stalled and rolled back out to sea without delivering on their promoters’ overblown promises. Remind people that the same rhetoric currently being used to prop up faith in space travel, nuclear power, or any of today’s other venerated icons of the religion of progress was lavished just as thickly on these earlier failures, and you can pretty much expect to have that comment shouted down as an irrelevancy if the other people in the conversation don’t simply turn their backs and pretend that they never heard you say anything at all.
They have to do something of the sort, because the alternative is to admit that what we call “progress” isn’t the impersonal, unstoppable force of nature that industrial culture’s ideology insists it must be. Pay attention to the grand technological projects that failed, compare them with those that are failing now, and it’s impossible to keep ignoring certain crucial if hugely unpopular points. To begin with, technological progress is a function of collective choices—do we fund Sealab or the Apollo program? Supersonic transports or urban light rail? Energy conservation and appropriate tech or an endless series of wars in the Middle East? No impersonal force makes those decisions; individuals and institutions make them, and then use the rhetoric of impersonal progress to cloak the political and financial agendas that guide the decision-making process.
What’s more, even if the industrial world chooses to invest its resources in a project, the laws of physics and economics determine whether the project is going to work. The Concorde is the poster child here, a technological success but an economic flop that never even managed to cover its operating costs. Like nuclear power, it was only viable given huge and continuing government subsidies, and since the strategic benefits Britain and France got from having Concordes in the air were nothing like so great as those they got from having an independent source of raw material for nuclear weapons, it’s not hard to see why the subsidies went where they did.
That is to say, when something is being lauded as the next great step forward in the glorious march of progress leading humanity to a better world, those who haven’t drunk themselves tipsy on folk mythology need to keep four things in mind. The first is that the next great step forward in the glorious march of progress (etc.) might not actually work when it’s brought down out of the billowing clouds of overheated rhetoric into the cold hard world of everyday life. The second is that even if it works, the next great step forward (etc.) may be a white elephant in economic terms, and survive only so long as it gets propped up by subsidies. The third is that even if it does make economic sense, the next great step (etc.) may be an inferior product, and do a less effective job of meeting human needs than whatever it’s supposed to replace. The fourth is that when it comes right down to it, to label something as the next great (etc.) is just a sales pitch, an overblown and increasingly trite way of saying “Buy this product!”
Those necessary critiques, in turn, are all implicit in the project of deliberate technological regress. Get past the thoughtstopping rhetoric that insists “you can’t turn back the clock”—to rephrase a comment of G.K. Chesterton’s, most people turn back the clock every fall, so that’s hardly a valid objection—and it becomes hard not to notice that “progress” is just a label for whatever choices happen to have been made by governments and corporations, with or without input from the rest of us. If we don’t like the choices that have been made for us in the name of progress, in turn, we can choose something else.
Now of course it’s possible to stuff that sort of thinking back into the straitjacket of progress, and claim that progress is chugging along just fine, and all we have to do is get it back on the proper track, or what have you. This is a very common sort of argument, and one that’s been used over and over again by critics of this or that candidate for the next (etc.). The problem with that argument, as I see it, is that it may occasionally win battles but it pretty consistently loses the war; by failing to challenge the folk mythology of progress and the agendas that are enshrined by that mythology, it guarantees that no matter what technology or policy or program gets put into place, it’ll end up leading to the same place as all the others before it, because it questions the means but forgets to question the goals.
That’s the trap hardwired into the contemporary faith in progress. Once you buy into the notion that the specific choices made by industrial societies over the last three centuries or so are something more than the projects that happened to win out in the struggle for wealth and power, once you let yourself believe that there’s a teleology to it all—that there’s some objectively definable goal called “progress” that all these choices did a better or worse job of furthering—you’ve just made it much harder to ask where this thing called “progress” is going. The word “progress,” remember, means going further in the same direction, and it’s precisely questions about the direction that industrial society is going that most need to be asked.
I’d like to suggest, in fact, that going further in the direction we’ve been going isn’t a particularly bright idea just now.  It isn’t even necessary to point to the more obviously self-destructive dimensions of business as usual. Look at any trend that affects your life right now, however global or local that trend may be, and extrapolate it out in a straight line indefinitely; that’s what going further in the same direction means. If that appeals to you, dear reader, then you’re certainly welcome to it.  I have to say it doesn’t do much for me.
It’s only from within the folk mythology of progress that we have no choice but to accept the endless prolongation of current trends. Right now, as individuals, we can choose to shrug and walk away from the latest hypermodern trash, and do something else instead. Later on, on the far side of the crisis of our time, it may be possible to take the same logic further, and make deliberate technological regress a recognized policy option for organizations, communities, and whole nations—but that will depend on whether individuals do the thing first, and demonstrate to everyone else that it’s a workable option. In next week’s post, we’ll talk more about where that strategy might lead.

The Butlerian Carnival

Wed, 2015-02-11 16:29
Over the last week or so, I’ve heard from a remarkable number of people who feel that a major crisis is in the offing. The people in question don’t know each other, many of them have even less contact with the mass media than I do, and the sense they’ve tried to express to me is inchoate enough that they’ve been left fumbling for words, but they all end up reaching for the same metaphors: that something in the air just now seems reminiscent of the American colonies in 1775, France in 1789, America in 1860, Europe in 1914, or the world in 1939: a sense of being poised on the brink of convulsive change, with the sound of gunfire and marching boots coming ever more clearly from the dimly seen abyss ahead.
It’s not an unreasonable feeling, all things considered. In Washington DC, Obama’s flunkies are beating the war drums over Ukraine, threatening to send shipments of allegedly “defensive” weapons to join the mercenaries and military advisors we’ve already not-so-covertly got over there. Russian officials have responded to American saber-rattling by stating flatly that a US decision to arm Kiev will be the signal for all-out war. The current Ukrainian regime, installed by a US-sponsored coup and backed by NATO, means to Russia precisely what a hostile Canadian government installed by a Chinese-sponsored coup and backed by the People’s Liberation Army would mean to the United States; if Obama’s trademark cluelessness leads him to ignore that far from minor point and decide that the Russians are bluffing, we could be facing a European war within weeks.
Head south and west from the fighting around Donetsk, and another flashpoint is heating up toward an explosion of its own just now. Yes, that would be Greece, where the new Syriza government has refused to back down from the promises that got it into office: promises that center on the rejection of the so-called “austerity” policies that have all but destroyed the Greek economy since they were imposed in 2009.  This shouldn’t be news to anyone; those same policies, though they’ve been praised to the skies by neoliberal economists for decades now as a guaranteed ticket to prosperity, have had precisely the opposite effect in every single country where they’ve been put in place.
Despite that track record of unbroken failure, the EU—in particular, Germany, which has benefited handsomely from the gutting of southern European economies—continues to insist that Greece must accept what amounts to a perpetual state of debt peonage. The Greek defense minister responded in a recent speech that if Europe isn’t willing to cut a deal, other nations might well do so. He’s quite correct; it’s probably a safe bet that cold-eyed men in Moscow and Beijing are busy right now figuring out how best to step through the window of opportunity the EU is flinging open for them. If they do so—well, I’ll leave it to my readers to consider how the US is likely to respond to the threat of Russian air and naval bases in Greece, which would be capable of projecting power anywhere in the eastern and central Mediterranean basin. Here again, war is a likely outcome; I hope that the Greek government is braced for an attempt at regime change.
That is to say, the decline and fall of industrial civilization is proceeding in the normal way, at pretty much the normal pace. The thermodynamic foundations tipped over into decline first, as stocks of cheap abundant fossil fuels depleted steadily and the gap had to be filled by costly and much less abundant replacements, driving down net energy; the economy went next, as more and more real wealth had to be pulled out of all other economic activities to keep the energy supply more or less steady, until demand destruction cut in and made that increasingly frantic effort moot; now a global political and military superstructure dependent on cheap abundant fossil fuels, and on the economic arrangement that all of that surplus energy made possible, is cracking at the seams.
One feature of times like these is that the number of people who can have an influence on the immediate outcome declines steadily as crisis approaches. In the years leading up to 1914, for example, a vast number of people contributed to the rising spiral of conflict between the aging British Empire and its German rival, but the closer war came, the narrower the circle of decision-makers became, until a handful of politicians in Germany, France, and Britain had the fate of Europe in their hands. A few more bad decisions, and the situation was no longer under anybody’s control; thereafter, the only option left was to let the juggernaut of the First World War roll mindlessly onward to its conclusion.
In the same way, as recently as the 1980s, many people in the United States and elsewhere had some influence on how the industrial age would end; unfortunately most of them backed politicians who cashed in the resources that could have built a better future on one last round of absurd extravagance, and a whole landscape of possibilities went by the boards. Step by step, as the United States backed itself further and further into a morass of short-term gimmicks with ghastly long-term consequences, the number of people who have had any influence on the trajectory we’re on has narrowed steadily, and as we approach what may turn out to be the defining crisis of our time, a handful of politicians in a handful of capitals are left to make the last decisions that can shape the situation in any way at all, before the tanks begin to roll and the fighter-bombers rise up from their runways.
Out here on the fringes of the collective conversation of our time, where archdruids lurk and heresies get uttered, the opportunity to shape events as they happen is a very rare thing. Our role, rather, is to set agendas for the future, to take ideas that are unthinkable in the mainstream today and prepare them for their future role as the conventional wisdom of eras that haven’t dawned yet. Every phrase on the lips of today’s practical men of affairs, after all, was once a crazy notion taken seriously only by the lunatic fringe—yes, that includes democracy, free-market capitalism, and all the other shibboleths of our age. 
With that in mind, while we wait to see whether today’s practical men of affairs stumble into war the way they did in 1914, I propose to shift gears and talk about something else—something that may seem whimsical, even pointless, in the light of the grim martial realities just discussed. It’s neither whimsical nor pointless, as it happens, but the implications may take a little while to dawn even on those of my readers who’ve been following the last few years of discussions most closely. Let’s begin with a handful of data points.
Item: Britain’s largest bookseller recently noted that sales of the Kindle e-book reader have dropped like a rock in recent months, while sales of old-fashioned printed books are up. Here in the more gizmocentric USA, e-books retain more of their erstwhile popularity, but the bloom is off the rose; among the young and hip, it’s not hard at all to find people who got rid of their book collections in a rush of enthusiasm when e-books came out, regretted the action after it was too late, and now are slowly restocking their bookshelves while their e-book readers collect cobwebs or, at best, find use as a convenience for travel and the like.
Item: more generally, a good many of the hottest new trends in popular culture aren’t new trends at all—they’re old trends revived, in many cases, by people who weren’t even alive to see them the first time around. Kurt B. Reighley’s lively guide The United States of Americana was the first, and remains the best, introduction to the phenomenon, one that embraces everything from burlesque shows and homebrewed bitters to backyard chickens and the revival of Victorian martial arts. One pervasive thread that runs through the wild diversity of this emerging subculture is the simple recognition that many of these older things are better, in straightforwardly measurable senses, than their shiny modern mass-marketed not-quite-equivalents.
Item: within that subculture, a small but steadily growing number of people have taken the principle to its logical extreme and adopted the lifestyles and furnishings of an earlier decade wholesale in their personal lives. The 1950s are a common target, and so far as I know, adopters of 1950s culture are the furthest along the process of turning into a community, but other decades are increasingly finding the same kind of welcome among those less than impressed by what today’s society has on offer. Meanwhile, the reenactment scene has expanded spectacularly in recent years from the standard hearty fare of Civil War regiments and the neo-medievalism of the Society for Creative Anachronism to embrace almost any historical period you care to name. These aren’t merely dress-up games; go to a buckskinner’s rendezvous or an outdoor SCA event, for example, and you’re as likely as not to see handspinners turning wool into yarn with drop spindles, a blacksmith or two laboring over a portable forge, and the like.
Other examples of the same broad phenomenon could be added to the list, but these will do for now. I’m well aware, of course, that most people—even most of my readers—will have dismissed the things just listed as bizarre personal eccentricities, right up there with the goldfish-swallowing and flagpole-sitting of an earlier era. I’d encourage those of my readers who had that reaction to stop, take a second look, and tease out the mental automatisms that make that dismissal so automatic a part of today’s conventional wisdom. Once that’s done, a third look might well be in order, because the phenomenon sketched out here marks a shift of immense importance for our future.
For well over two centuries now, since it first emerged as the crackpot belief system of a handful of intellectuals on the outer fringes of their culture, the modern ideology of progress has taken it as given that new things were by definition better than whatever they replaced.  That assumption stands at the heart of contemporary industrial civilization’s childlike trust in the irreversible cumulative march of progress toward a future among the stars. Finding ways to defend that belief even when it obviously wasn’t true—when the latest, shiniest products of progress turned out to be worse in every meaningful sense than the older products they elbowed out of the way—was among the great growth industries of the 20th century; even so, there were plenty of cases where progress really did seem to measure up to its billing. Given the steady increases of energy per capita in the world’s industrial nations over the last century or so, that was a predictable outcome.
The difficulty, of course, is that the number of cases where new things really are better than what they replace has been shrinking steadily in recent decades, while the number of cases where old products are quite simply better than their current equivalents—easier to use, more effective, more comfortable, less prone to break, less burdened with unwanted side effects and awkward features, and so on—has been steadily rising. Back behind the myth of progress, like the little man behind the curtain in The Wizard of Oz, stand two unpalatable and usually unmentioned realities. The first is that profits, not progress, determine which products get marketed and which get roundfiled; the second is that making a cheaper, shoddier product and using advertising gimmicks to sell it anyway has been the standard marketing strategy across a vast range of American businesses for years now.
More generally, believers in progress used to take it for granted that progress would sooner or later bring about a world where everyone would live exciting, fulfilling lives brimfull of miracle products and marvelous experiences. You still hear that sort of talk from the faithful now and then these days, but it’s coming to sound a lot like all that talk about the glorious worker’s paradise of the future did right around the time the Iron Curtain came down for good. In both cases, the future that was promised didn’t have much in common with the one that actually showed up. The one we got doesn’t have some of the nastier features of the one the former Soviet Union and its satellites produced—well, not yet, at least—but the glorious consumer’s paradise described in such lavish terms a few decades back got lost on the way to the spaceport, and what we got instead was a bleak landscape of decaying infrastructure, abandoned factories, prostituted media, and steadily declining standards of living for everyone outside the narrowing circle of the privileged, with the remnants of our once-vital democratic institutions hanging above it all like rotting scarecrows silhouetted against a darkening sky.
In place of those exciting, fulfilling lives mentioned above, furthermore, we got the monotony and stress of long commutes, cubicle farms, and would-you-like-fries-with-that for the slowly shrinking fraction of our population who can find a job at all. The Onion, with its usual flair for packaging unpalatable realities in the form of deadpan humor, nailed it a few days ago with a faux health-news article announcing that the best thing office workers could do for their health is stand up at their desk, leave the office, and never go back. Joke or not, it’s not bad advice; if you have a full-time job in today’s America, the average medieval peasant had a less stressful job environment and more days off than you do; he also kept a larger fraction of the product of his labor than you’ll ever see.
Then, of course, if you’re like most Americans, you’ll numb yourself once you get home by flopping down on the sofa and spending most of your remaining waking hours staring at little colored pictures on a glass screen. It’s remarkable how many people get confused about what this action really entails. They insist that they’re experiencing distant places, traveling in worlds of pure imagination, and so on through the whole litany of self-glorifying drivel the mass media likes to employ in its own praise. Let us please be real: when you watch a program about the Amazon rain forest, you’re not experiencing the Amazon rain forest; you’re experiencing colored pictures on a screen, and you’re only getting as much of the experience as fits through the narrow lens of a video camera and the even narrower filter of the production process. The difference between experiencing something and watching it on TV or the internet, that is to say, is precisely the same as the difference between making love and watching pornography; in each case, the latter is a very poor substitute for the real thing.
For most people in today’s America, in other words, the closest approach to the glorious consumer’s paradise of the future they can expect to get is eight hours a day, five days a week of mindless, monotonous work under the constant pressure of management efficiency experts, if they’re lucky enough to get a job at all, with anything up to a couple of additional hours commuting and any off-book hours the employer happens to choose to demand from them thrown into the deal, in order to get a paycheck that buys a little less each month—inflation is under control, the government insists, but prices somehow keep going up—of products that get more cheaply made, more likely to be riddled with defects, and more likely to pose a serious threat to the health and well-being of their users, with every passing year. Then they can go home and numb their nervous systems with those little colored pictures on the screen, showing them bland little snippets of experiences they will never have, wedged in there between the advertising.
That’s the world that progress has made. That’s the shining future that resulted from all those centuries of scientific research and technological tinkering, all the genius and hard work and sacrifice that have gone into the project of progress. Of course there’s more to the consequences of progress than that; progress has saved quite a few children from infectious diseases, and laced the environment with so many toxic wastes that childhood cancer, all but unheard of in 1850, is a routine event today; it’s made impressive contributions to human welfare, while flooding the atmosphere with greenhouse gases that will soon make far more impressive contributions to human suffering and death—well, I could go on along these lines for quite a while. True believers in the ideology of perpetual progress like to insist that all the good things ought to be credited to progress while all the bad things ought to be blamed on something else, but that’s not so plausible an article of faith as it once was, and it bids fair to become a great deal less common as the downsides of progress become more and more difficult to ignore.
The data points I noted earlier in this week’s post, I’ve come to believe, are symptoms of that change, the first stirrings of wind that tell of the storm to come. People searching for a better way of living than the one our society offers these days are turning to the actual past, rather than to some imaginary future, in that quest. That’s the immense shift I mentioned earlier. What makes it even more momentous is that by and large, it’s not being done in the sort of grim Puritanical spirit of humorless renunciation that today’s popular culture expects from those who want something other than what the consumer economy has on offer. It’s being done, rather, in a spirit of celebration.
One of my readers responded to my post two weeks ago on deliberate technological regress by suggesting that I was proposing a Butlerian jihad of sorts. (Those of my readers who don’t get the reference should pick up a copy of Frank Herbert’s iconic SF novel Dune and read it.) I demurred, for two reasons. First, the Butlerian jihad in Herbert’s novel was a revolt against computer technology, and I see no need for that; once the falling cost of human labor intersects the rising cost of energy and technology, and it becomes cheaper to hire file clerks and accountants than to maintain the gargantuan industrial machine that keeps computer technology available, computers will go away, or linger as a legacy technology for a narrowing range of special purposes until the hardware finally burns out.
The second reason, though, is the more important. I’m not a fan of jihads, or of holy wars of any flavor; history shows all too well that when you mix politics and violence with religion, any actual religious content vanishes away, leaving its castoff garments to cover the naked rule of force and fraud. If you want people to embrace a new way of looking at things, furthermore, violence, threats, and abusive language don’t work, and it’s even less effective to offer that new way as a ticket to virtuous misery, along the lines of the Puritanical spirit noted above. That’s why so much of the green-lifestyle propaganda of the last thirty years has done so little good—so much of it has been pitched as a way to suffer self-righteously for the good of Gaia, and while that approach appeals to a certain number of wannabe martyrs, that’s not a large enough fraction of the population to matter.
The people who are ditching their Kindles and savoring books as physical objects, brewing their own beer and resurrecting other old arts and crafts, reformatting their lives in the modes of a past decade, or spending their spare time reconnecting with the customs and technologies of an earlier time—these people aren’t doing any of those things out of some passion for self-denial. They’re doing them because these things bring them delights that the shoddy mass-produced lifestyles of the consumer economy can’t match. What these first stirrings suggest to me is that the way forward isn’t a Butlerian jihad, but a Butlerian carnival—a sensuous celebration of the living world outside the cubicle farms and the glass screens, which will inevitably draw most of its raw materials from eras, technologies, and customs of the past, which don’t require the extravagant energy and resource inputs that the modern consumer economy demands, and so will be better suited to a future defined by scarce energy and resources.
The Butlerian carnival isn’t the only way to approach the deliberate technological regression we need to carry out in the decades ahead, but it’s an important one. In upcoming posts, I’ll talk more about how this and other avenues to the same goal might be used to get through the mess immediately ahead, and start laying foundations for a future on the far side of the crises of our time.

As Night Closes In

Wed, 2015-02-04 17:47
I was saddened to learn a few days ago, via a phone call from a fellow author, that William R. Catton Jr. died early last month, just short of his 89th birthday. Some of my readers will have no idea who he was; others may dimly recall that I’ve mentioned him and his most important book, Overshoot, repeatedly in these essays. Those who’ve taken the time to read the book just named may be wondering why none of the sites in the peak oil blogosphere has put up an obituary, or even noted the man’s passing. I don’t happen to know the answer to that last question, though I have my suspicions.
I encountered Overshoot for the first time in a college bookstore in Bellingham, Washington in 1983. Red letters on a stark yellow spine spelled out the title, a word I already knew from my classes in ecology and systems theory; I pulled it off the shelf, and found the future staring me in the face. This is what’s on the front cover below the title:
carrying capacity: maximum permanently supportable load.
cornucopian myth: euphoric belief in limitless resources.
drawdown: stealing resources from the future.
cargoism: delusion that technology will always save us from
overshoot: growth beyond an area’s carrying capacity, leading to
crash: die-off.
If you want to know where I got the core ideas I’ve been exploring in these essays for the last eight-going-on-nine years, in other words, now you know. I still have that copy of Overshoot; it’s sitting on the desk in front of me right now, reminding me yet again just how many chances we had to turn away from the bleak future that’s closing in around us now, like the night at the end of a long day.
Plenty of books in the 1970s and early 1980s applied the lessons of ecology to the future of industrial civilization and picked up at least part of the bad news that results. Overshoot was arguably the best of the lot, but it was pretty much guaranteed to land even deeper in the memory hole than the others. The difficulty was that Catton’s book didn’t pander to the standard mythologies that still beset any attempt to make sense of the predicament we’ve made for ourselves; it provided no encouragement to what he called cargoism, the claim that technological progress will inevitably allow us to have our planet and eat it too, without falling off the other side of the balance into the sort of apocalyptic daydreams that Hollywood loves to make into bad movies. Instead, in calm, crisp, thoughtful prose, he explained how industrial civilization was cutting its own throat, how far past the point of no return we’d already gone, and what had to be done in order to salvage anything from the approaching wreck.
As I noted in a post here in 2011, I had the chance to meet Catton at an ASPO conference, and tried to give him some idea of how much his book had meant to me. I did my best not to act like a fourteen-year-old fan meeting a rock star, but I’m by no means sure that I succeeded. We talked for fifteen minutes over dinner; he was very gracious; then things moved on, each of us left the conference to carry on with our lives, and now he’s gone. As the old song says, that’s the way it goes.
There’s much more that could be said about William Catton, but that task should probably be left for someone who knew the man as a teacher, a scholar, and a human being. I didn’t; except for that one fifteen-minute conversation, I knew him solely as the mind behind one of the books that helped me make sense of the world, and then kept me going on the long desert journey through the Reagan era, when most of those who claimed to be environmentalists over the previous decade cashed in their ideals and waved around the cornucopian myth as their excuse for that act. Thus I’m simply going to urge all of my readers who haven’t yet read Overshoot to do so as soon as possible, even if they have to crawl on their bare hands and knees over abandoned fracking equipment to get a copy. Having said that, I’d like to go on to the sort of tribute I think he would have appreciated most: an attempt to take certain of his ideas a little further than he did.
The core of Overshoot, which is also the core of the entire world of appropriate technology and green alternatives that got shot through the head and shoved into an unmarked grave in the Reagan years, is the recognition that the principles of ecology apply to industrial society just as much as they do to other communities of living things. It’s odd, all things considered, that this is such a controversial proposal. Most of us have no trouble grasping the fact that the law of gravity affects human beings the same way it affects rocks; most of us understand that other laws of nature really do apply to us; but quite a few of us seem to be incapable of extending that same sensible reasoning to one particular set of laws, the ones that govern how communities of living things relate to their environments.
If people treated gravity the way they treat ecology, you could visit a news website any day of the week and read someone insisting with a straight face that while it’s true that rocks fall down when dropped, human beings don’t—no, no, they fall straight up into the sky, and anyone who thinks otherwise is so obviously wrong that there’s no point even discussing the matter. That degree of absurdity appears every single day in the American media, and in ordinary conversations as well, whenever ecological issues come up. Suggest that a finite planet must by definition contain a finite amount of fossil fuels, that dumping billions of tons of gaseous trash into the air every single year for centuries might change the way that the atmosphere retains heat, or that the law of diminishing returns might apply to technology the way it applies to everything else, and you can pretty much count on being shouted down by those who, for all practical purposes, might as well believe that the world is flat.
Still, as part of the ongoing voyage into the unspeakable in which this blog is currently engaged, I’d like to propose that, in fact, human societies are as subject to the laws of ecology as they are to every other dimension of natural law. That act of intellectual heresy implies certain conclusions that are acutely unwelcome in most circles just now; still, as my regular readers will have noticed long since, that’s just one of the services this blog offers.
Let’s start with the basics. Every ecosystem, in thermodynamic terms, is a process by which relatively concentrated energy is dispersed into diffuse background heat. Here on Earth, at least, the concentrated energy mostly comes from the Sun, in the form of solar radiation—there are a few ecosystems, in deep oceans and underground, that get their energy from chemical reactions driven by the Earth’s internal heat instead. Ilya Prigogine showed some decades back that the flow of energy through a system of this sort tends to increase the complexity of the system; Jeremy England, an MIT physicist, has recently shown that the same process accounts neatly for the origin of life itself. The steady flow of energy from source to sink is the foundation on which everything else rests.
The complexity of the system, in turn, is limited by the rate at which energy flows through the system, and this in turn depends on the difference in concentration between the energy that enters the system, on the one hand, and the background into which waste heat diffuses when it leaves the system, on the other. That shouldn’t be a difficult concept to grasp. Not only is it basic thermodynamics, it’s basic physics—it’s precisely equivalent, in fact, to pointing out that the rate at which water flows through any section of a stream depends on the difference in height between the place where the water flows into that section and the place where it flows out.
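For readers who like to see such things spelled out, the textbook formalization of that point is the Carnot limit, which caps the fraction of any heat flow that can be turned into useful work. This is my gloss rather than anything in Overshoot, and the figures below are simply illustrative:

$$\eta_{\max} = 1 - \frac{T_{\mathrm{sink}}}{T_{\mathrm{source}}}$$

With an ambient sink around 290 K, a source at the effective temperature of sunlight, roughly 5800 K, permits up to about 95% of the flow to become work in principle, while a source at 310 K permits barely 6%. The wider the gap between source and sink, the more work the flow can do, and the more complexity it can sustain.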
Simple as it is, it’s a point that an astonishing number of people—including some who are scientifically literate—routinely miss. A while back on this blog, for example, I noted that one of the core reasons you can’t power a modern industrial civilization on solar energy is that sunlight is relatively diffuse as an energy source, compared to the extremely concentrated energy we get from fossil fuels. I still field rants from people insisting that this is utter hogwash, since each photon has exactly the same amount of energy it had when it left the Sun, and so sunlight must be just as concentrated an energy source as ever. You’ll notice, though, that if this were the only variable that mattered, Neptune would be just as warm as Mercury, since each of the photons hitting the one planet packs on average the same energetic punch as those that hit the other.
It’s hard to think of a better example of the blindness to whole systems that’s pandemic in today’s geek culture. Obviously, the difference between the temperatures of Neptune and Mercury isn’t a function of the energy of individual photons hitting the two worlds; it’s a function of differing concentrations of photons—the number of them, let’s say, hitting a square meter of each planet’s surface. This is also one of the two figures that matter when we’re talking about solar energy here on Earth. The other? That’s the background heat into which waste energy disperses when the system, eco- or solar, is done with it. On the broadest scale, that’s deep space, but ecosystems don’t funnel their waste heat straight into orbit, you know. Rather, they diffuse it into the ambient temperature at whatever height above or below sea level, and whatever latitude closer to or farther from the equator, they happen to be—and since that’s heated by the Sun, too, the difference between input and output concentrations isn’t very substantial.
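Since this point gets missed so reliably, here’s a back-of-the-envelope sketch of it in Python. The orbital distances are rough averages, and the 1361 watts per square meter is the standard solar constant at Earth’s distance; nothing else is assumed beyond the inverse-square law:

```python
# Why photon concentration, not photon energy, separates Mercury
# from Neptune: the same photons, spread over vastly more space.
SOLAR_CONSTANT_AT_EARTH = 1361.0  # W per square meter at 1 AU

def solar_flux(distance_au):
    # Flux falls off with the square of distance from the Sun.
    return SOLAR_CONSTANT_AT_EARTH / distance_au ** 2

for planet, distance in [("Mercury", 0.39), ("Earth", 1.0), ("Neptune", 30.1)]:
    print(f"{planet:8s} {solar_flux(distance):10,.1f} W/m^2")
```

Run it and Mercury comes out just under nine thousand watts per square meter, Neptune about one and a half: a difference of nearly six thousandfold between photons that are individually identical.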
Nature has done astonishing things with that very modest difference in concentration. People who insist that photosynthesis is horribly inefficient, and of course we can improve its efficiency, are missing a crucial point: something like half the energy that reaches the leaves of a green plant from the Sun is put to work lifting water up from the roots by an ingenious form of evaporative pumping, in which water sucked out through the leaf pores as vapor draws up more water through a network of tiny tubes in the plant’s stems. Another few per cent goes into the manufacture of sugars by photosynthesis, and a variety of minor processes, such as the chemical reactions that ripen fruit, also depend to some extent on light or heat from the Sun; all told, a green plant is probably about as efficient in its total use of solar energy as the laws of thermodynamics will permit. 
What’s more, the Earth’s ecosystems take the energy that flows through the green engines of plant life and put it to work in an extraordinary diversity of ways. The water pumped into the sky by what botanists call evapotranspiration—that’s the evaporative pumping I mentioned a moment ago—plays critical roles in local, regional, and global water cycles. The production of sugars to store solar energy in chemical form kicks off an even more intricate set of changes, as the plant’s cells are eaten by something, which is eaten by something, and so on through the lively but precise dance of the food web. Eventually all the energy the original plant scooped up from the Sun turns into diffuse waste heat and permeates slowly up through the atmosphere to its ultimate destiny warming some corner of deep space a bit above absolute zero, but by the time it gets there, it’s usually had quite a ride.
That said, there are hard upper limits to the complexity of the ecosystem that these intricate processes can support. You can see that clearly enough by comparing a tropical rain forest to a polar tundra. The two environments may have approximately equal amounts of precipitation over the course of a year; they may have an equally rich or poor supply of nutrients in the soil; even so, the tropical rain forest can easily support fifteen or twenty thousand species of plants and animals, and the tundra will be lucky to support a few hundred. Why? The same reason Mercury is warmer than Neptune: the rate at which photons from the Sun arrive in each place per square meter of surface.
Near the equator, the Sun’s rays fall almost vertically. Close to the poles, since the Earth is round, the Sun’s rays come in at a sharp angle, and thus are spread out over more surface area. The ambient temperature’s quite a bit warmer in the rain forest than it is on the tundra, but because the vast heat engine we call the atmosphere pumps heat from the equator to the poles, the difference in ambient temperature is not as great as the difference in solar input per square meter. Thus ecosystems near the equator have a greater difference in energy concentration between input and output than those near the poles, and the complexity of the two systems varies accordingly.
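The geometry is easy enough to sketch as well. The sketch below deliberately ignores axial tilt, seasons, day length, and the atmosphere; the 1000 watts per square meter is an assumed round number for a surface squarely facing the Sun, and the only principle at work is that a beam striking level ground at an angle spreads over more area in proportion to the cosine of the latitude:

```python
import math

# Simplified geometry only: noon sunlight on level ground is diluted
# roughly in proportion to the cosine of latitude. The 1000 W/m^2
# figure is an assumed round number, not a measurement.
DIRECT_FLUX = 1000.0

for latitude in (0, 30, 60, 80):
    flux = DIRECT_FLUX * math.cos(math.radians(latitude))
    print(f"{latitude:2d} degrees latitude: ~{flux:4.0f} W/m^2")
```

At 60 degrees the same sunlight is already spread across twice the area it covers at the equator, and closer to the poles the dilution grows steeper still.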
All this should be common knowledge. Of course it isn’t, because the industrial world’s notions of education consistently ignore what William Catton called “the processes that matter”—that is, the fundamental laws of ecology that frame our existence on this planet—and approach a great many of those subjects that do make it into the curriculum in ways that encourage the most embarrassing sort of ignorance about the natural processes that keep us all alive. Down the road a bit, we’ll be discussing that in much more detail. For now, though, I want to take the points just made and apply them systematically, in much the way Catton did, to the predicament of industrial civilization.
A human society is an ecosystem.  Like any other ecosystem, it depends for its existence on flows of energy, and as with any other ecosystem, the upper limit on its complexity depends ultimately on the difference in concentration between the energy that enters it and the background into which its waste heat disperses. (This last point is a corollary of White’s Law, one of the fundamental principles of human ecology, which holds that a society’s economic development is directly proportional to its consumption of energy per capita.)  Until the beginning of the industrial revolution, that upper limit was not much higher than the upper limit of complexity in other ecosystems, since human ecosystems drew most of their energy from the same source as nonhuman ones: sunlight falling on green plants.  As human societies figured out how to tap other flows of solar energy—windpower to drive windmills and send ships coursing over the seas, water power to turn mills, and so on—that upper limit crept higher, but not dramatically so.
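For what it’s worth, White’s Law is often summarized in Leslie White’s own shorthand as a simple product, with C standing for the degree of a society’s development, E for the energy harnessed per capita per year, and T for the efficiency of the technology that puts that energy to work:

$$C = E \times T$$

Hold T constant and development tracks energy per capita directly, which is the form of the law cited above.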
The discoveries that made it possible to turn fossil fuels into mechanical energy transformed that equation completely. The geological processes that stockpiled half a billion years of sunlight into coal, oil, and natural gas boosted the concentration of the energy inputs available to industrial societies by an almost unimaginable factor, without warming the ambient temperature of the planet more than a few degrees, and the huge differentials in energy concentration that resulted drove an equally unimaginable increase in complexity. Choose any measure of complexity you wish—number of discrete occupational categories, average number of human beings involved in the production, distribution, and consumption of any given good or service, or what have you—and in the wake of the industrial revolution, it soared right off the charts. Thermodynamically, that’s exactly what you’d expect.
The difference in energy concentration between input and output, it bears repeating, defines the upper limit of complexity. Other variables determine whether or not the system in question will achieve that upper limit. In the ecosystems we call human societies, knowledge is one of those other variables. If you have a highly concentrated energy source and don’t yet know how to use it efficiently, your society isn’t going to become as complex as it otherwise could. Over the three centuries of industrialization, as a result, the production of useful knowledge was a winning strategy, since it allowed industrial societies to rise steadily toward the upper limit of complexity defined by the concentration differential. The limit was never reached—the law of diminishing returns saw to that—and so, since every addition to knowledge had so far been followed by an increase in complexity, industrial societies ended up believing that knowledge all by itself was capable of increasing the complexity of the human ecosystem. Since there’s no upper limit to knowledge, in turn, that belief system drove what Catton called the cornucopian myth, the delusion that there would always be enough resources if only the stock of knowledge increased quickly enough.
That belief only seemed to work, though, as long as the concentration differential between energy inputs and the background remained very high. Once easily accessible fossil fuels started to become scarce, and more and more energy and other resources had to be invested in the extraction of what remained, problems started to crop up. Tar sands and oil shales in their natural form are not as concentrated an energy source as light sweet crude—once they’re refined, sure, the differences are minimal, but a whole-system analysis of energy concentration has to start at the moment each energy source enters the system. Take a cubic yard of tar sand fresh from the pit mine, with the sand still in it, or a cubic yard of oil shale with the oil still trapped in the rock, and you’ve simply got less energy per unit volume than you do if you’ve got a cubic yard of light sweet crude fresh from the well, or even a cubic yard of good permeable sandstone with light sweet crude oozing out of every pore.
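To put rough numbers on that point, here’s a sketch with figures that are illustrative assumptions of mine, not measurements: conventional crude at around 38 gigajoules per cubic meter, and raw tar sand at about two tonnes per cubic meter, a tenth of it bitumen by weight, with bitumen at about 40 gigajoules per tonne:

```python
# Energy content per cubic meter as each source enters the system.
# All figures are rough illustrative assumptions, not measured values.
crude_gj_per_m3 = 38.0                  # conventional crude, assumed
tar_sand_gj_per_m3 = 2.0 * 0.10 * 40.0  # = 8 GJ/m^3, assumed

print(f"light sweet crude: ~{crude_gj_per_m3:.0f} GJ per cubic meter")
print(f"raw tar sand:      ~{tar_sand_gj_per_m3:.0f} GJ per cubic meter")
```

Even before counting the energy spent digging, hauling, and upgrading, the raw feedstock is several times more dilute than the crude it’s supposed to replace.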
It’s an article of faith in contemporary culture that such differences don’t matter, but that’s just another aspect of our cornucopian myth. The energy needed to get the sand out of the tar sands or the oil out of the oil shale has to come from somewhere, and that energy, in turn, is not available for other uses. The result, however you slice it conceptually, is that the upper limit of complexity begins moving down. That sounds abstract, but it adds up to a great deal of very concrete misery, because as already noted, the complexity of a society determines such things as the number of different occupational specialties it can support, the number of employees who are involved in the production and distribution of a given good or service, and so on. There’s a useful phrase for a sustained contraction in the usual measures of complexity in a human ecosystem: “economic depression.”
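The arithmetic of that energy subsidy is just as easy to sketch. The invested fractions below are assumptions chosen purely for illustration, standing in for a high net-energy source and a low one:

```python
# Net energy left after extraction and processing, per barrel-equivalent
# of about 6.1 GJ. The invested fractions are illustrative assumptions.
BARREL_GJ = 6.1

for source, invested in [("conventional crude (assumed)", 0.05),
                         ("tar sands (assumed)", 0.30)]:
    net = BARREL_GJ * (1.0 - invested)
    print(f"{source}: {net:.2f} GJ net per barrel-equivalent")
```

Every unit of energy plowed back into extraction is a unit that can’t power complexity elsewhere in the economy, and the upper limit drops accordingly.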
The economic troubles that are shaking the industrial world more and more often these days, in other words, are symptoms of a disastrous mismatch between the level of complexity that our remaining concentration differential can support, and the level of complexity that our preferred ideologies insist we ought to have. As those two things collide, there’s no question which of them is going to win. Adding to our total stock of knowledge won’t change that result, since knowledge is a necessary condition for economic expansion but not a sufficient one: if the upper limit of complexity set by the laws of thermodynamics drops below the level that your knowledge base would otherwise support, further additions to the knowledge base simply mean that there will be a growing number of things that people know how to do in theory, but that nobody has the resources to do in practice.
Knowledge, in other words, is not a magic wand, a surrogate messiah, or a source of miracles. It can open the way to exploiting energy more efficiently than otherwise, and it can figure out how to use energy resources that were not previously being used at all, but it can’t conjure energy out of thin air. Even if the energy resources are there, for that matter, if other factors prevent them from being used, the knowledge of how they might be used offers no consolation—quite the contrary.
That latter point, I think, sums up the tragedy of William Catton’s career. He knew, and could explain with great clarity, why industrialism would bring about its own downfall, and what could be done to salvage something from its wreck. That knowledge, however, was not enough to make things happen; only a few people ever listened; most plugged their ears and started chanting “La, la, la, I can’t hear you” once Reagan made that fashionable; and the actions that might have spared all of us a vast amount of misery never happened. When I spoke to him in 2011, he was perfectly aware that his life’s work had done essentially nothing to turn industrial society aside from its rush toward the abyss. That’s got to be a bitter thing to contemplate in your final hours, and I hope his thoughts were on something else last month as the night closed in at last.