AODA Blog

A Field Guide to Negative Progress

Wed, 2015-04-22 17:23
I've commented before in these posts that writing is always partly a social activity. What Mortimer Adler used to call the Great Conversation, the dance of ideas down the corridors of the centuries, shapes every word in a writer’s toolkit; you can hardly write a page in English without drawing on a shade of meaning that Geoffrey Chaucer, say, or William Shakespeare, or Jane Austen first put into the language. That said, there’s also a more immediate sense in which any writer who interacts with his or her readers is part of a social activity, and one of the benefits came my way just after last week’s post.
That post began with a discussion of the increasingly surreal quality of America’s collective life these days, and one of my readers—tip of the archdruidical hat to Anton Mett—had a fine example to offer. He’d listened to an economic report on the media, and the talking heads were going on and on about the US economy’s current condition of, ahem, “negative growth.” Negative growth? Why yes, that’s the opposite of growth, and it’s apparently quite a common bit of jargon in economics just now.
Of course the English language, as used by the authors named earlier among many others, has no shortage of perfectly clear words for the opposite of growth. “Decline” comes to mind; so does “decrease,” and so does “contraction.” Would it have been so very hard for the talking heads in that program, or their many equivalents in our economic life generally, to draw in a deep breath and actually come right out and say “The US economy has contracted,” or “GDP has decreased,” or even “we’re currently in a state of economic decline”? Come on, economists, you can do it!
But of course they can’t. Economists in general are supposed to provide, shall we say, negative clarity when discussing certain aspects of contemporary American economic life, and talking heads in the media are even more subject to this rule than most of their peers. Among the things about which they’re supposed to be negatively clear, two are particularly relevant here; the first is that economic contraction happens, and the second is that letting too much of the national wealth end up in too few hands is a very effective way to cause economic contraction. The logic here is uncomfortably straightforward—an economy that depends on consumer expenditures only prospers if consumers have plenty of money to spend—but talking about that equation would cast an unwelcome light on the culture of mindless kleptocracy entrenched these days at the upper end of the US socioeconomic ladder. So we get to witness the mass production of negative clarity about one of the main causes of negative growth.
It’s entrancing to think of other uses for this convenient mode of putting things. I can readily see it finding a role in health care—“I’m sorry, ma’am,” the doctor says, “but your husband is negatively alive;” in sports—“Well, Joe, unless the Orioles can cut down that negative lead of theirs, they’re likely headed for a negative win;” and in the news—“The situation in Yemen is shaping up to be yet another negative triumph for US foreign policy.” For that matter, it’s time to update one of the more useful proverbs of recent years: what do you call an economist who makes a prediction? Negatively right.
Come to think of it, we might as well borrow the same turn of phrase for the subject of last week’s post, the deliberate adoption of older, simpler, more independent technologies in place of today’s newer, more complex, and more interconnected ones. I’ve been talking about that project so far under the negatively mealy-mouthed label “intentional technological regress,” but hey, why not be cool and adopt the latest fashion? For this week, at least, we’ll therefore redefine our terms a bit, and describe the same thing as “negative progress.” Since negative growth sounds like just another kind of growth, negative progress ought to pass for another kind of progress, right?
With this in mind, I’d like to talk about some of the reasons that individuals, families, organizations, and communities, as they wend their way through today’s cafeteria of technological choices, might want to consider loading up their plates with a good hearty helping of negative progress.
Let’s start by returning to one of the central points raised here in earlier posts, the relationship between progress and the production of externalities. By and large, the more recent a technology is, the more of its costs aren’t paid by the makers or the users of the technology, but are pushed off onto someone else. As I pointed out in a post two months ago, this isn’t accidental; quite the contrary, it’s hardwired into the relationship between progress and market economics, and bids fair to play a central role in the unraveling of the entire project of industrial civilization.
The same process of increasing externalities, though, has another face when seen from the point of view of the individual user of any given technology. When you externalize any cost of a technology, you become dependent on whoever or whatever picks up the cost you’re not paying. What’s more, you become dependent on the system that does the externalizing, and on whoever controls that system. Those dependencies aren’t always obvious, but they impose costs of their own, some financial and some less tangible. What’s more, unlike the externalized costs, a great many of these secondary costs land directly on the user of the technology.
It’s interesting, and may not be entirely accidental, that there’s no commonly used term for the entire structure of externalities and dependencies that stands behind any technology. Such a term is necessary here, so for the present purpose, we’ll call the structure just named the technology’s externality system. Given that turn of phrase, we can restate the point about progress made above: by and large, the more recent a technology is, the larger the externality system on which it depends.
An example will be useful here, so let’s compare the respective externality systems of a bicycle and an automobile. Like most externality systems, these divide up more or less naturally into three categories: manufacture, maintenance, and use. Everything that goes into fabricating steel parts, for instance, all the way back to the iron ore in the mine, is an externality of manufacture; everything that goes into making lubricating oil, all the way back to drilling the oil well, is an externality of maintenance; everything that goes into building roads suitable for bikes and cars is an externality of use.
Both externality systems are complex, and include a great many things that aren’t obvious at first glance. The point I want to make here, though, is that the car’s externality system is far and away the more complex of the two. In fact, the bike’s externality system is a subset of the car’s, and this reflects the specific historical order in which the two technologies were developed. When the technologies that were needed for a bicycle’s externality system came into use, the first bicycles appeared; when all the additional technologies needed for a car’s externality system were added onto that foundation, the first cars followed. That sort of incremental addition of externality-generating technologies is far and away the most common way that technology progresses.
We can thus restate the pattern just analyzed in a way that brings out some of its less visible and more troublesome aspects: by and large, each new generation of technology imposes more dependencies on its users than the generation it replaces. Again, a comparison between bicycles and automobiles will help make that clear. If you want to ride a bike, you’ve committed yourself to dependence on all the technical, economic, and social systems that go into manufacturing, maintaining, and using the bike; you can’t own, maintain, and ride a bike without the steel mills that produce the frame, the chemical plants that produce the oil you squirt on the gears, the gravel pits that provide raw material for roads and bike paths, and so on.
On the other hand, you’re not dependent on a galaxy of other systems that provide the externality system for your neighbor who drives. You don’t depend on the immense network of pipelines, tanker trucks, and gas stations that provide him with fuel; you don’t depend on the interstate highway system or the immense infrastructure that supports it; if you did the sensible thing and bought a bike that was made by a local craftsperson, your dependence on vast multinational corporations and all of their infrastructure, from sweatshop labor in Third World countries to financial shenanigans on Wall Street, is considerably smaller than that of your driving neighbor. Every dependency you have, your neighbor also has, but not vice versa.
Whether or not these dependencies matter is a complex question. Obviously there’s a personal equation—some people like to be independent, others are fine with being just one more cog in the megamachine—but there’s also a historical factor to consider. In an age of economic expansion, the benefits of dependency very often outweigh the costs; standards of living are rising, opportunities abound, and it’s easy to offset the costs of any given dependency. In a stable economy, one that’s neither growing nor contracting, the benefits and costs of any given dependency need to be weighed carefully on a case-by-case basis, as one dependency may be worth accepting while another costs more than it’s worth.
On the other hand, in an age of contraction and decline—or, shall we say, negative expansion?—most dependencies are problematic, and some are lethal. In a contracting economy, as everyone scrambles to hold onto as much as possible of the lifestyles of a more prosperous age, your profit is by definition someone else’s loss, and dependency is just another weapon in the Hobbesian war of all against all. By many measures, the US economy has been contracting since before the bursting of the housing bubble in 2008; by some—in particular, the median and modal standards of living—it’s been contracting since the 1970s, and the unmistakable hissing sound as air leaks out of the fracking bubble just now should be considered fair warning that another round of contraction is on its way.
With that in mind, it’s time to talk about the downsides of dependency.
First of all, dependency is expensive. In the struggle for shares of a shrinking pie in a contracting economy, turning any available dependency into a cash cow is an obvious strategy, and one that’s already very much in play. Consider the conversion of freeways into toll roads, an increasingly popular strategy in large parts of the United States. Consider, for that matter, the soaring price of health care in the US, which hasn’t been accompanied by any noticeable increase in quality of care or treatment outcomes. In the dog-eat-dog world of economic contraction, commuters and sick people are just two of many captive populations whose dependencies make them vulnerable to exploitation. As the spiral of decline continues, it’s safe to assume that any dependency that can be exploited will be exploited, and the more dependencies you have, the more likely you are to be squeezed dry.
The same principle applies to power as well as money; thus, whoever owns the systems on which you depend, owns you. In the United States, again, laws meant to protect employees from abusive behavior on the part of employers are increasingly ignored; as the number of the permanently unemployed keeps climbing year after year, employers know that those who still have jobs are desperate to keep them, and will put up with almost anything in order to keep that paycheck coming in. The old adage about the inadvisability of trying to fight City Hall has its roots in this same phenomenon; no matter what rights you have on paper, you’re not likely to get far with them when the other side can stop picking up your garbage and then fine you for creating a public nuisance, or engage in some other equally creative use of their official prerogatives. As decline accelerates, expect to see dependencies increasingly used as levers for exerting various kinds of economic, political, and social power at your expense.
Finally, and crucially, if you’re dependent on a failing system, when the system goes down, so do you. That’s not just an issue for the future; it’s a huge if still largely unmentioned reality of life in today’s America, and in most other corners of the industrial world as well. Most of today’s permanently unemployed got that way because the job on which they depended for their livelihood got offshored or automated out of existence; much of the rising tide of poverty across the United States is a direct result of the collapse of political and social systems that once countered the free market’s innate tendency to drive the gap between rich and poor to Dickensian extremes. For that matter, how many people who never learned how to read a road map are already finding themselves in random places far from help because something went wrong with their GPS units?
It’s very popular among those who recognize the problem with being shackled to a collapsing system to insist that it’s a problem for the future, not the present. They grant that dependency is going to be a losing bet someday, but everything’s fine for now, so why not enjoy the latest technological gimmickry while it’s here? Of course that presupposes that you enjoy the latest technological gimmickry, which isn’t necessarily a safe bet, and it also ignores the first two difficulties with dependency outlined above, which are very much present and accounted for right now. We’ll let both those issues pass for the moment, though, because there’s another factor that needs to be included in the calculation.
A practical example, again, will be useful here. In my experience, it takes around five years of hard work, study, and learning from your mistakes to become a competent vegetable gardener. If you’re transitioning from buying all your vegetables at the grocery store to growing them in your backyard, in other words, you need to start gardening about five years before your last trip to the grocery store. The skill and hard work that goes into growing vegetables is one of many things that most people in the world’s industrial nations externalize, and those things don’t just pop back to you when you leave the produce section of the store for the last time. There’s a learning curve that has to be climbed first.
Not that long ago, there used to be a subset of preppers who grasped the fact that a stash of cartridges and canned wieners in a locked box at their favorite deer camp cabin wasn’t going to get them through the downfall of industrial civilization, but hadn’t factored in the learning curve. Businesses targeting the prepper market thus used to sell these garden-in-a-box kits, which had seed packets for vegetables, a few tools, and a little manual on how to grow a garden. It’s a good thing that Y2K, 2012, and all those other dates when doom was supposed to arrive turned out to be wrong, because I met a fair number of people who thought that having one of those kits would save them even though they last grew a plant from seed in fourth grade. If the apocalypse had actually arrived, survivors a few years later would have gotten used to a landscape scattered with empty garden-in-a-box kits, overgrown garden patches, and the skeletal remains of preppers who starved to death because the learning curve lasted just that much longer than they did.
The same principle applies to every other set of skills that has been externalized by people in today’s industrial society, and will be coming back home to roost as economic contraction starts to cut into the viability of our externality systems. You can adopt them now, when you have time to get through the learning curve while there’s still an industrial society around to make up for the mistakes and failures that are inseparable from learning, or you can try to adopt them later, when those same inevitable mistakes and failures could very well land you in a world of hurt. You can also adopt them now, when your dependencies haven’t yet been used to empty your wallet and control your behavior, or you can try to adopt them later, when a much larger fraction of the resources and autonomy you might have used for the purpose will have been extracted from you by way of those same dependencies.
This is a point I’ve made in previous posts here, but it applies with particular force to negative progress—that is, to the deliberate adoption of older, simpler, more independent technologies in place of the latest, dependency-laden offerings from the corporate machine. As decline—or, shall we say, negative growth—becomes an inescapable fact of life in postprogress America, decreasing your dependence on sprawling externality systems is going to be an essential tactic.
Those who become early adopters of the retro future, to use an edgy term from last week’s post, will have at least two, and potentially three, significant advantages. The first, as already noted, is that they’ll be much further along the learning curve by the time rising costs, increasing instabilities, and cascading systems failures either put the complex technosystems out of reach or push the relationship between costs and benefits well over into losing-proposition territory. The second is that as more people catch onto the advantages of older, simpler, more sustainable technologies, surviving examples will become harder to find and more expensive to buy; in this case as in many others, collapsing first ahead of the rush is, among other things, the more affordable option.
The third advantage? Depending on exactly which old technologies you happen to adopt, and whether or not you have any talent for basement-workshop manufacture and the like, you may find yourself on the way to a viable new career as most other people will be losing their jobs—and their shirts. As the global economy comes unraveled and people in the United States lose their current access to shoddy imports from Third World sweatshops, there will be a demand for a wide range of tools and simple technologies that still make sense in a deindustrializing world. Those who already know how to use such technologies will be prepared to teach others how to use them; those who know how to repair, recondition, or manufacture those technologies will be prepared to barter, or to use whatever form of currency happens to replace today’s mostly hallucinatory forms of money, to good advantage.
My guess, for what it’s worth, is that salvage trades will be among the few growth industries in the 21st century, and the crafts involved in turning scrap metal and antique machinery into tools and machines that people need for their homes and workplaces will be an important part of that economic sector. To understand how that will work, though, it’s probably going to be necessary to get a clearer sense of the way that today’s complex technostructures are likely to come apart. Next week, with that in mind, we’ll spend some time thinking about the unthinkable—the impending death of the internet.

The Retro Future

Wed, 2015-04-15 18:16
Is it just me, or has the United States taken yet another great leap forward into the surreal over the last few days? Glancing through the news, I find another round of articles babbling about how fracking has guaranteed America a gaudy future as a petroleum and natural gas exporter. Somehow none of these articles get around to mentioning that the United States is a major net importer of both commodities, that most of the big-name firms in the fracking industry have been losing money at a rate of billions a year since the boom began, and that the pileup of bad loans to fracking firms is pushing the US banking industry into a significant credit crunch, but that’s just par for the course nowadays.
Then there’s the current tempest in the media’s teapot, Hillary Clinton’s presidential run. I’ve come to think of Clinton as the Khloe Kardashian of American politics, since she owed her original fame to the mere fact that she’s related to someone else who once caught the public eye. Since then she’s cycled through various roles because, basically, that’s what Famous People do, and the US presidency is just the next reality-TV gig on her bucket list. I grant that there’s a certain wry amusement to be gained from watching this child of privilege, with the help of her multimillionaire friends, posturing as a champion of the downtrodden, but I trust that none of my readers are under the illusion that this rhetoric will amount to anything more than all that chatter about hope and change eight years ago.
Let us please be real: whoever mumbles the oath of office up there on the podium in 2017, whether it’s Clinton or the interchangeably Bozoesque figures currently piling one by one out of the GOP’s clown car to contend with her, we can count on more of the same: more futile wars, more giveaways to the rich at everyone else’s expense, more erosion of civil liberties, more of all the other things Obama’s cheerleaders insisted back in 2008 he would stop as soon as he got into office.  As Arnold Toynbee pointed out a good many years ago, one of the hallmarks of a nation in decline is that the dominant elite sinks into senility, becoming so heavily invested in failed policies and so insulated from the results of its own actions that nothing short of total disaster will break its deathgrip on the body politic.
While we wait for the disaster in question, though, those of us who aren’t part of the dominant elite and aren’t bamboozled by the spectacle du jour might reasonably consider what we might do about it all. By that, of course, I don’t mean that it’s still possible to save industrial civilization in general, and the United States in particular, from the consequences of their history. That possibility went whistling down the wind a long time ago. Back in 2005, the Hirsch Report showed that any attempt to deal with the impending collision with the hard ecological limits of a finite planet had to get under way at least twenty years before the peak of global conventional petroleum production, if there was to be any chance of avoiding massive disruptions. As it happens, 2005 also marked the peak of conventional petroleum production worldwide, which may give you some sense of the scale of the current mess.
Consider, though, what happened in the wake of that announcement. Instead of dealing with the hard realities of our predicament, the industrial world panicked and ran the other way, with the United States well in the lead. Strident claims that ethanol—er, solar—um, biodiesel—okay, wind—well, fracking, then—would provide a cornucopia of cheap energy to replace the world’s rapidly depleting reserves of oil, coal, and natural gas took the place of a serious energy policy, while conservation, the one thing that might have made a difference, was as welcome as garlic aioli at a convention of vampires.
That stunningly self-defeating response had a straightforward cause, which was that everyone except a few of us on the fringes treated the whole matter as though the issue was how the privileged classes of the industrial world could maintain their current lifestyles on some other resource base.  Since that question has no meaningful answer, questions that could have been answered—for example, how do we get through the impending mess with at least some of the achievements of the last three centuries intact?—never got asked at all. At this point, as a result, ten more years have been wasted trying to come up with answers to the wrong question, and most of the  doors that were still open in 2005 have been slammed shut by events since that time.
Fortunately, there are still a few possibilities for constructive action open even this late in the game. More fortunate still, the ones that will likely matter most don’t require Hillary Clinton, or any other member of America’s serenely clueless ruling elite, to do something useful for a change. They depend, rather, on personal action, beginning with individuals, families, and local communities and spiraling outward from there to shape the future on wider and wider scales.
I’ve talked about two of these possibilities at some length in posts here. The first can be summed up simply enough in a cheery sentence:  “Collapse now and avoid the rush!”  In an age of economic contraction—and behind the current facade of hallucinatory paper wealth, we’re already in such an age—nothing is quite so deadly as the attempt to prop up extravagant lifestyles that the real economy of goods and services will no longer support. Those who thrive in such times are those who downshift ahead of the economy, take the resources that would otherwise be wasted on attempts to sustain the unsustainable, and apply them to the costs of transition to less absurd ways of living. The acronym L.E.S.S.—“Less Energy, Stuff, and Stimulation”—provides a good first approximation of the direction in which such efforts at controlled collapse might usefully move.
The point of this project isn’t limited to its advantages on the personal scale, though these are fairly substantial. It’s been demonstrated over and over again that personal example is far more effective than verbal rhetoric at laying the groundwork for collective change. A great deal of what keeps so many people pinned in the increasingly unsatisfying and unproductive lifestyles sold to them by the media is simply that they can’t imagine a better alternative. Those people who collapse ahead of the rush and demonstrate that it’s entirely possible to have a humane and decent life on a small fraction of the usual American resource footprint are already functioning as early adopters; with every month that passes, I hear from more people—especially young people in their teens and twenties—who are joining them, and helping to build a bridgehead to a world on the far side of the impending crisis.
The second possibility is considerably more complex, and resists summing up so neatly. In a series of posts here  in 2010 and 2011, and then in my book Green Wizardry, I sketched out the toolkit of concepts and approaches that were central to the appropriate technology movement back in the 1970s, where I had my original education in the subjects central to this blog. I argued then, and still believe now, that by whatever combination of genius and sheer dumb luck, the pioneers of that movement managed to stumble across a set of approaches to the work of sustainability that are better suited to the needs of our time than anything that’s been proposed since then.
Among the most important features of what I’ve called the “green wizardry” of appropriate tech is the fact that those who want to put it to work don’t have to wait for the Hillary Clintons of the world to lift a finger. Millions of dollars in government grants and investment funds aren’t necessary, or even particularly useful. From its roots in the Sixties counterculture, the appropriate tech scene inherited a focus on do-it-yourself projects that could be done with hand tools, hard work, and not much money. In an age of economic contraction, that makes even more sense than it did back in the day, and the ability to keep yourself and others warm, dry, fed, and provided with many of the other needs of life without potentially lethal dependencies on today’s baroque technostructures has much to recommend it.
Nor, it has to be said, is appropriate tech limited to those who can afford a farm in the country; many of the most ingenious and useful appropriate tech projects were developed by and for people living in ordinary homes and apartments, with a small backyard or no soil at all available for gardening. The most important feature of appropriate tech, though, is that the core elements of its toolkit—intensive organic gardening and small-scale animal husbandry, homescale solar thermal technologies, energy conservation, and the like—are all things that will still make sense long after the current age of fossil fuel extraction has gone the way of the dinosaurs. Getting these techniques into as many hands as possible now is thus not just a matter of cushioning the impacts of the impending era of crisis; it’s also a way to start building the sustainable world of the future right now.
Those two strategies, collapsing ahead of the rush and exploring the green wizardry of appropriate technology, have been core themes of this blog for quite a while now. There’s a third project, though, that I’ve been exploring in a more abstract context here for a while now, and it’s time to talk about how it can be applied to some of the most critical needs of our time.
In the early days of this blog, I pointed out that technological progress has a feature that’s not always grasped by its critics, much less by those who’ve turned faith in progress into the established religion of our time. Very few new technologies actually meet human needs that weren’t already being met, and so the arrival of a new technology generally leads to the abandonment of an older technology that did the same thing. The difficulty here is that new technologies nowadays are inevitably more dependent on global technostructures, and the increasingly brittle and destructive economic systems that support them, than the technologies they replace. New technologies look more efficient than old ones because more of the work is being done somewhere else, and can therefore be ignored—for now.
This is the basis for what I’ve called the externality trap. As technologies get more complex, that complexity allows more of their costs to be externalized—that is to say, pushed onto someone other than the makers or users of the technology. The pressures of a market economy guarantee that those economic actors who externalize more of their costs will prosper at the expense of those who externalize less. The costs thus externalized, though, don’t go away; they get passed from hand to hand like hot potatoes and finally pile up in the whole systems—the economy, the society, the biosphere itself—that have no voice in economic decisions, but are essential to the prosperity and survival of every economic actor, and sooner or later those whole systems will break down under the burden.  Unlimited technological progress in a market economy thus guarantees the economic, social, and/or environmental destruction of the society that fosters it.
The externality trap isn’t just a theoretical possibility. It’s an everyday reality, especially but not only in the United States and other industrial societies. There are plenty of forces driving the rising spiral of economic, social, and environmental disruption that’s shaking the industrial world right down to its foundations, but among the most important is precisely the unacknowledged impact of externalized costs on the whole systems that support the industrial economy. It’s fashionable these days to insist that increasing technological complexity and integration will somehow tame that rising spiral of crisis, but the externality trap suggests that exactly the opposite is the case—that the more complex and integrated technologies become, the more externalities they will generate. It’s precisely because technological complexity makes it easy to ignore externalized costs that progress becomes its own nemesis.
Yes, I know, suggesting that progress isn’t infallibly beneficent is heresy, and suggesting that progress will necessarily terminate itself with extreme prejudice is heresy twice over. I can’t help that; it so happens that in most declining civilizations, ours included, the things that most need to be said are the things that, by and large, nobody wants to hear. That being the case, I might as well make it three for three and point out that the externality trap is a problem rather than a predicament. The difference, as longtime readers know, is that problems can be solved, while predicaments can only be faced. We don’t have to keep loading an ever-increasing burden of externalized costs on the whole systems that support us—which is to say, we don’t have to keep increasing the complexity and integration of the technologies that we use in our daily lives. We can stop adding to the burden; we can even go the other way.
Now of course suggesting that, even thinking it, is heresy on the grand scale. I’m reminded of a bit of technofluff in the Canadian media a week or so back that claimed to present a radically pessimistic view of the next ten years. Of course it had as much in common with actual pessimism as lite beer has with a pint of good brown ale; the worst thing the author, one Douglas Coupland, is apparently able to imagine is that industrial society will keep on doing what it’s doing now—though the fact that more of the same apparently counts as radical pessimism these days is an interesting point, and one that deserves further discussion.
The detail of this particular Dystopia Lite that deserves attention here, though, is Coupland’s dogmatic insistence that “you can never go backward to a lessened state of connectedness.” That’s a common bit of rhetoric out of the mouths of tech geeks these days, to be sure, but it isn’t even remotely true. I know quite a few people who used to be active on social media and have dropped the habit. I know others who used to have allegedly smart phones and went back to ordinary cell phones, or even to a plain land line, because they found that the costs of excess connectedness outweighed the benefits. Technological downshifting is already a rising trend, and there are very good reasons for that fact.
Most people find out at some point in adolescence that there really is such a thing as drinking too much beer. I think a lot of people are slowly realizing that the same thing is true of connectedness, and of the other prominent features of today’s fashionable technologies. One of the data points that gives me confidence in that analysis is the way that people like Coupland angrily dismiss the possibility. Part of his display of soi-disant pessimism is the insistence that within a decade, people who don’t adopt the latest technologies will be dismissed as passive-aggressive control freaks. Now of course that label could be turned the other way just as easily, but the point I want to make here is that nobody gets that bent out of shape about behaviors that are mere theoretical possibilities. Clearly, Coupland and his geek friends are already contending with people who aren’t interested in conforming to the technosphere.
It’s not just geek technologies that are coming in for that kind of rejection, either. These days, in the town where I live, teenagers whose older siblings used to go hotdogging around in cars ten years ago are doing the same thing on bicycles today. Granted, I live in a down-at-the-heels old mill town in the north central Appalachians, but there’s more to it than that. For a lot of these kids, the costs of owning a car outweigh the benefits so drastically that cars aren’t cool any more. One consequence of that shift in cultural fashion is that these same kids aren’t contributing anything like so much to the buildup of carbon dioxide in the atmosphere, or to the other externalized costs generated by car ownership.
I’ve written here already about deliberate technological regression as a matter of public policy. Over the last few months, though, it’s become increasingly clear to me that deliberate technological regression as a matter of personal choice is also worth pursuing. Partly this is because the deathgrip of failed policies on the political and economic order of the industrial world, as mentioned earlier, is tight enough that any significant change these days has to start down here at the grassroots level, with individuals, families, and communities, if it’s going to get anywhere at all; partly, it’s because technological regression, like anything else that flies in the face of the media stereotypes of our time, needs the support of personal example in order to get a foothold; partly, it’s because older technologies, being less vulnerable to the impacts of whole-system disruptions, will still be there meeting human needs when the grid goes down, the economy freezes up, or something really does break the internet, and many of them will still be viable when the fossil fuel age is a matter for the history books.
Still, there’s another aspect, and it’s one that the essay by Douglas Coupland mentioned above managed to hit squarely: the high-tech utopia ballyhooed by the first generation or so of internet junkies has turned out in practice to be a good deal less idyllic, and in fact a good deal more dystopian, than its promoters claimed. All the wonderful things we were supposedly going to be able to do turned out in practice to consist of staring at little pictures on glass screens and pushing buttons, and these are not exactly the most interesting activities in the world, you know. The people who are dropping out of social media and ditching their allegedly smart phones for a less connected lifestyle have noticed this.
What’s more, a great many more people—the kids hotdogging on bikes here in Cumberland are among them—are weighing  the costs and benefits of complex technologies with cold eyes, and deciding that an older, simpler technology less dependent on global technosystems is not just more practical, but also, and importantly, more fun. True believers in the transhumanist cyberfuture will doubtless object to that last point, but the deathgrip of failed ideas on societies in decline isn’t limited to the senile elites mentioned toward the beginning of this post; it can also afflict the fashionable intellectuals of the day, and make them proclaim the imminent arrival of the future’s rising waters when the tide’s already turned and is flowing back out to sea.
I’d like to suggest, in fact, that it’s entirely possible that we could be heading toward a future in which people will roll their eyes when they think of Twitter, texting, 24/7 connectivity, and the rest of today’s overblown technofetishism—like, dude, all that stuff is so twenty-teens! Meanwhile, those of us who adopt the technologies and habits of earlier eras, whether that adoption is motivated by mere boredom with little glass screens or by some more serious set of motives, may actually be on the cutting edge: the early adopters of the Retro Future. We’ll talk about that more in the weeks ahead.

The Burden of Denial

Wed, 2015-04-08 16:29
It occurred to me the other day that quite a few of the odder features of contemporary American culture make perfect sense if you assume that everybody knows exactly what’s wrong and what’s coming as our society rushes, pedal to the metal, toward its face-first collision with the brick wall of the future. It’s not that they don’t get it; they get it all too clearly, and they just wish that those of us on the fringes would quit reminding them of the imminent impact, so they can spend whatever time they’ve got left in as close to a state of blissful indifference as they can possibly manage.  
I grant that this realization probably had a lot to do with the context in which it came to me. I was sitting in a restaurant, as it happens, with a vanload of fellow Freemasons. We’d carpooled down to Baltimore, some of us to receive one of the higher degrees of Masonry and the rest to help with the ritual work, and we stopped for dinner on the way back home. I’ll spare you the name of the place we went; it was one of those currently fashionable beer-and-burger joints where the waitresses have all been outfitted with skirts almost long enough to cover their underwear, bare midriffs, and the sort of push-up bras that made them look uncomfortably like inflatable dolls—an impression that their too obviously scripted jiggle-and-smile routines did nothing to dispel.
Still, that wasn’t the thing that made the restaurant memorable. It was the fact that every wall in the place had television screens on it. By this I don’t mean that there was one screen per wall; I mean that they were lined up side by side right next to each other, covering the upper part of every single wall in the place, so that you couldn’t raise your eyes above head level without looking at one. They were all over the interior partitions of the place, too. There must have been forty of them in one not too large restaurant, each one blaring something different into the thick air, while loud syrupy music spattered down on us from speakers on the ceiling and the waitresses smiled mirthlessly and went through their routines. My burger and fries were tolerably good, and two tall glasses of Guinness will do much to ameliorate even so charmless a situation; still, I was glad to get back on the road.
The thing I’d point out is that all this is quite recent. Not that many years ago, it was tolerably rare to see a TV screen in an American restaurant, and even those bars that had a television on the premises for the sake of football season generally had the grace to leave the thing off the rest of the time. Within the last decade, I’ve watched televisions sprout in restaurants and pubs I used to enjoy, for all the world like buboes on the body of a plague victim: first one screen, then several, then one on each wall, then metastasizing across the remaining space. Meanwhile, along the same lines, people who used to go to coffee shops and the like to read the papers, talk with other patrons, or do anything else you care to name are now sitting in the same coffee shops in total silence, hunched over their allegedly smart phones like so many scowling gargoyles on the walls of a medieval cathedral.
Yes, there were people in the restaurant crouched in the gargoyle pose over their allegedly smart phones, too, and that probably also had something to do with my realization that evening.  It so happens that the evening before my Baltimore trip, I’d recorded a podcast interview with Chris Martenson on his Peak Prosperity show, and he’d described to me a curious response he’d been fielding from people who attended his talks on the end of the industrial age and the unwelcome consequences thereof. He called it “the iPhone moment”—the point at which any number of people in the audience pulled that particular technological toy out of their jacket pockets and waved it at him, insisting that its mere existence somehow disproved everything he was saying.
You’ve got to admit, as modern superstitions go, this one is pretty spectacular. Let’s take a moment to look at it rationally. Do iPhones produce energy? Nope. Will they refill our rapidly depleting oil and gas wells, restock the ravaged oceans with fish, or restore the vanishing topsoil from the world’s fields? Of course not. Will they suck carbon dioxide from the sky, get rid of the vast mats of floating plastic that clog the seas, or do something about the steadily increasing stockpiles of nuclear waste that are going to sicken and kill people for the next quarter of a million years unless the waste gets put someplace safe—if there is anywhere safe to put it at all? Not a chance. As a response to any of the predicaments that are driving the crisis of our age, iPhones are at best irrelevant. Since they consume energy and resources, and the sprawling technosystems that make them function consume energy and resources at a rate orders of magnitude greater, they’re part of the problem, not any sort of a solution.
Now of course the people waving their iPhones at Chris Martenson aren’t thinking about any of these things. A good case could be made that they’re not actually thinking at all. Their reasoning, if you want to call it that, seems to be that the existence of iPhones proves that progress is still happening, and this in turn somehow proves that progress will inevitably bail us out from the impacts of every one of the predicaments we face. To call this magical thinking is an insult to honest sorcerers; rather, it’s another example of the arbitrary linkage of verbal noises to emotional reactions that all too often passes for thinking in today’s America. Readers of classic science fiction may find all this weirdly reminiscent of a scene from some edgily updated version of H.G. Wells’ The Island of Doctor Moreau: “Not to doubt Progress: that is the Law. Are we not Men?”
Seen from a certain perspective, though, there’s a definite if unmentionable logic to “the iPhone moment,” and it has much in common with the metastatic spread of television screens across pubs and restaurants in recent years. These allegedly smart phones don’t do anything to fix the rising spiral of problems besetting industrial civilization, but they make it easier for people to distract themselves from those problems for a little while longer. That, I’d like to suggest, is also what’s driving the metastasis of television screens in the places that people used to go to enjoy a meal, a beer, or a cup of coffee and each other’s company. These days, the latter’s too risky; somebody might mention a friend who lost his job and can’t get another one, a spouse who gets sicker with each overpriced prescription the medical industry pushes on her, a kid who didn’t come back from Afghanistan, or the like, and then it’s right back to the reality that everyone’s trying to avoid. It’s much easier to sit there in silence staring at little colored pictures on a glass screen, from which all such troubles have been excluded.
Of course that habit has its own downsides. To begin with, those who are busy staring at the screens have to know, on some level, that sooner or later it’s going to be their turn to lose their jobs, or have their health permanently wrecked by the side effects their doctors didn’t get around to telling them about, or have their kids fail to come back from whatever America’s war du jour happens to be just then, or the like. That’s why so many people these days put so much effort into insisting as loudly as possible that the poor and vulnerable are to blame for their plight. The people who say this know perfectly well that it’s not true, but repeating such claims over and over again is the only defense they’ve got against the bitter awareness that their jobs, their health, and their lives or those of the people they care about could all too easily be next on the chopping block.
What makes this all the more difficult for most Americans to face is that none of these events are happening in a vacuum.  They’re part of a broader process, the decline and fall of modern industrial society in general and the United States of America in particular. Outside the narrowing circles of the well-to-do, standards of living for most Americans have been declining since the 1970s, along with standards of education, public health, and most of the other things that make for a prosperous and stable society. Today, a nation that once put human bootprints on the Moon can’t afford to maintain its roads and bridges or keep its cities from falling into ruin. Hiding from that reality in an imaginary world projected onto glass screens may be comforting in the short term; the mere fact that realities don’t go away just because they’re ignored does nothing to make this choice any less tempting.
What’s more, the world into which that broader process of decline is bringing us is not one in which staring at little colored pictures on a glass screen will count for much. Quite the contrary, it promises to be a world in which raw survival, among other things, will depend on having achieved at least a basic mastery of one or more of a very different range of skills. There’s no particular mystery about those latter skills; they were, in point of fact, the standard set of basic human survival skills for thousands of years before those glass screens were invented, and they’ll still be in common use when the last of the glass screens has weathered away into sand; but they have to be learned and practiced before they’re needed, and there may not be all that much time left to learn and practice them before hard necessity comes knocking at the door.
I think a great many people who claim that everything’s fine are perfectly aware of all this. They know what the score is; it’s doing something about it that’s the difficulty, because taking meaningful action at this very late stage of the game runs headlong into at least two massive obstacles. One of them is practical in nature, the other psychological, and human nature being what it is, the psychological dimension is far and away the most difficult of the two.
Let’s deal with the practicalities first. The non-negotiable foundation of any meaningful response to the crisis of our time, as I’ve pointed out more than once here, can be summed up conveniently with the acronym L.E.S.S.—that is, Less Energy, Stuff, and Stimulation. We are all going to have much less of these things at our disposal in the future.  Using less of them now frees up time, money, and other resources that can be used to get ready for the inevitable transformations. It also makes for decreased dependence on systems and resources that in many cases are already beginning to fail, and in any case will not be there indefinitely in a future of hard limits and inevitable scarcities.
On the other hand, using L.E.S.S. flies in the face of two powerful forces in contemporary culture. The first is the ongoing barrage of advertising meant to convince people that they can’t possibly be happy without the latest time-, energy-, and resource-wasting trinket that corporate interests want to push on them. The second is the stark shivering terror that seizes most Americans at the thought that anybody might think that they’re poorer than they actually are. Americans like to think of themselves as proud individualists, but like so many elements of the American self-image, that’s an absurd fiction; these days, as a rule, Americans are meek conformists who shudder with horror at the thought that they might be caught straying in the least particular from whatever other people expect of them.
That’s what lies behind the horrified response that comes up the moment someone suggests that using L.E.S.S. might be a meaningful part of our response to the crises of our age. When people go around insisting that not buying into the latest overhyped and overpriced lump of technogarbage is tantamount to going back to the caves—and yes, I field such claims quite regularly—you can tell that what’s going on in their minds has nothing to do with the realities of the situation and everything to do with stark unreasoning fear. Point out that a mere thirty years ago, people got along just fine without email and the internet, and you’re likely to get an even more frantic and abusive reaction, precisely because your listener knows you’re right and can’t deal with the implications.
This is where we get into the psychological dimension. What James Howard Kunstler has usefully termed the psychology of previous investment is a massive cultural force in today’s America. The predicaments we face today are in very large part the product of a long series of really bad decisions that were made over the last four decades or so. Most Americans, even those who had little to do with making those decisions, enthusiastically applauded them, and treated those who didn’t with no small amount of abuse and contempt. Admitting just how misguided those decisions turned out to be thus requires a willingness to eat crow that isn’t exactly common among Americans these days. Thus there’s a strong temptation to double down on the bad decisions, wave those iPhones in the air, and put a few more television screens on the walls to keep the cognitive dissonance at bay for a little while longer.
That temptation isn’t an abstract thing. It rises out of the raw emotional anguish woven throughout America’s attempt to avoid looking at the future it’s made for itself. The intensity of that anguish can be measured most precisely, I think, in one small but telling point: the number of people whose final response to the lengthening shadow of the future is, “I hope I’ll be dead before it happens.”
Think about those words for a moment. It used to be absolutely standard, and not only in America, for people of every social class below the very rich to work hard, save money, and do without so that their children could have a better life than they had. That parents could say to their own children, “I got mine, Jack; too bad your lives are going to suck,” belonged in the pages of lurid dime novels, not in everyday life. Yet that’s exactly what the words “I hope I’ll be dead before it happens” imply.  The destiny that’s overtaking the industrial world isn’t something imposed from outside; it’s not an act of God or nature or callous fate; rather, it’s unfolding with mathematical exactness from the behavior of those who benefit from the existing order of things.  It could be ameliorated significantly if those same beneficiaries were to let go of the absurd extravagance that characterizes what passes for a normal life in the modern industrial world these days—it’s just that the act of letting go involves an emotional price that few people are willing to pay.
Thus I don’t think that anyone says “I hope I’ll be dead before it happens” lightly. I don’t think the people who are consigning their own children and grandchildren to a ghastly future, and placing their last scrap of hope on the prospect that they themselves won’t live to see that future arrive, are making that choice out of heartlessness or malice. The frantic concentration on glass screens, the bizarre attempts to banish unwelcome realities by waving iPhones in their faces, and the other weird behavior patterns that surround American society’s nonresponse to its impending future, are signs of the enormous strain that so many Americans these days are under as they try to keep pretending that nothing is wrong in the teeth of the facts.
Denying a reality that’s staring you in the face is an immensely stressful process, and the stress gets worse as the number of things that have to be excluded from awareness mounts up. These days, that list is getting increasingly long. Look away from the pictures on the glass screens, and the United States is visibly a nation in rapid decline: its cities collapsing, its infrastructure succumbing to decades of malign neglect, its politics mired in corruption and permanent gridlock, its society frayed to breaking, and the natural systems that support its existence passing one tipping point after another and lurching through chaotic transitions.
Oklahoma has passed California as the most seismically active state in the Union as countless gallons of fracking fluid pumped into deep disposal wells remind us that nothing ever really “goes away.” It’s no wonder that so many shrill voices these days are insisting that nothing is wrong, or that it’s all the fault of some scapegoat or other, or that Jesus or the Space Brothers or somebody will bail us out any day now, or that we’re all going to be wiped out shortly by some colorful Hollywood cataclysm that, please note, is never our fault.
There is, of course, another option.
Over the years since this blog first began to attract an audience, I’ve spoken to quite a few people who broke themselves out of that trap, or were popped out of it willy-nilly by some moment of experience just that little bit too forceful to yield to the exclusionary pressure; many of them have talked about how the initial burst of terror—no, no, you can’t say that, you can’t think that!—gave way to an immense feeling of release and freedom, as the burden of keeping up the pretense dropped away and left them able to face the world in front of them at last.
I suspect, for what it’s worth, that a great many more people are going to be passing through that transformative experience in the years immediately ahead. A majority? Almost certainly not; to judge by historical precedents, the worse things get, the more effort will go into the pretense that nothing is wrong at all, and the majority will cling like grim death to that pretense until it drags them under. That said, a substantial minority might make a different choice: to let go of the burden of denial soon enough to matter, to let themselves plunge through those moments of terror and freedom, and to haul themselves up, shaken but alive, onto the unfamiliar shores of the future.
When they get there, there will be plenty of work for them to do. I’ve discussed some of the options in previous posts on this blog, but there’s at least one that hasn’t gotten a detailed examination yet, and it’s one that I’ve come to think may be of crucial importance in the decades ahead. We’ll talk about that next week.

Atlantis Won't Sink, Experts Agree

Wed, 2015-04-01 17:32
If you’re like most Atlanteans these days, you’ve heard all sorts of unnerving claims about the future of our continent. Some people are even saying that recent earth tremors are harbingers of a cataclysm that will plunge Atlantis to the bottom of the sea. Those old prophecies from the sacred scrolls of the Sun Temple have had the dust blown off them again, adding to the stew of rumors.
So is there anything to it? Should you be worried about the future of Atlantis?
Not according to the experts. I visited some of the most widely respected hierarchs here in the City of the Golden Gates yesterday to ask them about the rumors, and they assured me that there’s no reason to take the latest round of alarmist claims at all seriously.
***

My first stop was the temple complex of black orichalcum just outside the Palace of the Ten Kings, where Nacil Buper, Grand Priestess of the Temple of Night, took time out of her busy schedule to meet with me. I asked her what she thought about the rumors of imminent catastrophe. “Complete and utter nonsense,” she replied briskly. “There are always people who want to insist that the end is nigh, and they can always find something to use to justify that sort of thing. Remember a few years ago, when everyone was running around insisting that the end of the Forty-First Grand Cycle of Time was going to bring the destruction of the world? This is more of the same silliness.”
Just at that moment, the floor shook beneath us, and I asked her about the earth tremors, pointing out that those seem to be more frequent than they were just a few years back.
“Atlantis has always had earthquakes,” the Grand Priestess reminded me, gesturing with her scepter of human bone.  “There are natural cycles affecting their frequency, and there’s no proof that they’re more frequent because of anything human beings are doing. In fact, I’m far from convinced that they’re any more frequent than they used to be. There are serious questions about whether the priests of the Sun Temple have been fiddling with their data, you know.”
“And the claim from those old prophecies that offering human sacrifices to Mu-Elortep, Lord of Evil, might have something to do with it?” I asked. 
“That’s the most outrageous kind of nonsense,” the Grand Priestess replied. “Atlanteans have been worshipping the Lord of Evil for more than a century and a half. It’s one of the foundations of our society and our way of life, and we should be increasing the number of offerings to Mu-Elortep as rapidly as we can, not listening to crazies from the fringe who insist that there’s something wrong with slaughtering people for the greater glory of the Lord of Evil. We can’t do without Mu-Elortep, not if we’re going to restore Atlantis to full prosperity and its rightful place in the world order, and if that means sacrifices have to be made—and it does—then sacrifices need to be made.”
She leaned forward confidentially, and her necklace of infant’s skulls rattled. “You know as well as I do that all this is just another attempt by the Priests of the Sun to dodge their responsibility for their own bad policies. Nobody would care in the least about all these crazy rumors of imminent doom if the Sun Priest Erogla hadn’t made such a fuss about the old prophecies in the scrolls of the Sun Temple a few years back. The Sun Temple’s the real problem we face. Fortunately, though, we of the Temple of Night have a majority in the Council of the Ten Kings now. We’re working on legislation right now to eradicate poverty in Atlantis by offering up the poor to Mu-Elortep in one grand bonfire. Once that’s done, I’m convinced, Atlantis will be on the road to a full recovery.”
***

After my conversation with the Grand Priestess, I went uphill to the foot of the Sacred Mountain, where the Sun Temple rises above the golden-roofed palaces of the Patricians of Atlantis. I had made an appointment to see Tarc Omed, the Hierophant of the Priests of the Sun; he met me in his private chamber, and had his servants pour us purple wine from Valusia as we talked.
“I know the kind of thing you must have heard from the Temple of Night,” the Hierophant said wearily. “It’s all our fault the economy’s in trouble. Everything’s our fault. That’s how they avoid responsibility for the consequences of the policies they’ve been pursuing for decades now.”
I asked him what he thought of Nacil Buper’s claim that offering up the poor as human sacrifices would solve all the problems Atlantis faces these days.
“Look,” he said, “everybody knows that we’ve got to wean ourselves off making human sacrifices to the Lord of Evil one of these days. There’s no way we can keep that up indefinitely, and it’s already causing measurable problems. That’s why we’re proposing increased funding for more sustainable forms of worship directed toward other deities, so we can move step by step to a society that doesn’t have to engage in human sacrifice or deal with Mu-Elortep at all.”
And the ground tremors? Do they have anything to do with the sacrifices?
“That’s a good question. It’s hard to say whether any particular burst of tremors is being caused by the prophesied curse, you know, but that’s no reason for complacency.”
A tremor shook the room, and we both steadied our golden goblets of wine on the table. “Doesn’t that lend support to the rumors that Atlantis might sink soon?” I asked.
Tarc Omed looked weary again, and leaned back in his great chair of gold and ivory. “We have to be realistic,” he said. “Right now, Atlantean society depends on human sacrifice, and transitioning away from that isn’t something we can do overnight. We need to get those more sustainable forms of worship up and running first, and that can’t be done without negotiated compromises and the support of as many stakeholders as possible. Alarmism doesn’t further that.”
I thought of one of the things Nacil Buper had said. “But aren’t the prophecies of doom we’re discussing right there in the sacred scrolls of the Sun Temple?”
“We don’t consider that relevant just now,” the Hierophant told me firmly. “What matters right at the moment is to build a coalition strong enough to take back a majority in the Council of the Ten Kings, stop the Temple of Night’s crazy plan to sacrifice all of the poor to Mu-Elortep, and make sure that human sacrifices are conducted in as painless and sanitary a fashion as possible and increased only at the rate that’s really necessary, while we work toward phasing out human sacrifice altogether. Of course we can’t continue on our current path, but I have faith that Atlanteans can and will work together to stop any sort of worst-case scenario from happening.”
***

From the Temple of the Sun I walked out of the patrician district, into one of the working class neighborhoods overlooking the Old Harbor. The ground shook beneath my feet a couple of times as I went. People working in the taverns and shops looked up at the Sacred Mountain each time, and then went back to their labor. It made me feel good to know that their confidence was shared by both the hierarchs I’d just interviewed.
I decided to do some person-in-the-street interviews for the sake of local color, and stepped into one of the taverns. Introducing myself to the patrons as a reporter, I asked what they thought about the rumors of disaster and the ongoing earth tremors.
“Oh, I’m sure the Priests of the Sun will think of something,” one patron said. I wrote that down on my wax tablet.
“Yeah,” agreed another. “How long have these prophecies been around? And Atlantis is still above water, isn’t it? I’m not worried.”
“I used to believe that stuff back in the day,” said a third patron. “You know, you buy into all kinds of silly things when you’re young and gullible, then grow out of it once it’s time to settle down and deal with the real world.  I sure did.”
That got nods and murmurs of approval all around. “I honestly think a lot of the people who are spreading these rumors actually want Atlantis to sink,” the third patron went on. “All this obsessing about those old prophecies and how awful human sacrifice is—I mean, can we get real, please?”
“You can say that again,” said the second patron. “I bet they do want Atlantis to sink. I bet they’re actually Lemurian sympathizers.”
The third patron turned to look at him.  “You know, that would make a lot of sense—”
Just then another tremor, a really strong one, shook the tavern. The whole room went dead silent for a moment. As the tremor died down, everybody started talking loudly all at once. I said my goodbyes and headed for the door.
As I stopped outside to put my wax tablet into the scribe’s case on my belt, one of the other patrons—a woman who hadn’t said anything—came through the door after me. “If you’re looking for a different point of view,” she told me, “you ought to go down to the Sea Temple. They’ll give you an earful.”
I thanked her, and started downhill toward the Old Harbor.
***
I’d never been to the Sea Temple before; I don’t think most Atlanteans ever go there, though it’s been right there next to the Old Harbor since time out of mind. When I got there, the big doors facing the harbor were wide open, but the place seemed empty; the only sounds were the flapping of the big blue banners above the temple and the cries of sea birds up overhead.
As another tremor rattled the city, I walked in through the open doors. I didn’t see anyone at first, but after a few moments a woman in the blue robes of a Sea Priestess came out of the sanctuary further inside and hurried toward me. She had a basket of scrolls in her arms.
I introduced myself, explained that I was a journalist, and asked if she minded answering some questions.
“Not if you don’t mind walking with me to the harbor,” she said. “I’m in a bit of a hurry.”
“Sure,” I told her. “So what do you think about all these scary rumors? Do you really think Atlantis could end up underwater?”
We left the temple and started across the plaza outside, toward the harbor. “Have you read the prophecies of Emor Fobulc?” she asked me.
“Can’t say I have.”
“They predicted everything that’s happened: the rise of the cult of Mu-Elortep, the sacrifices, the earth tremors, and now the Sign.”
“The what?”
“When’s the last time you looked at the top of the Sacred Mountain?”
I stopped and looked right then. There was a plume of smoke rising from the great rounded peak. After a moment, I hurried to catch up to her.
“That’s the Sign,” she told me. “It means that the fires of Under-Earth have awakened and Atlantis will soon be destroyed.”
“Seriously?”
“Seriously.”
I thought about it for a moment as we walked, and the ground shook beneath our feet. “There could be plenty of other explanations for that smoke, you know.”
The priestess looked at me for a long moment. “No doubt,” she said dryly. 
By then we were near the edge of the quay, and half a dozen people came hurrying down the gangplank from a ship that was tied up there, an old-fashioned sailing vessel with a single mast and the prow carved to look like a swan. One of them, a younger priestess, bowed, took the basket of scrolls, and hurried back on board the ship. Another, who was dressed like a mariner, bowed too, and said to the priestess I’d spoken with, “Is there anything else, Great Lady?”
“Nothing,” she said. “We should go.” She turned to me. “You may come with us if you wish.”
“I need to have this story back to the pressroom before things shut down this afternoon,” I told her. “Are you going to be coming back within two hours or so?”
I got another of her long silent looks. “No,” she said. “We’ll be much longer than that.”
“Sorry, then—I hate to turn down a cruise, but work is work.”
She didn’t have anything to say to that, and the others more or less bundled her up the gangplank onto the ship. A couple of sailors untied the cables holding the ship against the quay and then climbed on board before it drifted away. A few minutes later the ship was pulling out into the Old Harbor; I could hear the oarsmen belowdecks singing one of their chanteys while the sailors climbed aloft and got the sail unfurled and set to the breeze.
After a few more minutes, I turned and started back up the hill toward the middle of town. As I climbed the slope, I could see more and more of the City of the Golden Gates around me in the afternoon sun: the Palace of the Ten Kings with the Temple of Night beside it, the Sun Temple and the golden roofs of the patricians’ palaces higher up the slope. The ground was shaking pretty much nonstop, but I barely noticed it, I’d gotten so used to the tremors.
The view got better as I climbed. Below, the Old Harbor spread out to one side and the New Harbor to the other. Next to the New Harbor was the charnel ground of Elah-Slio, where smoke was rising from the altars and long lines of victims were being driven forward with whips to be offered up as sacrifices to Mu-Elortep; off the other way, beyond the Old Harbor, I spotted twenty or so sails in the middle distance, heading away from Atlantis, and the ship with the priestess on it hurrying to join them.
That’s when it occurred to me that the Sea Priestess couldn’t have been serious when she said that Atlantis would soon be destroyed. Surely, if the prophecies were true, the Sea Priestesses would have had more important things to do than go on some kind of long vacation cruise. I laughed at how gullible I’d been there for a moment, and kept climbing the hill into the sunlight.
Above the Sacred Mountain, the cloud of smoke had gotten much bigger, and it looked as though some kind of red glow was reflecting off the bottom of it. I wondered what that meant, but figured I’d find out from the news soon enough. It certainly made me feel good to know that there was no reason whatever to worry about the far-fetched notion that Atlantis might end up at the bottom of the sea.

(Note: due to a date-linked transtemporal anomaly, this week’s planned Archdruid Report post got switched with a passage from the Swenyliad, an Atlantean chronicle dating from 9613 BCE. We apologize for any inconvenience.)

Planet of the Space Bats

Wed, 2015-03-25 17:16
As my regular readers know, I’ve been talking for quite a while now here about the speculative bubble that’s built up around the fracking phenomenon, and the catastrophic bust that’s guaranteed to follow so vast and delusional a boom. Over the past six months or so, I’ve noted the arrival of one warning sign after another of the impending crash. As the saying has it, though, it’s not over ‘til the fat lady sings, so I’ve been listening for the first notes of the metaphorical aria that, in the best Wagnerian style, will rise above the orchestral score as the fracking industry’s surrogate Valhalla finally bursts into flames and goes crashing down into the Rhine.
 
I think I just heard those first high notes, though, in an improbable place: the email inbox of the Ancient Order of Druids in America (AODA), the Druid order I head.
I have no idea how many of my readers know the first thing about my unpaid day job as chief executive—the official title is Grand Archdruid—of one of the two dozen or so Druid orders in the western world. Most of what goes into that job, and the admittedly eccentric minority religious tradition behind it, has no relevance to the present subject. Still, I think most people know that Druids revere the natural world, and take ecology seriously even when that requires scrapping some of the absurd extravagances that pass for a normal lifestyle these days. Thus a Druid order is arguably the last place that would come to mind if you wanted to sell stock in a fracking company.
Nonetheless, that’s what happened. The bemused AODA office staff the other day fielded a solicitation from a stock firm trying to get Druids to invest their assets in the fracking industry.
Does that sound like a desperation move to you, dear reader? It certainly does to me—and there’s good reason to think that it probably sounds that way to the people who are trying to sell shares in fracking firms to one final round of clueless chumps, too. A recent piece in the Wall Street Journal (available outside the paywall here) noted that American banks have suddenly found themselves stuck with tens of millions of dollars’ worth of loans to fracking firms which they hoped to package up and sell to investors—but suddenly nobody’s buying. Bankruptcies and mass layoffs are becoming an everyday occurrence in the fracking industry, and the price of oil continues to lurch down as producers maximize production for the sake of immediate cash flow.
Why, though, isn’t the drop in the price of oil being met by an upsurge in consumption that drives the price back up, as the accepted rules of economics would predict? That’s the cream of the jest. Here in America, and to a lesser extent elsewhere in the industrial world, four decades of enthusiastically bipartisan policies that benefited the rich at everyone else’s expense managed to prove Henry Ford’s famous argument: if you don’t pay your own employees enough that they can afford to buy your products, sooner or later, you’re going to go broke.
By driving down wages and forcing an ever larger fraction of the US population into permanent unemployment and poverty, the movers and shakers of America’s political class have managed to trigger a classic crisis of overproduction, in which goods go begging for buyers because too few people can afford to buy them at any price that will pay for their production. It’s not just oil that’s affected, either: scores of other commodities are plunging in price as the global economy tips over into depression. There’s a specter haunting the industrial world; it’s the ghost of Karl Marx, laughing with mordant glee as the soi-disant masters of the universe, having crushed his misbegotten Soviet stepchildren, go all out to make his prophecy of capitalism’s self-immolation look remarkably prescient.
The soaring price of crude oil in the wake of the 2005 global peak of conventional oil production should have served notice to the industrial world that, to adapt the title of Richard Heinberg’s excellent 2003 summary of the situation, the party was over:  the long era in which energy supplies had increased year over year was giving way to an unwelcome new reality in which decreasing energy supplies and increasing environmental blowback were the defining themes. As my readers doubtless noticed, though, the only people willing to grasp that were out here on the fringes where archdruids lurk. Closer to the mainstream of our collective thinking, most people scrunched shut their eyes, plugged their ears with their fingers, and shouted “La, la, la, I can’t hear you” at the top of their lungs, in a desperate attempt to keep reality from getting a word in edgewise.
For the last five years or so, any attempt to talk about the impending twilight of the age of oil thus ran headfirst into a flurry of pro-fracking propaganda. Fatuous twaddle about America’s inevitable future as the world’s new energy superpower took the place of serious discussions of the predicament into which we’ve backed ourselves—and not for the first time, either. That’s what makes the attempt to get Druids to invest their life savings in fracking so funny, in a bleak sort of way: it’s an attempt to do for the fracking boom what the fracking boom attempted to do for industrial civilization as a whole—to pretend, in the teeth of the facts, that the unsustainable can be sustained for just a little while longer.
A few months back, I decided to celebrate this sort of thinking by way of the grand old Druid custom of satire. The Great Squirrel Case Challenge of 2015 solicited mock proposals for solving the world’s energy problems that were even nuttier than the ones in the mainstream media. That was no small challenge—a detail some of my readers pointed up by forwarding any number of clueless stories from the mainstream media loudly praising energy boondoggles of one kind or another.
I’m delighted to say, though, that the response was even better than I’d hoped for.  The contest fielded more than thirty entries, ranging from the merely very good to the sidesplittingly funny. There were two winners, one chosen by the members of the Green Wizards forum, one chosen by me; in both cases, it was no easy choice, and if I had enough author’s copies of my new book After Progress, I’d probably just up and give prizes to all the entries, they were that good. Still, it’s my honor to announce the winners:
My choice for best squirrel case—drumroll, please—goes to Steve Morgan, for his fine gosh-wow sales prospectus for, ahem, Shares of Hydrocarbons Imported from Titan. The Green Wizards forum choice—drumroll again—goes to Jason Heppenstall for his hilarious parody of a sycophantic media story, King Solomon’s Miners. Please join me in congratulating them. (Steve and Jason, drop me a comment with your mailing addresses, marked not for posting, and I’ll get your prizes on the way.)
Their hard-won triumph probably won’t last long. In the months and years ahead, I expect to see claims even more ludicrous being taken oh-so-seriously by the mainstream media, because the alternative is to face up to just how badly we’ve bungled the opportunities of the last four decades or so and just how rough a road we have ahead of us as a result. What gave the fracking bubble whatever plausibility it ever had, after all, was the way it fed on one of the faith-based credos at the heart of contemporary popular culture: the insistence, as pervasive as it is irrational, that the universe is somehow obligated to hand us abundant new energy sources to replace the ones we’ve already used so profligately. Lacking that blind faith, it would have been obvious to everyone—as it was to those of us in the peak oil community—that the fracking industry was scraping the bottom of the barrel and pretending that this proved the barrel was full.
Read the morning news with eyes freed from the deathgrip of the conventional wisdom and it’s brutally obvious that that’s what happened, and that the decline and fall of our civilization is well under way. Here in the US, a quarter of the country is in the fourth year of record drought, with snowpack on California’s Sierra Nevada mountains about 9% of normal; the Gulf Stream is slowing to a crawl due to the rapid melting of the Greenland ice sheets; permanent joblessness and grinding poverty have become pervasive in this country; the national infrastructure is coming apart after decades of malign neglect—well, I could go on; if you want to know what life is like in a falling civilization, go look out the window.
In the mainstream media, on the occasions when such things are mentioned at all, they’re treated as disconnected factoids irrelevant to the big picture. Most people haven’t yet grasped that these things are the big picture—that while we’re daydreaming about an assortment of shiny futures that look more or less like the present with more toys, climate change, resource depletion, collapsing infrastructure, economic contraction, and the implosion of political and cultural institutions are creating the future we’re going to inhabit. Too many of us suffer from a weird inability to imagine a future that isn’t simply a continuation of the present, even when such a future stands knocking at our own front doors.
So vast a failure of imagination can’t be overcome by the simple expedient of pointing out the ways that it’s already failed to explain the world in which we live. That said, there are other ways to break the grip of the conventional wisdom, and I’m pleased to say that one of those other ways seems to be making modest but definite headway just now.
Longtime readers here will remember that in 2011, this blog launched a contest for short stories about the kind of future we can actually expect—a future in which no deus ex machina saves industrial civilization from the exhaustion of its resource base, the deterioration of the natural systems that support it, and the normal process of decline and fall. That contest resulted in an anthology, After Oil: SF Stories of a Post-Petroleum Future, which found a surprisingly large audience. On the strength of its success, I ran a second contest in 2014, which resulted in two more volumes—After Oil 2: The Years of Crisis, which is now available, and After Oil 3: The Years of Rebirth, which is in preparation. Demand for the original volume has remained steady, and the second is selling well; after a conversation with the publisher, I’m pleased to announce that we’re going to do it again, with a slight twist.
The basic rules are mostly the same as before:
Stories should be between 2500 and 7500 words in length;
They should be entirely the work of their author or authors, and should not borrow characters or setting from someone else’s work;
They should be in English, with correct spelling, grammar and punctuation;
They should be stories—narratives with a plot and characters—and not simply a guided tour of some corner of the future as the author imagines it;
They should be set in our future, not in an alternate history or on some other planet;
They should be works of realistic fiction or science fiction, not magical or supernatural fantasy—that is, the setting and story should follow the laws of nature as those are presently understood;
They should take place in settings subject to thermodynamic, ecological, and economic limits to growth; and, as before,
They must not rely on “alien space bats”—that is, dei ex machina inserted to allow humanity to dodge the consequences of the limits to growth. (Aspiring authors might want to read the whole “Alien Space Bats” post for a more detailed explanation of what I mean here; reading the stories from one or both of the published After Oil volumes might also be a good plan.)
This time, though, I’m adding an additional rule:
Stories submitted for this contest must be set at least one thousand years in the future—that is, after March 25, 3015 in our calendar.
That’s partly a reflection of a common pattern in entries for the two previous contests, and partly something deeper. The common pattern? A great many authors submitted stories that were set during or immediately after the collapse of industrial civilization; there’s certainly room for those, enough so that the entire second volume is basically devoted to them, but tales of surviving decline and fall are only a small fraction of the galaxy of potential stories that would fit within the rules listed above.  I’d like to encourage entrants to consider telling something different, at least this time.
The deeper dimension? That’s a reflection of the blindness of the imagination discussed earlier in this post, the inability of so many people to think of a future that isn’t simply a prolongation of the present. Stories set in the immediate aftermath of our civilization don’t necessarily challenge that, and I think it’s high time to start talking about futures that are genuinely other—neither utopia nor oblivion, but different, radically different, from the linear extrapolations from the present that fill so many people’s imaginations these days, and have an embarrassingly large role even in science fiction.
You have to read SF from more than a few decades back to grasp just how tight the grip of a single linear vision of the future has become on what used to be a much more freewheeling literature of ideas. In book after book, and even more in film after film, technologies that are obviously derived from ours, ideologies that are indistinguishable from ours, political and economic arrangements that could pass for ours, and attitudes and ideas that belong to this or that side of today’s cultural struggles get projected onto the future as though they’re the only imaginable options. This takes place even when there’s very good reason to think that the linear continuation of current trends isn’t an option at all—for example, the endlessly regurgitated, done-to-death trope of interstellar travel.
Let us please be real:  we aren’t going to the stars—not in our lifetimes, not in the lifetime of industrial civilization, not in the lifetime of our species. There are equally good thermodynamic and economic reasons to believe that many of the other standard tropes of contemporary science fiction are just as unreachable—that, for example, limitless energy from gimmicks of the dilithium-crystal variety, artificial intelligences capable of human or superhuman thought, and the like belong to fantasy, not to the kind of science fiction that has any likelihood of becoming science fact. Any of my readers who want to insist that human beings can create anything they can imagine, by the way, are welcome to claim that, just as soon as they provide me with a working perpetual motion machine.
It’s surprisingly common to see people insist that the absence of the particular set of doodads common to today’s science fiction would condemn our descendants to a future of endless boredom. This attitude shows a bizarre stunting of the imagination—not least because stories about interstellar travel normally end up landing the protagonists in a world closely modeled on some past or present corner of the Earth. If our genus lasts as long as the average genus of vertebrate megafauna, we’ve got maybe ten million years ahead of us, or roughly two thousand times as long as all of recorded human history to date: more than enough time for human beings to come up with a dazzling assortment of creative, unexpected, radically different societies, technologies, and ways of facing the universe and themselves.
That’s what I’d like to see in submissions to this year’s Space Bats challenge—yes, it’ll be an annual thing from here on out, as long as the market for such stories remains lively. A thousand years from now, industrial civilization will be as far in the past as the Roman Empire was at the time of the Renaissance, and new human societies will have arisen to pass their own judgment on the relics of our age. Ten thousand years from now, or ten million? Those are also options. Fling yourself into the far future, far enough that today’s crises are matters for the history books, or tales out of ancient myth, or forgotten as completely as the crises and achievements of the Neanderthal people are today, and tell a story about human beings (or, potentially, post-human beings) confronting the challenges of their own time in their own way. Do it with verve and a good readable style, and your story may be one of the ones chosen to appear in the pages of After Oil 4:  The Future’s Distant Shores.
The mechanics are pretty much the same as before. Write your story and post it to the internet—if you don’t have a blog, you can get one for free from Blogspot or Wordpress. Post a link to it in the comments to The Archdruid Report. You can write more than one story, but please let me know which one you want entered in the competition—there will be only one entry accepted per author this time. Stories must be written and posted online, and a link posted to this blog, by August 30, 2015 to be eligible for inclusion in the anthology.

The View From Outside

Wed, 2015-03-18 17:25
Recently I’ve been reacquainting myself with the stories of Clark Ashton Smith. Though he’s largely forgotten today, Smith was one of the leading lights of Weird Tales magazine during its 1930s golden age, ranking with H.P. Lovecraft and Robert Howard as a craftsman of fantasy fiction. Like Lovecraft, Howard, and most of the other authors in the Weird Tales stable, Smith was an outsider; he spent his life in a small town in rural California, was roundly ignored by the literary scene of his day, and returned the favor with gusto. With the twilight of the pulps, Smith’s work was consigned to the dustbin of literary history.  It was revived briefly during the fantasy boom of the 1970s, only to sink from sight again when the fantasy genre drowned in a swamp of faux-medieval clichés thereafter.
There’s no shortage of reasons to give Smith another look today, starting with his mastery of image and atmosphere and the wry humor that shaped the best of his mature work. Still, that’s a theme for another time, and possibly another forum. The theme that’s relevant to this blog is woven into one of Smith’s classic stories, “The Dark Age.” First published in 1938, it’s among the earliest science fiction stories I know of that revolve around an organized attempt to preserve modern science through a future age of barbarism.
The story’s worth reading in its own right, so I won’t hand out spoilers here. Still, I don’t think it will give away anything crucial to mention that one of the mainsprings of the story is the inability of the story’s scientists to find or make common ground with the neo-barbarian hill tribes around them. That aspect of the story has been much on my mind of late. Despite the rockets and rayguns that provide so much of its local color, science fiction is always about the present, which it displays in an unfamiliar light by showing a view from outside, from the distant perspective of an imaginary future.
That’s certainly true of Smith’s tale, which drew much of its force at the time of its composition from the widening chasm between the sciences and the rest of human culture that C.P. Snow discussed two decades later in his famous work “The Two Cultures.” That chasm has opened up a good deal further since Smith’s time, and its impact on the future deserves discussion here, not least because it’s starting to come into sight even through the myopic lenses of today’s popular culture.
I’m thinking here, for example, of a recent blog post by Scott Adams, the creator of the “Dilbert” comic strip. There’s a certain poetic justice in seeing popular culture’s acknowledged expert on organizational failure skewer one of contemporary science’s more embarrassing habits, but there’s more to the spectacle than a Dilbertesque joke. As Adams points out, there’s an extreme mismatch between the way that science works and the way that scientists expect their claims to be received by the general public. Within the community of researchers, the conclusions of the moment are, at least in theory, open to constant challenge—but only from within the scientific community.
The general public is not invited to take part in those challenges. Quite the contrary, it’s supposed to treat the latest authoritative pronouncement as truth pure and simple, even when that contradicts the authoritative pronouncements of six months before. Now of course there are reasons why scientists might not want to field a constant stream of suggestions and challenges from people who don’t have training in relevant disciplines, but the fact remains that expecting people to blindly accept whatever scientists say about nutrition, when scientific opinion on that subject has been whirling around like a weathercock for decades now, is not a strategy with a long shelf life. Sooner or later people start asking why they should take the latest authoritative pronouncement seriously, when so many others landed in the trash can of discarded opinions a few years further on.
There’s another, darker reason why such questions are increasingly common just now. I’m thinking here of the recent revelation that the British scientists tasked by the government with making dietary recommendations have been taking payola of various kinds from the sugar industry.  That’s hardly a new thing these days. Especially but not only in those branches of science concerned with medicine, pharmacology, and nutrition, the prostitution of the scientific process by business interests has become an open scandal. When a scientist gets behind a podium and makes a statement about the safety or efficacy of a drug, a medical treatment, or what have you, the first question asked by an ever-increasing number of people outside the scientific community these days is “Who’s paying him?”
It would be bad enough if that question was being asked because of scurrilous rumors or hostile propaganda. Unfortunately, it’s being asked because there’s nothing particularly unusual about the behavior of the British scientists mentioned above. These days, in any field where science comes into contact with serious money, scientific studies are increasingly just another dimension of marketing. From influential researchers being paid to put their names on dubious studies to give them unearned credibility to the systematic concealment of “outlying” data that doesn’t support the claims made for this or that lucrative product, the corruption of science is an ongoing reality, and one that existing safeguards within the scientific community are not effectively countering.
Scientists have by and large treated the collapse in scientific ethics as an internal matter. That’s a lethal mistake, because the view that matters here is the view from outside. What looks to insiders like a manageable problem that will sort itself out in time looks from outside the laboratory and the faculty lounge like institutionalized corruption on the part of a self-proclaimed elite whose members cover for each other and are accountable to no one. It doesn’t matter, by the way, how inaccurate that view is in specific cases, how many honest men and women are laboring at lab benches, or how overwhelming the pressure to monetize research that’s brought to bear on scientists by university administrations and corporate sponsors: none of that finds its way into the view from outside, and in the long run, the view from outside is the one that counts.
The corruption of science by self-interest is an old story, and unfortunately it’s most intense in those fields where science impacts the lives of nonscientists most directly:  yes, those would be medicine, pharmacology, and nutrition. I mentioned in an earlier blog post here a friend whose lifelong asthma, which landed her in the hospital repeatedly and nearly killed her twice, was cured at once by removing a common allergen from her diet. Mentioning this to her physician led to the discovery that he’d known about the allergy issue all along, but as he explained, “We prefer to medicate for that.” Understandably so, as a patient who’s cured of an ailment is a good deal less lucrative for the doctor than one who has to keep on receiving regular treatments and prescriptions—but as a result of that interaction among others, the friend in question has lost most of what respect she once had for mainstream medicine, and is now learning herbalism to meet her health care needs.
It’s an increasingly common story these days, and I could add plenty of other accounts here. The point I want to make, though, is that it’s painfully obvious that the physician who preferred to medicate never thought about the view from outside. I have no way of knowing what combination of external pressures and personal failings led him to conceal a less costly cure from my friend, and keep her on expensive and ineffective drugs with a gallery of noxious side effects instead, but from outside the walls of the office, it certainly looked like a callous betrayal of whatever ethics the medical profession might still have left—and again, the view from outside is the one that counts.
It counts because institutional science only has the authority and prestige it possesses today because enough of those outside the scientific community accept its claim to speak the truth about nature. Not that many years ago, all things considered, scientists didn’t have that authority and prestige, and no law of nature or of society guarantees that they’ll keep either one indefinitely. Every doctor who would rather medicate than cure, every researcher who treats conflicts of interest as just another detail of business as usual, every scientist who insists in angry tones that nobody without a Ph.D. in this or that discipline is entitled to ask why this week’s pronouncement should be taken any more seriously than the one it just disproved—and let’s not even talk about the increasing, and increasingly public, problem of overt scientific fraud in the pharmaceutical field among others—is hastening the day when modern science is taken no more seriously by the general public than, say, academic philosophy is today.
That day may not be all that far away. That’s the message that should be read, and is far too rarely read, in the accelerating emergence of countercultures that reject the authority of science in one field or another. As a recent and thoughtful essay in Slate pointed out, that crisis of authority is what gives credibility to such movements as climate denialism and the “anti-vaxxers” (the growing number of parents who refuse to have their children vaccinated). A good many people these days, when the official voices of the scientific community say this or that, respond by asking “Why should we believe you?”—and too many of them don’t get a straightforward answer that addresses their concerns.
A bit of personal experience from a different field may be relevant here. Back in the late 1980s and early 1990s, when I lived in Seattle, I put a fair amount of time into collecting local folklore concerning ghosts and other paranormal phenomena. I wasn’t doing this out of any particular belief, or for that matter any particular unbelief; I was seeking a sense of the mythic terrain of the Puget Sound region, the landscapes of belief and imagination that emerged from the experiences of people on the land, with an eye toward the career writing fiction that I then hoped to launch. While I was doing this research, when something paranormal was reported anywhere in the region, I generally got to hear about it fairly quickly, and in the process I got to watch a remarkable sequence of events that repeated itself like a broken record in more cases than I can count.
Whether the phenomenon that was witnessed was an unusual light in the sky, a seven-foot-tall hairy biped in the woods, a visit from a relative who happened to be dead at the time, or what have you, two things followed promptly once the witness went public. The first was the arrival of a self-proclaimed skeptic, usually a member of CSICOP (the Committee for Scientific Investigation of Claims of the Paranormal), who treated the witness with scorn and condescension, made dogmatic claims about what must have happened, and responded to any disagreement with bullying and verbal abuse. The other thing that followed was the arrival of an investigator from one of the local paranormal-research organizations, who was invariably friendly and supportive, listened closely to the account of the witness, and took the incident seriously. I’ll let you guess which of the proposed explanations the witness usually ended up embracing, not to mention which organization he or she often joined.
The same process on a larger and far more dangerous scale is shaping attitudes toward science across a wide and growing sector of American society. Notice that unlike climate denialism, the anti-vaxxer movement isn’t powered by billions of dollars of grant money, but it’s getting increasing traction. The reason is as simple as it is painful: parents are asking physicians and scientists, “How do I know this substance you want to put into my child is safe?”—and the answers they’re getting are not providing them with the reassurance they need.
It’s probably necessary here to point out that I’m no fan of the anti-vaxxer movement. Since epidemic diseases are likely to play a massive role in the future ahead of us, I’ve looked into anti-vaxxer arguments with some care, and they don’t convince me at all. It’s clear from the evidence that vaccines do far more often than not provide protection against dangerous diseases; while some children are harmed by the side effects of vaccination, that’s true of every medical procedure, and the toll from side effects is orders of magnitude smaller than the annual burden of deaths from these same diseases in the pre-vaccination era.
Nor does the anti-vaxxer claim that vaccines cause autism hold water. (I have Asperger’s syndrome, so the subject’s of some personal interest to me.)  The epidemiology of autism spectrum disorders simply doesn’t support that claim; to my educated-layperson’s eyes, at least, it matches that of an autoimmune disease instead, complete with the rapid increase in prevalence in recent years. The hypothesis I’d be investigating now, if I’d gone into biomedical science rather than the history of ideas, is that autism spectrum disorders are sequelae of an autoimmune disease that strikes in infancy or early childhood, and causes damage to any of a variety of regions in the central nervous system—thus the baffling diversity of neurological deficits found in those of us on the autism spectrum.
Whether that’s true or not will have to be left to trained researchers. The point that I want to make here is that I don’t share the beliefs that drive the anti-vaxxer movement. Similarly, I’m sufficiently familiar with the laws of thermodynamics and the chemistry of the atmosphere to know that when the climate denialists insist that dumping billions of tons of carbon dioxide into the atmosphere can’t change its capacity to retain heat, they’re smoking their shorts.  I’ve retained enough of a childhood interest in paleontology, and studied enough of biology and genetics since then, to be able to follow the debates between evolutionary biology and so-called “creation science,” and I’m solidly on Darwin’s side of the bleachers. I could go on; I have my doubts about a few corners of contemporary scientific theory, but then so do plenty of scientists.
That is to say, I don’t agree with the anti-vaxxers, the climate denialists, the creationists, or their equivalents, but I think I understand why they’ve rejected the authority of science, and it’s not because they’re ignorant cretins, much as the proponents and propagandists of science would like to claim that. It’s because they’ve seen far too much of the view from outside. Parents who encounter a medical industry that would rather medicate than heal are more likely to listen to anti-vaxxers; Americans who watch climate change activists demand that the rest of the world cut its carbon footprint, while the activists themselves get to keep cozy middle-class lifestyles, are more likely to believe that global warming is a politically motivated hoax; Christians who see atheists using evolution as a stalking horse for their ideology are more likely to turn to creation science—and all three, and others, are not going to listen to scientists who insist that they’re wrong, until and unless the scientists stop and take a good hard look at how they and their proclamations look when viewed from outside.
I’m far from sure that anybody in the scientific community is willing to take that hard look. It’s possible; these days, even committed atheists are starting to notice that whenever Richard Dawkins opens his mouth, twenty people who were considering atheism decide to give God a second chance. The arrogant bullying that used to be standard practice among the self-proclaimed skeptics and “angry atheists” has taken on a sullen and defensive tone recently, as though it’s started to sink in that yelling abuse at people who disagree with you might not be the best way to win their hearts and minds. Still, for that same act of reflection to get any traction in the scientific community, a great many people in that community are going to have to rethink the way they deal with the public, especially when science, technology, and medicine cause harm. That, in turn, is only going to happen if enough of today’s scientists remember the importance of the view from outside.
In the light of the other issues I’ve tried to discuss over the years in this blog, that view has another dimension, and it’s a considerably harsher one. Among the outsiders whose opinion of contemporary science matters most are some that haven’t been born yet: our descendants, who will inhabit a world shaped by science and the technologies that have resulted from scientific research. It’s still popular to insist that their world will be a Star Trek fantasy of limitless abundance splashed across the galaxy, but I think most people are starting to realize just how unlikely that future actually is.
Instead, the most likely futures for our descendants are those in which the burdens left behind by today’s science and technology are much more significant than the benefits.  Those most likely futures will be battered by unstable climate and rising oceans due to anthropogenic climate change, stripped of most of the world's topsoil, natural resources, and ecosystems, strewn with the radioactive and chemical trash that our era produced in such abundance and couldn’t be bothered to store safely—and most of today’s advanced technologies will have long since rusted into uselessness, because the cheap abundant energy and other nonrenewable resources that were needed to keep them running all got used up in our time.
People living in such a future aren’t likely to remember that a modest number of scientists signed petitions and wrote position papers protesting some of these things. They’re even less likely to recall the utopian daydreams of perpetual progress and limitless abundance that encouraged so many other people in the scientific community to tell themselves that these things didn’t really matter—and if by chance they do remember those daydreams, their reaction to them won’t be pretty. That science today, like every other human institution in every age, combines high ideals and petty motives in the usual proportions will not matter to them in the least.
Unless something changes sharply very soon, their view from outside may well see modern science—all of it, from the first gray dawn of the scientific revolution straight through to the flamelit midnight when the last laboratory was sacked and burned by a furious mob—as a wicked dabbling in accursed powers that eventually brought down just retribution upon a corrupt and arrogant age. So long as the proponents and propagandists of science ignore the view from outside, and blind themselves to the ways that their own defense of science is feeding the forces that are rising against it, the bleak conclusion of the Clark Ashton Smith story cited at the beginning of this post may yet turn out to be far more prophetic than the comfortable fantasies of perpetual scientific advancement cherished by so many people today.
********
On a less bleak but not wholly unrelated subject, I’m pleased to announce that my forthcoming book After Progress is rolling off the printing press as I write this. There were a few production delays, and so it’ll be next month before orders from the publisher start being shipped; the upside to this is that the book can still be purchased for 20% off the cover price. I’m pretty sure that this book will offend people straight across the spectrum of acceptable opinion in today’s industrial society, so get your copy now, pop some popcorn, and get ready to enjoy the show.

The Prosthetic Imagination

Wed, 2015-03-11 19:16
Two news stories and an op-ed piece in the media in recent days provide a useful introduction to the theme of this week’s post here on The Archdruid Report. The first news story followed the announcement that the official unemployment rate here in the United States dropped to 5.5% last month. This was immediately hailed by pundits and politicians as proof that the recession we weren’t in is over at last, and the happy days that never went away are finally here again.
This jubilation makes perfect sense so long as you don’t happen to know that the official unemployment rate in the United States doesn’t actually depend on the number of people who are out of work. What it indicates is the percentage of US residents who happen to be receiving unemployment benefits—which, as I think most people know at this point, run out after a certain period. Right now there are a huge number of Americans who exhausted their unemployment benefits a long time ago, can’t find work, and would count as unemployed by any measure except the one used by the US government these days.  As far as officialdom is concerned, they are nonpersons in very nearly an Orwellian sense, their existence erased to preserve a politically expedient fiction of prosperity.
How many of these economic nonpersons are there in the United States today? That figure’s not easy to find amid the billowing statistical smokescreens. Still, it’s worth noting that 92,898,000 Americans of working age are not currently in the work force—that is, more than 37 per cent of the working age population. If you spend time around people who don’t belong to this nation’s privileged classes, you already know that a lot of those people would gladly take jobs if there were jobs to be had, but again, that’s not something that makes it through the murk.
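The arithmetic behind that figure is easy enough to check on the back of an envelope. Here’s a minimal sketch in Python, using the 92,898,000 figure from above and a round 250 million for the working-age population (an assumption on my part; readers should check the current Bureau of Labor Statistics numbers for themselves):

```python
# Back-of-envelope check on the "not in the labor force" figure.
# The 92,898,000 figure comes from the text above; the 250 million
# working-age population is an assumed round number, not an official one.
not_in_labor_force = 92_898_000
working_age_population = 250_000_000

share_outside = not_in_labor_force / working_age_population
print(f"Share of working-age population outside the labor force: {share_outside:.1%}")
# → Share of working-age population outside the labor force: 37.2%
```

None of that 37-plus percent shows up in the headline unemployment rate, which is exactly the point: the rate measures a carefully fenced-off subset of the population, not the number of people who can’t find work.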
We could spend quite a bit of time talking about the galaxy of ways in which economic statistics are finessed and/or fabricated these days, but the points already raised are enough for the present purpose. Let’s move on. The op-ed piece comes from erstwhile environmentalist Stewart Brand, whose long journey from editing CoEvolution Quarterly to channeling Bjorn Lomborg is as perfect a microcosm of the moral collapse of 20th century American environmentalism as you could hope to find. Brand’s latest piece claims that despite all evidence to the contrary—and of course there’s quite a bit of that these days—the environment is doing just fine: the economy has decoupled from resource use in recent decades, at least here in America, and so we can continue to wallow in high-tech consumer goodies without worrying about what we’re doing to the planet.
There’s a savage irony in the fact that in 1975, when his magazine was the go-to place to read about the latest ideas in systems theory and environmental science, Brand could have pointed out the gaping flaw in that argument in a Sausalito minute. Increasing prosperity in the United States has “decoupled” from resource use for two reasons: first, only a narrowing circle of privileged Americans get to see any of the paper prosperity we’re discussing—the standard of living for most people in this country has been contracting steadily for four decades—and second, the majority of consumer goods used in the United States are produced overseas, and so the resource use and environmental devastation involved in manufacturing the goodies we consume so freely takes place somewhere else.
That is to say, what Brand likes to call decoupling is our old friend, the mass production of ecological externalities. Brand can boast about prosperity without environmental cost because the great majority of the costs are being carried by somebody else, somewhere else, and so don’t find their way into his calculations.  The poor American neighborhoods where people struggle to get by without jobs are as absent from his vision of the world as they are from the official statistics; the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor that prop up his high-tech lifestyle are overseas, so they don’t show up on US statistics either. As far as Brand is concerned, that means they don’t count.
We could talk more about the process by which a man who first became famous for pressuring NASA into releasing a photo of the whole earth is now insisting that the only view that matters is the one from his living room window, but let’s go on. The other news item is the simplest and, in a bleak sort of way, the funniest of the lot.  According to recent reports, state government officials in Florida are being forbidden from using the phrase “climate change” when discussing the effects of, whisper it, climate change.
This is all the more mordantly funny because Florida is on the front lines of climate change right now.  Even the very modest increases in sea level we’ve seen so far, driven by thermal expansion and the first rounds of Greenland and Antarctic meltwater, are sending seawater rushing out of the storm sewers into the streets of low-lying parts of coastal Florida towns whenever the tide is high and an onshore wind blows hard enough. As climate change accelerates—and despite denialist handwaving, it does seem to be doing that just now—a lot of expensive waterfront property in Florida is going to end up underwater in more than a financial sense.  The state government’s response to this clear and present danger? Prevent state officials from talking about it.
We could look at a range of other examples of this same kind, but these three will do for now. What I want to discuss now is what’s going on here, and what it implies.
Let’s begin with the obvious. In all three of the cases I’ve cited, an uncomfortable reality is being dismissed by manipulating abstractions. An abstraction called “the unemployment rate” has been defined so that the politicians and bureaucrats who cite it don’t have to deal with just how many Americans these days can’t get paid employment; an abstraction called “decoupling” and a range of equally abstract (and cherrypicked) measures of environmental health are being deployed so that Brand and his readers don’t have to confront the soaring ecological costs of computer technology in particular and industrial society in general; an abstraction called “climate change,” finally, is being banned from use by state officials because it does too good a job of connecting certain dots that, for political reasons, Florida politicians don’t want people to connect.
To a very real extent, this sort of thing is pervasive in human interaction, and has been since the hoots and grunts of hominin vocalization first linked up with a few crude generalizations in the dazzled mind of an eccentric australopithecine. Human beings everywhere use abstract categories and the words that denote them as handles by which to grab hold of unruly bundles of experience. We do it far more often, and far more automatically, than most of us ever notice.  It’s only under special circumstances—waking up at night in an unfamiliar room, for example, and finding that the vague somethings around us take a noticeable amount of time to coalesce into ordinary furniture—that the mind’s role in assembling the fragmentary data of sensation into the objects of our experience comes to light.
When you look at a tree, for example, it’s common sense to think that the tree is sitting out there, and your eyes and mind are just passively receiving a picture of it—but then it’s common sense to think that the sun revolves around the earth. In fact, as philosophers and researchers into the psychophysics of sensation both showed a long time ago, what happens is that you get a flurry of fragmentary sense data—green, brown, line, shape, high contrast, low contrast—and your mind constructs a tree out of it, using its own tree-concept (as well as a flurry of related concepts such as “leaf,” “branch,” “bark,” and so on) as a template. You do that with everything you see, and the reason you don’t notice it is that it was the very first thing you learned how to do, as a newborn infant, and you’ve practiced it so often you don’t have to think about it any more.
You do the same thing with every representation of a sensory object. Let’s take visual art for an example.  Back in the 1880s, when the Impressionists first started displaying their paintings, it took many people a real effort to learn how to look at them, and a great many never managed the trick at all. Among those who did, though, it was quite common to hear comments about how this or that painting had taught them to see a landscape, or what have you, in a completely different way. That wasn’t just hyperbole:  the Impressionists had learned how to look at things in a way that brought out features of their subjects that other people in late 19th century Europe and America had never gotten around to noticing, and highlighted those things in their paintings so forcefully that the viewer had to notice them.
The relation between words and the things they denote is thus much more complex, and much more subjective, than most people ever quite get around to realizing. That’s challenging enough when we’re talking about objects of immediate experience, where the concept in the observer’s mind has the job of fitting fragmentary sense data into a pattern that can be verified by other forms of sense data—in the example of the tree, by walking up to it and confirming by touch that the trunk is in fact where the sense of sight said it was. It gets far more difficult when the raw material that’s being assembled by the mind consists of concepts rather than sensory data: when, let’s say, you move away from your neighbor Joe, who can’t find a job and is about to lose his house, start thinking about all the people in town who are in a similar predicament, and end up dealing with abstract concepts such as unemployment, poverty, the distribution of wealth, and so on.
Difficult or not, we all do this, all the time. There’s a common notion that dealing in abstractions is the hallmark of the intellectual, but that puts things almost exactly backwards; it’s the ordinary unreflective person who thinks in abstractions most of the time, while the thinker’s task is to work back from the abstract category to the raw sensory data on which it’s based. That’s what the Impressionists did:  staring at a snowbank as Monet did, until he could see the rainbow play of colors behind the surface impression of featureless white, and then painting the colors into the representation of the snowbank so that the viewer was shaken out of the trance of abstraction (“snow” = “white”) and saw the colors too—first in the painting, and then when looking at actual snow.
Human thinking, and human culture, thus dance constantly between the concrete and the abstract, or to use a slightly different terminology, between immediate experience and a galaxy of forms that reflect experience back in mediated form. It’s a delicate balance: too far into the immediate and experience disintegrates into fragmentary sensation; too far from the immediate and experience vanishes into an echo chamber of abstractions mediating one another. The most successful and enduring creations of human culture have tended to be those that maintain the balance. Representational painting is one of those; another is literature. Read the following passage closely:
“Eastward the Barrow-downs rose, ridge behind ridge into the morning, and vanished out of eyesight into a guess: it was no more than a guess of blue and a remote white glimmer blending with the hem of the sky, but it spoke to them, out of memory and old tales, of the high and distant mountains.”
By the time you finished reading it, you likely had a very clear sense of what Frodo Baggins and his friends were seeing as they looked off to the east from the hilltop behind Tom Bombadil’s house. So did I, as I copied the sentence, and so do most people who read that passage—but no two people see the same image, because the image each of us sees is compounded out of bits of our own remembered experiences. For me, the image that comes to mind has always drawn heavily on the view eastwards from the suburban Seattle neighborhoods where I grew up, across the rumpled landscape to the stark white-topped rampart of the Cascade Mountains. I know for a fact that that wasn’t the view that Tolkien himself had in mind when he penned that sentence; I suspect he was thinking of the view across the West Midlands toward the Welsh mountains, which I’ve never seen; and I wonder what it must be like for someone to read that passage whose concept of ridges and mountains draws on childhood memories of the Urals, the Andes, or Australia’s Great Dividing Range instead.
That’s one of the ways that literature takes the reader through the mediation of words back around to immediate experience. If I ever do have the chance to stand on a hill in the West Midlands and look off toward the Welsh mountains, Tolkien’s words are going to be there with me, pointing me toward certain aspects of the view I might not otherwise have noticed, just as they did in my childhood. It’s the same trick the Impressionists managed with a different medium: stretching the possibilities of experience by representing (literally re-presenting) the immediate in a mediated form.
Now think about what happens when that same process is hijacked, using modern technology, for the purpose of behavioral control.
That’s what advertising does, and more generally what the mass media do. Think about the fast food company that markets its product under the slogan “I’m loving it,” complete with all those images of people sighing with post-orgasmic bliss as they ingest some artificially flavored and colored gobbet of processed pseudofood. Are they loving it? Of course not; they’re hack actors being paid to go through the motions of loving it, so that the imagery can be drummed into your brain and drown out your own recollection of the experience of not loving it. The goal of the operation is to keep you away from immediate experience, so that a deliberately distorted mediation can be put in its place.
You can do that with literature and painting, by the way. You can do it with any form of mediation, but it’s a great deal more effective with modern visual media, because those latter short-circuit the journey back to immediate experience. You see the person leaning back with the sigh of bliss after he takes a bite of pasty bland bun and tasteless gray mystery-meat patty, and you see it over and over and over again. If you’re like most Americans, and spend four or five hours a day staring blankly at little colored images on a glass screen, a very large fraction of your total experience of the world consists of this sort of thing: distorted imitations of immediate experience, intended to get you to think about the world in ways that immediate experience won’t justify.
The externalization of the human mind and imagination via the modern mass media has no shortage of problematic features, but the one I want to talk about here is the way that it feeds into the behavior discussed at the beginning of this post: the habit, pervasive in modern industrial societies just now, of responding to serious crises by manipulating abstractions to make them invisible. That kind of thing is commonplace in civilizations on their way out history’s exit door, for reasons I’ve discussed in an earlier sequence of posts here, but modern visual media make it an even greater problem in the present instance. These latter function as a prosthetic for the imagination, a device for replacing the normal image-making functions of the human mind with electromechanical equivalents. What’s more, you don’t control the prosthetic imagination; governments and corporations control it, and use it to shape your thoughts and behavior in ways that aren’t necessarily in your best interests.
The impact of the prosthetic imagination on the crisis of our time is almost impossible to overstate. I wonder, for example, how many of my readers have noticed just how pervasive references to science fiction movies and TV shows have become in discussions of the future of technology. My favorite example just now is the replicator, a convenient gimmick from the Star Trek universe: you walk up to it and order something, and the replicator pops it into being out of nothing.
It’s hard to think of a better metaphor for the way that people in the privileged classes of today’s industrial societies like to think of the consumer economy. It’s also hard to think of anything that’s further removed from the realities of the consumer economy. The replicator is the ultimate wet dream of externalization: it has no supply chains, no factories, no smokestacks, no toxic wastes, just whatever product you want any time you happen to want it. That’s exactly the kind of thinking that lies behind Stewart Brand’s fantasy of “decoupling”—and it’s probably no accident that more often than not, when I’ve had conversations with people who think that 3-D printers are the solution to everything, they bring Star Trek replicators into the discussion.
3-D printers are not replicators. Their supply chains and manufacturing costs include the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor mentioned earlier, and the social impacts of their widespread adoption would include another wave of mass technological unemployment—remember, it’s only in the highly mediated world of current economic propaganda that people who lose their jobs due to automation automatically get new jobs in some other field; in the immediate world, that’s become increasingly uncommon. As long as people look at 3-D printers through minds full of little pictures of Star Trek replicators, though, those externalized ecological and social costs are going to be invisible to them.
That, in turn, defines the problem with the externalization of the human mind and imagination: no matter how frantically you manipulate abstractions, the immediate world is still what it is, and it can still clobber you. Externalizing a cost doesn’t make it go away; it just guarantees that you won’t see it in time to do anything but suffer the head-on impact.

Peak Meaninglessness

Wed, 2015-03-04 19:00
Last week’s discussion of externalities—costs of doing business that get dumped onto the economy, the community, or the environment, so that those doing the dumping can make a bigger profit—is, I’m glad to say, not the first time this issue has been raised recently.  The long silence that closed around such things three decades ago is finally cracking; they’re being mentioned again, and not just by archdruids.  One of my readers—tip of the archdruidical hat to Jay McInerney—noted an article in Grist a while back that pointed out the awkward fact that none of the twenty biggest industries in today’s world could break even, much less make a profit, if they had to pay for the damage they do to the environment.
Now of course the conventional wisdom these days interprets that statement to mean that it’s unfair to make those industries pay for the costs they impose on the rest of us—after all, they have a God-given right to profit at everyone else’s expense, right?  That’s certainly the attitude of fracking firms in North Dakota, who recently proposed that they ought to be exempted from the state’s rules on dumping radioactive waste, because following the rules would cost them too much money. That the costs externalized by the fracking industry will sooner or later be paid by others, as radionuclides in fracking waste work their way up the food chain and start producing cancer clusters, is of course not something anyone in the industry or the media is interested in discussing.
Watch this sort of thing, and you can see the chasm opening up under the foundations of industrial society. Externalized costs don’t just go away; one way or another, they’re going to be paid, and costs that don’t appear on a company’s balance sheet still affect the economy. That’s the argument of The Limits to Growth, still the most accurate (and thus inevitably the most reviled) of the studies that tried unavailingly to turn industrial society away from its suicidal path: on a finite planet, once an inflection point is passed, the costs of economic growth rise faster than growth does, and sooner or later force the global economy to its knees.
The tricks of accounting that let corporations pretend that their externalized costs vanish into thin air don’t change that bleak prognosis. Quite the contrary, the pretense that externalities don’t matter just makes it harder for a society in crisis to recognize the actual source of its troubles. I’ve come to think that that’s the unmentioned context behind a dispute currently roiling those unhallowed regions where economists lurk in the shrubbery: the debate over secular stagnation.
Secular stagnation? That’s the concept, unmentionable until recently, that the global economy could stumble into a rut of slow, no, or negative growth, and stay there for years. There are still plenty of economists who insist that this can’t happen, which is rather funny, really, when you consider that this has basically been the state of the global economy since 2009. (My back-of-the-envelope calculations suggest, in fact, that if you subtract the hallucinatory paper wealth manufactured by derivatives and similar forms of financial gamesmanship from the world’s GDP, the production of nonfinancial goods and services worldwide has actually been declining since before the 2008 housing crash.)
Even among those who admit that what’s happening can indeed happen, there’s no consensus as to how or why such a thing could occur.  On the off chance that any mainstream economists are lurking in the shrubbery in the even more unhallowed regions where archdruids utter unspeakable heresies, and green wizards clink mugs of homebrewed beer together and bay at the moon, I have a suggestion to offer: the most important cause of secular stagnation is the increasing impact of externalities on the economy. The dishonest macroeconomic bookkeeping that leads economists to think that externalized costs go away because they’re not entered into anyone’s ledger books doesn’t actually make them disappear; instead, they become an unrecognized burden on the economy as a whole, an unfelt headwind blowing with hurricane force in the face of economic growth.
Thus there’s a profound irony in the insistence by North Dakota fracking firms that they ought to be allowed to externalize even more of their costs in order to maintain their profit margin. If I’m right, the buildup of externalized costs is what’s causing the ongoing slowdown in economic activity worldwide that’s driving down commodity prices, forcing interest rates in many countries to zero or below, and resurrecting the specter of deflationary depression. The fracking firms in question thus want to respond to the collapse in oil prices—a result of secular stagnation—by doing even more of what’s causing secular stagnation. To say that this isn’t likely to end well is to understate the case considerably.
In the real world, of course, mainstream economists don’t listen to suggestions from archdruids, and fracking firms, like every other business concern these days, can be expected to put their short-term cash flow ahead of the survival of their industry, or for that matter of industrial civilization as a whole. Thus I propose to step aside from the subject of economic externalities for a moment—though I’ll be returning to it at intervals as we proceed with this sequence of posts—in order to discuss a subtler and less crassly financial form of the same phenomenon.
That form came in for discussion in the same post two weeks ago that brought the issue of externalities into this blog’s ongoing conversation. Quite a few readers commented about the many ways in which things labeled “more advanced,” “more progressive,” and the like were actually less satisfactory and less effective at meeting human needs than the allegedly more primitive technologies they replaced. Some of those comments focused, and quite sensibly, on the concrete examples, but others pondered the ways that today’s technology fails systematically at meeting certain human needs, and reflected on the underlying causes for that failure. One of my readers—tip of the archdruidical hat here to Ruben—gave an elegant frame for that discussion by suggesting that the peak of technological complexity in our time may also be described as peak meaninglessness.
I’d like to take the time to unpack that phrase. In the most general sense, technologies can be divided into two broad classes, which we can respectively call tools and prosthetics. The difference is a matter of function. A tool expands human potential, giving people the ability to do things they couldn’t otherwise do. A prosthetic, on the other hand, replaces human potential, doing something that under normal circumstances, people can do just as well for themselves.  Most discussions of technology these days focus on tools, but the vast majority of technologies that shape the lives of people in a modern industrial society are not tools but prosthetics.
Prosthetics have a definite value, to be sure. Consider an artificial limb, the sort of thing on which the concept of technology-as-prosthetic is modeled. If you’ve lost a leg in an accident, say, an artificial leg is well worth having; it replaces a part of ordinary human potential that you don’t happen to have any more, and enables you to do things that other people can do with their own leg. Imagine, though, that some clever marketer were to convince people to have their legs cut off so that they could be fitted for artificial legs. Imagine, furthermore, that the advertising for artificial legs became so pervasive, and so successful, that nearly everybody became convinced that human legs were hopelessly old-fashioned and ugly, and rushed out to get their legs amputated so they could walk around on artificial legs.
Then, of course, the manufacturers of artificial arms got into the same sort of marketing, followed by the makers of sex toys. Before long you’d have a society in which most people were gelded quadruple amputees fitted with artificial limbs and rubber genitals, who spent all their time talking about the wonderful things they could do with their prostheses. Only in the darkest hours of the night, when the TV was turned off, might some of them wonder why it was that a certain hard-to-define numbness had crept into all their interactions with other people and the rest of the world.
In a very real sense, that’s the way modern industrial society has reshaped and deformed human life for its more privileged inmates. Take any human activity, however humble or profound, and some clever marketer has found a way to insert a piece of technology in between the person and the activity. You can’t simply bake bread—a simple, homely, pleasant activity that people have done themselves for thousands of years using their hands and a few simple handmade tools; no, you have to have a bread machine, into which you dump a prepackaged mix and some liquid, push a button, and stand there being bored while it does the work for you, if you don’t farm out the task entirely to a bakery and get the half-stale industrially extruded product that passes for bread these days.
Now of course the bread machine manufacturers and the bakeries pitch their products to the clueless masses by insisting that nobody has time to bake their own bread any more. Ivan Illich pointed out the logical fallacy here a long time ago in Energy and Equity: using a bread machine or buying from a bakery is only faster if you don’t count the time you have to spend earning the money needed to pay for it, power it, provide it with overpriced prepackaged mixes, repair it, clean it, etc., etc., etc. Illich’s discussion focused on automobiles; he pointed out that if you take the distance traveled by the average American auto in a year, and divide that by the total amount of time spent earning the money to pay for the auto, fuel, maintenance, insurance, etc., plus all the other time eaten up by tending to the auto in various ways, the average American car goes about 3.5 miles an hour: about the same pace, that is, that an ordinary human being can walk.
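Illich’s arithmetic is easy to reproduce. Here’s a minimal sketch in Python; every figure in it is an assumed placeholder for illustration—neither Illich’s numbers nor official statistics—chosen simply to land near the effective speed quoted above:

```python
# Illich-style "effective speed" arithmetic from Energy and Equity.
# Every figure below is an assumed placeholder, not data from the book.

miles_per_year = 10_000    # assumed annual distance driven
hours_driving = 400        # assumed hours spent behind the wheel
annual_car_cost = 14_000   # assumed total cost: payments, fuel, insurance, repairs
hourly_wage = 7.0          # assumed take-home wage

# Hours of work needed to pay the car's bills
hours_earning = annual_car_cost / hourly_wage

# Assumed hours spent washing, repairing, and shopping for the car
hours_tending = 150

total_hours = hours_driving + hours_earning + hours_tending
effective_mph = miles_per_year / total_hours
print(f"Effective speed: {effective_mph:.1f} mph")  # walking pace, more or less
```

Plug in your own numbers; the point survives a wide range of assumptions, because the earning hours dwarf the driving hours.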
If this seems somehow reminiscent of last week’s discussion of externalities, dear reader, it should. The claim that technology saves time and labor only seems to make sense if you ignore a whole series of externalities—in this case, the time you have to put into earning the money to pay for the technology and into coping with whatever requirements, maintenance needs, and side effects the technology has. Have you ever noticed that the more “time-saving technologies” you bring into your life, the less free time you have? This is why—and it’s also why the average medieval peasant worked shorter hours, had more days off, and kept a larger fraction of the value of his labor than you do.
Something else is being externalized by prosthetic technology, though, and it’s that additional factor that gives Ruben’s phrase “peak meaninglessness” its punch. What are you doing, really, when you use a bread machine? You’re not baking bread; the machine is doing that. You’re dumping a prepackaged mix and some water into a machine, closing the lid, pushing a button, and going away to do something else. Fair enough—but what is this “something else” that you’re doing? In today’s industrial societies, odds are you’re going to go use another piece of prosthetic technology, which means that once again, you’re not actually doing anything. A machine is doing something for you. You can push that button and walk away, but again, what are you going to do with your time? Use another machine?
The machines that industrial society uses to give this infinite regress somewhere to stop—televisions, video games, and computers hooked up to the internet—simply take the same process to its ultimate extreme. Whatever you think you’re doing when you’re sitting in front of one of these things, what you’re actually doing is staring at little colored pictures on a glass screen and pushing some buttons. All things considered, this is a profoundly boring activity, which is why the little colored pictures jump around all the time; that’s to keep your nervous system so far off balance that you don’t notice just how tedious it is to spend hours at a time staring at little colored pictures on a screen.
I can’t help but laugh when people insist that the internet is an information-rich environment. It’s quite the opposite, actually: all you get from it is the very narrow trickle of verbal, visual, and auditory information that can squeeze through the digital bottleneck and turn into little colored pictures on a glass screen. The best way to experience this is to engage in a media fast—a period in which you deliberately cut yourself off from all electronic media for a week or more, preferably in a quiet natural environment. If you do that, you’ll find that it can take two or three days, or even more, before your numbed and dazzled nervous system recovers far enough that you can begin to tap in to the ocean of sensory information and sensual delight that surrounds you at every moment. It’s only then, furthermore, that you can start to think your own thoughts and dream your own dreams, instead of just rehashing whatever the little colored pictures tell you.
A movement of radical French philosophers back in the 1960s, the Situationists, argued that modern industrial society is basically a scheme to convince people to hand over their own human capabilities to the industrial machine, so that imitations of those capabilities can be sold back to them at premium prices. It was a useful analysis then, and it’s even more useful now, when the gap between realities and representations has become even more drastic than it was back then. These days, as often as not, what gets sold to people isn’t even an imitation of some human capability, but an abstract representation of it, an arbitrary marker with only the most symbolic connection to what it represents.
This is one of the reasons why I think it’s deeply mistaken to claim that Americans are materialistic. Americans are arguably the least materialistic people in the world; no actual materialist—no one who had the least appreciation for actual physical matter and its sensory and sensuous qualities—could stand the vile plastic tackiness of America’s built environment and consumer economy for a fraction of a second.  Americans don’t care in the least about matter; they’re happy to buy even the most ugly, uncomfortable, shoddily made and absurdly overpriced consumer products you care to imagine, so long as they’ve been convinced that having those products symbolizes some abstract quality they want, such as happiness, freedom, sexual pleasure, or what have you.
Then they wonder, in the darkest hours of the night, why all the things that are supposed to make them happy and satisfied somehow never manage to do anything of the kind. Of course there’s a reason for that, too, which is that happy and satisfied people don’t keep on frantically buying products in a quest for happiness and satisfaction. Still, the little colored pictures keep showing them images of people who are happy and satisfied because they guzzle the right brand of tasteless fizzy sugar water, and pay for the right brand of shoddily made half-disposable clothing, and keep watching the little colored pictures: that last above all else. “Tune in tomorrow” is the most important product that every media outlet sells, and they push it every minute of every day on every stop and key.
That is to say, between my fantasy of voluntary amputees eagerly handing over the cash for the latest models of prosthetic limbs, and the reality of life in a modern industrial society, the difference is simply in the less permanent nature of the alterations imposed on people here and now.  It’s easier to talk people into amputating their imaginations than it is to convince them to amputate their limbs, but it’s also a good deal easier to reverse the surgery.
What gives this even more importance than it would otherwise have, in turn, is that all this is happening in a society that’s hopelessly out of touch with the realities that support its existence, and that relies on bookkeeping tricks of the sort discussed toward the beginning of this essay to maintain the fantasy that it’s headed somewhere other than history’s well-used compost bin. The externalization of the mind and the imagination plays just as important a role in maintaining that fantasy as the externalization of costs—and the cold mechanical heart of the externalization of the mind and imagination is mediation, the insertion of technological prosthetics into the space between the individual and the world. We’ll talk more about that in next week’s post.
****************In other news, I’m delighted to report the publication of a new book of mine that may be of particular interest to readers of this blog: Collapse Now and Avoid the Rush: The Best of the Archdruid Report, which is just out from Founders House Publishing. As the title suggests, it’s an anthology of twenty-five of the most popular weekly posts from this blog, including such favorites as "Knowing Only One Story," "An Elegy for the Age of Space," "The Next Ten Billion Years," and "The Time of the Seedbearers," as well as the title essay and many more. These are the one-of-a-kind essays that haven’t appeared in my books; if you’re looking for something to hand to the spouse or friend or twelve-year-old kid who wants to know why you keep visiting this site every Wednesday night, or simply want this blog’s best essays in a more permanent form, this is the book. It’s available in print and e-book formats and can be ordered here.

The Externality Trap, or, How Progress Commits Suicide

Wed, 2015-02-25 19:14
I've commented more than once in these essays about the cooperative dimension of writing:  the way that even the most solitary of writers inevitably takes part in what Mortimer Adler used to call the Great Conversation, the flow of ideas and insights across the centuries that’s responsible for most of what we call culture. Sometimes that conversation takes place second- or third-hand—for example, when ideas from two old books collide in an author’s mind and give rise to a third book, which will eventually carry the fusion to someone else further down the stream of time—but sometimes it’s far more direct.
Last week’s post here brought an example of the latter kind. My attempt to cut through the ambiguities surrounding that slippery word “progress” sparked a lively discussion on the comments page of my blog about just exactly what counted as progress, what factors made one change “progressive” while another was denied that label. In the midst of it all, one of my readers—tip of the archdruidical hat to Jonathan—proposed an unexpected definition:  what makes a change qualify as progress, he suggested, is that it increases the externalization of costs. 
I’ve been thinking about that definition since Jonathan proposed it, and it seems to me that it points up a crucial and mostly unrecognized dimension of the crisis of our time. To make sense of it, though, it’s going to be necessary to delve briefly into economic jargon.
Economists use the term “externalities” to refer to the costs of an economic activity that aren’t paid by either party in an exchange, but are pushed off onto somebody else. You won’t hear a lot of talk about externalities these days; in many circles, it’s considered impolite to mention them, but they’re a pervasive presence in contemporary life, and play a very large role in some of the most intractable problems of our age. Some of those problems were discussed by Garrett Hardin in his famous essay on the tragedy of the commons, and more recently by Elinor Ostrom in her studies of how that tragedy can be avoided; still, I’m not sure how often it’s recognized that the phenomena they discussed apply not just to commons systems, but to societies as a whole—especially to societies like ours.
An example may be useful here. Let’s imagine a blivet factory, which turns out three-prong, two-slot blivets in pallet loads for customers. The blivet-making process, like manufacturing of every other kind, produces waste as well as blivets, and we’ll assume for the sake of the example that blivet waste is moderately toxic and causes health problems in people who ingest it. The blivet factory produces one barrel of blivet waste for every pallet load of blivets it ships. The cheapest option for dealing with the waste, and thus the option that economists favor, is to dump it into the river that flows past the factory.
Notice what happens as a result of this choice. The blivet manufacturer has maximized his own benefit from the manufacturing process, by avoiding the expense of finding some other way to deal with all those barrels of blivet waste. His customers also benefit, because blivets cost less than they would if the cost of waste disposal was factored into the price. On the other hand, the costs of dealing with the blivet waste don’t vanish like so much twinkle dust; they are imposed on the people downstream who get their drinking water from the river, or from aquifers that receive water from the river, and who suffer from health problems because there’s blivet waste in their water. The blivet manufacturer is externalizing the cost of waste disposal; his increased profits are being paid for at a remove by the increased health care costs of everyone downstream.
That’s how externalities work. Back in the days when people actually talked about the downsides of economic growth, there was a lot of discussion of how to handle externalities, and not just on the leftward end of the spectrum.  I recall a thoughtful book titled TANSTAAFL—that’s an acronym, for those who don’t know their Heinlein, for “There Ain’t No Such Thing As A Free Lunch”—which argued, on solid libertarian-conservative grounds, that the environment could best be preserved by making sure that everyone paid full sticker price for the externalities they generated. Today’s crop of pseudoconservatives, of course, turned their back on all this a long time ago, and insist at the top of their lungs on their allegedly God-given right to externalize as many costs as they possibly can.  This is all the more ironic in that most pseudoconservatives claim to worship a God who said some very specific things about “what ye do to the least of these,” but that’s a subject for a different post.
Economic life in the industrial world these days can be described, without too much inaccuracy, as an arrangement set up to allow a privileged minority to externalize nearly all their costs onto the rest of society while pocketing as much of the benefits as possible themselves. That’s come in for a certain amount of discussion in recent years, but I’m not sure how many of the people who’ve participated in those discussions have given any thought to the role that technological progress plays in facilitating the internalization of benefits and the externalization of costs that drive today’s increasingly inegalitarian societies. Here again, an example will be helpful.
Before the invention of blivet-making machinery, let’s say, blivets were made by old-fashioned blivet makers, who hammered them out on iron blivet anvils in shops that were to be found in every town and village. Like other handicrafts, blivet-making was a living rather than a ticket to wealth; blivet makers invested their own time and muscular effort in their craft, and turned out enough in the way of blivets to meet the demand. Notice also the effect on the production of blivet waste. Since blivets were being made one at a time rather than in pallet loads, the total amount of waste was smaller; the conditions of handicraft production also meant that blivet makers and their families were more likely to be exposed to the blivet waste than anyone else, and so had an incentive to invest the extra effort and expense to dispose of it properly. Since blivet makers were ordinary craftspeople rather than millionaires, furthermore, they weren’t as likely to be able to buy exemption from local health laws.
The invention of the mechanical blivet press changed that picture completely.  Since one blivet press could do as much work as fifty blivet makers, the income that would have gone to those fifty blivet makers and their families went instead to one factory owner and his stockholders, with as small a share as possible set aside for the wage laborers who operate the blivet press. The factory owner and stockholders had no incentive to pay for the proper disposal of the blivet waste, either—quite the contrary, since having to meet the disposal costs cut into their profit, buying off local governments was much cheaper, and if the harmful effects of blivet waste were known, you can bet that the owner and shareholders all lived well upstream from the factory. 
Notice also that a blivet manufacturer who paid a living wage to his workers and covered the costs of proper waste disposal would have to charge a higher price for blivets than one who did neither, and thus would be driven out of business by his more ruthless competitor. Externalities aren’t simply made possible by technological progress, in other words; they’re the inevitable result of technological progress in a market economy, because externalizing the costs of production is in most cases the most effective way to outcompete rival firms, and the firm that succeeds in externalizing the largest share of its costs is the most likely to prosper and survive.
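That competitive squeeze can be put in toy numbers. This is a sketch with invented figures, not industry data; the only thing it’s meant to show is why the honest blivet maker loses:

```python
# Toy cost comparison: two blivet makers, one externalizing waste
# disposal, one paying for it. All figures are invented for illustration.

production_cost = 100.0   # assumed cost per pallet of blivets, same for both firms
disposal_cost = 15.0      # assumed cost of properly handling one barrel of waste
margin = 0.10             # both firms want the same 10% markup

honest_price = (production_cost + disposal_cost) * (1 + margin)
externalizer_price = production_cost * (1 + margin)

print(f"Honest firm:        ${honest_price:.2f} per pallet")
print(f"Externalizing firm: ${externalizer_price:.2f} per pallet")
# The externalizer undercuts the honest firm on every pallet; the $15
# doesn't vanish, it reappears downstream as someone else's medical bill.
```

Any positive disposal cost produces the same result, which is the point: in a price-competitive market, the firm that externalizes the most wins.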
Each further step in the progress of blivet manufacturing, in turn, tightened the same screw another turn. Today, to finish up the metaphor, the entire global supply of blivets is made in a dozen factories in  distant Slobbovia, where sweatshop labor under ghastly working conditions and the utter absence of environmental regulations make the business of blivet fabrication more profitable than anywhere else. The blivets are as shoddily made as possible; the entire blivet supply chain from the open-pit mines worked by slave labor that provide the raw materials to the big box stores with part-time, poorly paid staff selling blivetronic technology to the masses is a human and environmental disaster.  Every possible cost has been externalized, so that the two multinational corporations that dominate the global blivet industry can maintain their profit margins and pay absurdly high salaries to their CEOs.
That in itself is bad enough, but let’s broaden the focus to include the whole systems in which blivet fabrication takes place: the economy as a whole, society as a whole, and the biosphere as a whole. The impact of technology on blivet fabrication in a market economy has predictable and well understood consequences for each of these whole systems, which can be summed up precisely in the language we’ve already used. In order to maximize its own profitability and return on shareholder investment, the blivet industry externalizes costs in every available direction. Since nobody else wants to bear those costs, either, most of them end up being passed on to the whole systems just named, because the economy, society, and the biosphere have no voice in today’s economic decisions.
Like the costs of dealing with blivet waste, though, the other externalized costs of blivet manufacture don’t go away just because they’re externalized. As externalities increase, they tend to degrade the whole systems onto which they’re dumped—the economy, society, and the biosphere. This is where the trap closes tight, because blivet manufacturing exists within those whole systems, and can’t be carried out unless all three systems are sufficiently intact to function in their usual way. As those systems degrade, their ability to function degrades also, and eventually one or more of them breaks down—the economy plunges into a depression, the society disintegrates into anarchy or totalitarianism, the biosphere shifts abruptly into a new mode that lacks adequate rainfall for crops—and the manufacture of blivets stops because the whole system that once supported it has stopped doing so.
Notice how this works out from the perspective of someone who’s benefiting from the externalization of costs by the blivet industry—the executives and stockholders in a blivet corporation, let’s say. As far as they’re concerned, until very late in the process, everything is fine and dandy: each new round of technological improvements in blivet fabrication increases their profits, and if each such step in the onward march of progress also means that working class jobs are eliminated or offshored, democratic institutions implode, toxic waste builds up in the food chain, or what have you, hey, that’s not their problem—and after all, that’s just the normal creative destruction of capitalism, right?
That sort of insouciance is easy for at least three reasons. First, the impacts of externalities on whole systems can pop up a very long way from the blivet factories.  Second, in a market economy, everyone else is externalizing their costs as enthusiastically as the blivet industry, and so it’s easy for blivet manufacturers (and everyone else) to insist that whatever’s going wrong is not their fault.  Third, and most crucially, whole systems as stable and enduring as economies, societies, and biospheres can absorb a lot of damage before they tip over into instability. The process of externalization of costs can thus run for a very long time, and become entrenched as a basic economic habit, long before it becomes clear to anyone that continuing along the same route is a recipe for disaster.
Even when externalized costs have begun to take a visible toll on the economy, society, and the biosphere, furthermore, any attempt to reverse course faces nearly insurmountable obstacles. Those who profit from the existing order of things can be counted on to fight tooth and nail for the right to keep externalizing their costs: after all, they have to pay the full price for any reduction in their ability to externalize costs, while the benefits created by not imposing those costs on whole systems are shared among all participants in the economy, society, and the biosphere respectively. Nor is it necessarily easy to trace back the causes of any given whole-system disruption to specific externalities benefiting specific people or industries. It’s rather like loading hanging weights onto a chain; sooner or later, as the amount of weight hung on the chain goes up, the chain is going to break, but the link that breaks may be far from the last weight that pushed things over the edge, and every other weight on the chain made its own contribution to the end result.
A society that’s approaching collapse because too many externalized costs have been loaded onto the whole systems that support it thus shows certain highly distinctive symptoms. Things are going wrong with the economy, society, and the biosphere, but nobody seems to be able to figure out why; the measurements economists use to determine prosperity show contradictory results, with those that measure the profitability of individual corporations and industries giving much better readings than those that measure the performance of whole systems; the rich are convinced that everything is fine, while outside the narrowing circles of wealth and privilege, people talk in low voices about the rising spiral of problems that beset them from every side. If this doesn’t sound familiar to you, dear reader, you probably need to get out more.
At this point it may be helpful to sum up the argument I’ve developed here:
a) Every increase in technological complexity tends also to increase the opportunities for externalizing the costs of economic activity;
b) Market forces make the externalization of costs mandatory rather than optional, since economic actors that fail to externalize costs will tend to be outcompeted by those that do;
c) In a market economy, as all economic actors attempt to externalize as many costs as possible, externalized costs will tend to be passed on preferentially and progressively to whole systems such as the economy, society, and the biosphere, which provide necessary support for economic activity but have no voice in economic decisions;
d) Given unlimited increases in technological complexity, there is no necessary limit to the loading of externalized costs onto whole systems short of systemic collapse;
e) Unlimited increases in technological complexity in a market economy thus necessarily lead to the progressive degradation of the whole systems that support economic activity;
f) Technological progress in a market economy is therefore self-terminating, and ends in collapse.
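Propositions a) through f) can be caricatured as a toy simulation. The model and every parameter in it are invented for illustration; nothing here is calibrated to any real economy, and the point is only the shape of the trajectory, not its numbers:

```python
# Toy model of the externality trap sketched in points (a)-(f).
# All parameters are invented for illustration only.

def run(years=200, growth=0.03, externalized_share=0.5, absorption=0.02):
    """Grow an economy that dumps a share of its rising costs onto a
    supporting whole system with a fixed capacity to absorb damage.
    Returns the year of collapse, or None if the run finishes intact."""
    output = 1.0          # economic output, arbitrary units
    system_health = 1.0   # health of the supporting whole system (1.0 = intact)
    for year in range(years):
        output *= 1 + growth
        # costs scale with output; a fixed share of them is externalized
        externalized = output * growth * externalized_share
        # the whole system absorbs some damage each year; the rest accumulates
        system_health -= max(externalized - absorption, 0)
        if system_health <= 0:
            return year   # collapse: the support system stops functioning
    return None

print(run())  # with these made-up parameters, collapse arrives well before year 200
```

Two variations are worth trying: setting externalized_share to zero, or growth to zero, lets the run finish with the support system intact—which is the flipside taken up at the end of this essay.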
Now of course there are plenty of arguments that could be deployed against this modest proposal. For example, it could be argued that progress doesn’t have to generate a rising tide of externalities. The difficulty with this argument is that externalization of costs isn’t an accidental side effect of technology but an essential aspect—it’s not a bug, it’s a feature. Every technology is a means of externalizing some cost that would otherwise be borne by a human body. Even something as simple as a hammer takes the wear and tear that would otherwise affect the heel of your hand, let’s say, and transfers it to something else: directly, to the hammer; indirectly, to the biosphere, by way of the trees that had to be cut down to make the charcoal to smelt the iron, the plants that were shoveled aside to get the ore, and so on.
For reasons that are ultimately thermodynamic in nature, the more complex a technology becomes, the more costs it generates. To outcompete a simpler technology, therefore, each more complex technology has to externalize a significant proportion of its additional costs. In the case of such contemporary hypercomplex technosystems as the internet, the process of externalizing costs has gone so far, through so many tangled interrelationships, that it’s remarkably difficult to figure out exactly who’s paying for how much of the gargantuan inputs needed to keep the thing running. This lack of transparency feeds the illusion that large systems are cheaper than small ones, by making externalities of scale look like economies of scale.
It might be argued instead that a sufficiently stringent regulatory environment, forcing economic actors to absorb all the costs of their activities instead of externalizing them onto others, would be able to stop the degradation of whole systems while still allowing technological progress to continue. The difficulty here is that increased externalization of costs is what makes progress profitable. As just noted, all other things being equal, a complex technology will on average be more expensive in real terms than a simpler technology, for the simple fact that each additional increment of complexity has to be paid for by an investment of energy and other forms of real capital.
Strip complex technologies of the subsidies that transfer some of their costs to the government, the perverse regulations that transfer some of their costs to the rest of the economy, the bad habits of environmental abuse and neglect that transfer some of their costs to the biosphere, and so on, and pretty soon you’re looking at hard economic limits to technological complexity, as people forced to pay the full sticker price for complex technologies maximize their benefits by choosing simpler, more affordable options instead. A regulatory environment sufficiently strict to keep technology from accelerating to collapse would thus bring technological progress to a halt by making it unprofitable.
Notice, however, the flipside of the same argument: a society that chose to stop progressing technologically could maintain itself indefinitely, so long as its technologies weren’t dependent on nonrenewable resources or the like. The costs imposed by a stable technology on the economy, society, and the biosphere would be more or less stable, rather than increasing over time, and it would therefore be much easier to figure out how to balance out the negative effects of those externalities and maintain the whole system in a steady state.  Societies that treated technological progress as an option rather than a requirement, and recognized the downsides to increasing complexity, could also choose to reduce complexity in one area in order to increase it in another, and so on—or they could just raise a monument to the age of progress, and go do something else instead.
The logic suggested here requires a comprehensive rethinking of most of the contemporary world’s notions about technology, progress, and the good society. We’ll begin that discussion in future posts—after, that is, we discuss a second dimension of progress that came out of last week’s discussion.

What Progress Means

Wed, 2015-02-18 18:06
Last week’s post here on The Archdruid Report appears to have hit a nerve. That didn’t come as any sort of a surprise, admittedly. It’s one thing to point out that going back to the simpler and less energy-intensive technologies of earlier eras could help extract us from the corner into which industrial society has been busily painting itself in recent decades; it’s quite another to point out that doing this can also be great fun, more so than anything that comes out of today’s fashionable technologies, and in a good many cases the results include an objectively better quality of life as well.
That’s not one of the canned speeches that opponents of progress are supposed to make. According to the folk mythology of modern industrial culture, since progress always makes things better, the foes of whatever gets labeled as progress are supposed to put on hair shirts and insist that everyone has to suffer virtuously from a lack of progress, for some reason based on sentimental superstition. The Pygmalion effect being what it is, it’s not hard to find opponents of progress who say what they’re expected to say, and thus fulfill their assigned role in contemporary culture, which is to stand there in their hair shirts bravely protesting until the steamroller of progress rolls right over them.
The grip of that particular bit of folk mythology on the collective imagination of our time is tight enough that when somebody brings up some other reason to oppose “progress”—we’ll get into the ambiguities behind that familiar label in a moment—a great many people quite literally can’t absorb what’s actually being said, and respond instead to the canned speeches they expect to hear. Thus I had several people attempt to dispute the comments on last week’s post, castigating my readers with varying degrees of wrath and profanity for thinking that they had to sacrifice the delights of today’s technology and go creeping mournfully back to the unsatisfying lifestyles of an earlier day.
That was all the more ironic in that none of the readers who were commenting on the post were saying anything of the kind. Most of them were enthusiastically talking about how much more durable, practical, repairable, enjoyable, affordable, and user-friendly older technologies are compared to the disposable plastic trash that fills the stores these days. They were discussing how much more fun it is to embrace the delights of outdated technologies than it would be to go creeping mournfully back—or forward, if you prefer—to the unsatisfying lifestyles of the present time. That heresy is far more than the alleged openmindedness and intellectual diversity of our age is willing to tolerate, so it’s not surprising that some people tried to pretend that nothing of the sort had been said at all. What was surprising to me, and pleasantly so, was the number of readers who were ready to don the party clothes of some earlier time and join in the Butlerian carnival.
There are subtleties to the project of deliberate technological regress that may not be obvious at first glance, though, and it seems sensible to discuss those here before we proceed.  It’s important, to begin with, to remember that when talking heads these days babble about technology in the singular, as a uniform, monolithic thing that progresses according to some relentless internal logic of its own, they’re spouting balderdash.  In the real world, there’s no such monolith; instead, there are technologies in the plural, a great many of them, clustered more or less loosely in technological suites which may or may not have any direct relation to one another.
An example might be useful here. Consider the technologies necessary to build a steel-framed bicycle. The metal parts require the particular suite of technologies we use to smelt ores, combine the resulting metals into useful alloys, and machine and weld those into shapes that fit together to make a bicycle. The tires, inner tubes, brake pads, seat cushion, handlebar grips, and paint require a different suite of technologies drawing on various branches of applied organic chemistry, and a few other suites also have a place: for example, the one that’s needed to make and apply lubricants. The suites that make a bicycle have other uses; if you can build a bicycle, as Orville and Wilbur Wright demonstrated, you can also build an aircraft, and a variety of other interesting machines as well; that said, there are other technologies—say, the ones needed to manufacture medicines, or precision optics, or electronics—that require very different technological suites. You can have everything you need to build a bicycle and still be unable to make a telescope or a radio receiver, and vice versa.
Strictly speaking, therefore, nothing requires the project of deliberate technological regress to move in lockstep to the technologies of a specific past date and stay there. It would be wholly possible to dump certain items of modern technology while keeping others. It would be just as possible to replace one modern technological suite with an older equivalent from one decade, another with an equivalent from a different decade, and so on. Imagine, for example, a future America in which solar water heaters (worked out by 1920) and passive solar architecture (mostly developed in the 1960s and 1970s) were standard household features, canal boats (dating from before 1800) and tall ships (ditto) were the primary means of bulk transport, shortwave radio (developed in the early 20th century) was the standard long-range communications medium, ultralight aircraft (largely developed in the 1980s) were still in use, and engineers crunched numbers using slide rules (perfected around 1880).
There’s no reason why such a pastiche of technologies from different eras couldn’t work. We know this because what passes for modern technology is a pastiche of the same kind, in which (for example) cars whose basic design dates from the 1890s are gussied up with onboard computers invented a century later. Much of modern technology, in fact, is old technology with a new coat of paint and a few electronic gimmicks tacked on, and it’s old technology that originated in many different eras, too. Part of what differentiates modern technology from older equivalents, in other words, is mere fashion. Another part, though, moves into more explosive territory.
In the conversation that followed last week’s post, one of my readers—tip of the archdruid’s hat to Cathy—recounted the story of the one and only class on advertising she took at college. The teacher invited a well-known advertising executive to come in and talk about the business, and one of the points he brought up was the marketing of disposable razors. The old-fashioned steel safety razor, the guy admitted cheerfully, was a much better product: it was more durable, less expensive, and gave a better shave than disposable razors. Unfortunately, it didn’t make the kind of profits for the razor industry that the latter wanted, and so the job of the advertising company was to convince shavers that they really wanted to spend more money on a worse product instead.
I know it may startle some people to hear a luxuriantly bearded archdruid talk about shaving, but I do have a certain amount of experience with the process—though admittedly it’s been a while. The executive was quite correct: an old-fashioned safety razor gives better shaves than a disposable. What’s more, an old-fashioned safety razor combined with a shaving brush, a cake of shaving soap, a mug and a bit of hot water from the teakettle produces a shaving experience that’s vastly better, in every sense, than what you’ll get from squirting cold chemical-laced foam out of a disposable can and then scraping your face with a disposable razor; the older method, furthermore, takes no more time, costs much less on a per-shave basis, and has a drastically smaller ecological footprint to boot.
Notice also the difference in the scale and complexity of the technological suites needed to maintain these two ways of shaving. To shave with a safety razor and shaving soap, you need the metallurgical suite that produces razors and razor blades, the very simple household-chemistry suite that produces soap, the ability to make pottery and brushes, and some way to heat water. To shave with a disposable razor and a can of squirt-on shaving foam, you need fossil fuels for plastic feedstocks, chemical plants to manufacture the plastic and the foam, the whole range of technologies needed to manufacture and fill the pressurized can, and so on—all so that you can count on getting an inferior shave at a higher price, and the razor industry can boost its quarterly profits.
That’s a small and arguably silly example of a vast and far from silly issue. These days, when you see the words “new and improved” on a product, rather more often than not, the only thing that’s been improved is the bottom line of the company that’s trying to sell it to you. When you hear equivalent claims about some technology that’s being marketed to society as a whole, rather than sold to you personally, the same rule applies at least as often. That’s one of the things that drove the enthusiastic conversations on this blog’s comment page last week, as readers came out of hiding to confess that they, too, had stopped using this or that piece of cutting-edge, up-to-date, hypermodern trash, and replaced it with some sturdy, elegant, user-friendly device from an earlier decade which works better and lacks the downsides of the newer item.
What, after all, defines a change as “progress”? There’s a wilderness of ambiguities hidden in that apparently simple word. The popular notion of progress presupposes that there’s an inherent dynamic to history, that things change, or tend to change, or at the very least ought to change, from worse to better over time.  That presupposition then gets flipped around into the even more dubious claim that just because something’s new, it must be better than whatever it replaced. Move from there to specific examples, and all of a sudden it’s necessary to deal with competing claims—if there are two hot new technologies on the market, is option A more progressive than option B, or vice versa? The answer, of course, is that whichever of them manages to elbow the other aside will be retroactively awarded the coveted title of the next step in the march of progress.
That was exactly the process by which the appropriate tech of the 1970s was shoved aside and buried in the memory hole of our culture. In its heyday, appropriate tech was as cutting-edge and progressive as anything you care to name, a rapidly advancing field pushed forward by brilliant young engineers and innovative startups, and it saw itself (and presented itself to the world) as the wave of the future. In the wake of the Reagan-Thatcher counterrevolution of the 1980s, though, it was retroactively stripped of its erstwhile status as an icon of progress and consigned to the dustbin of the past. Technologies that had been lauded in the media as brilliantly innovative in 1978 were thus being condemned in the same media as Luddite throwbacks by 1988. If that abrupt act of redefinition reminds any of my readers of the way history got rewritten in George Orwell’s 1984—“Oceania has never been allied with Eurasia” and the like—well, let’s just say the parallel was noticed at the time, too.
The same process on a much smaller scale can be traced with equal clarity in the replacement of the safety razor and shaving soap with the disposable razor and squirt-can shaving foam. In what sense is the latter, which wastes more resources and generates more trash in the process of giving users a worse shave at a higher price, more progressive than the former? Merely the fact that it’s been awarded that title by advertising and the media. If razor companies could make more money by reintroducing the Roman habit of scraping beard hairs off the face with a chunk of pumice, no doubt that would quickly be proclaimed as the last word in cutting-edge, up-to-date hypermodernity, too.
Behind the mythological image of the relentless and inevitable forward march of technology-in-the-singular in the grand cause of progress, in other words, lies a murky underworld of crass commercial motives and no-holds-barred struggles over which of the available technologies will get the funding and marketing that will define it as the next great step in progress. That’s as true of major technological programs as it is of shaving supplies. Some of my readers are old enough, as I am, to remember when supersonic airliners and undersea habitats were the next great steps in progress, until all of a sudden they weren’t.  We may not be all that far from the point at which space travel and nuclear power will go the way of Sealab and the Concorde.
In today’s industrial societies, we don’t talk about that. It’s practically taboo these days to mention the long, long list of waves of the future that abruptly stalled and rolled back out to sea without delivering on their promoters’ overblown promises. Remind people that the same rhetoric currently being used to prop up faith in space travel, nuclear power, or any of today’s other venerated icons of the religion of progress was lavished just as thickly on these earlier failures, and you can pretty much expect to have that comment shouted down as an irrelevancy if the other people in the conversation don’t simply turn their backs and pretend that they never heard you say anything at all.
They have to do something of the sort, because the alternative is to admit that what we call “progress” isn’t the impersonal, unstoppable force of nature that industrial culture’s ideology insists it must be. Pay attention to the grand technological projects that failed, compare them with those that are failing now, and it’s impossible to keep ignoring certain crucial if hugely unpopular points. To begin with, technological progress is a function of collective choices—do we fund Sealab or the Apollo program? Supersonic transports or urban light rail? Energy conservation and appropriate tech or an endless series of wars in the Middle East? No impersonal force makes those decisions; individuals and institutions make them, and then use the rhetoric of impersonal progress to cloak the political and financial agendas that guide the decision-making process.
What’s more, even if the industrial world chooses to invest its resources in a project, the laws of physics and economics determine whether the project is going to work. The Concorde is the poster child here, a technological success but an economic flop that never even managed to cover its operating costs. Like nuclear power, it was only viable given huge and continuing government subsidies, and since the strategic benefits Britain and France got from having Concordes in the air were nothing like so great as those they got from having an independent source of raw material for nuclear weapons, it’s not hard to see why the subsidies went where they did.
That is to say, when something is being lauded as the next great step forward in the glorious march of progress leading humanity to a better world, those who haven’t drunk themselves tipsy on folk mythology need to keep four things in mind. The first is that the next great step forward in the glorious march of progress (etc.) might not actually work when it’s brought down out of the billowing clouds of overheated rhetoric into the cold hard world of everyday life. The second is that even if it works, the next great step forward (etc.) may be a white elephant in economic terms, and survive only so long as it gets propped up by subsidies. The third is that even if it does make economic sense, the next great step (etc.) may be an inferior product, and do a less effective job of meeting human needs than whatever it’s supposed to replace. The fourth is that when it comes right down to it, to label something as the next great (etc.) is just a sales pitch, an overblown and increasingly trite way of saying “Buy this product!”
Those necessary critiques, in turn, are all implicit in the project of deliberate technological regress. Get past the thoughtstopping rhetoric that insists “you can’t turn back the clock”—to rephrase a comment of G.K. Chesterton’s, most people turn back the clock every fall, so that’s hardly a valid objection—and it becomes hard not to notice that “progress” is just a label for whatever choices happen to have been made by governments and corporations, with or without input from the rest of us. If we don’t like the choices that have been made for us in the name of progress, in turn, we can choose something else.
Now of course it’s possible to stuff that sort of thinking back into the straitjacket of progress, and claim that progress is chugging along just fine, and all we have to do is get it back on the proper track, or what have you. This is a very common sort of argument, and one that’s been used over and over again by critics of this or that candidate for the next (etc.). The problem with that argument, as I see it, is that it may occasionally win battles but it pretty consistently loses the war; by failing to challenge the folk mythology of progress and the agendas that are enshrined by that mythology, it guarantees that no matter what technology or policy or program gets put into place, it’ll end up leading to the same place as all the others before it, because it questions the means but forgets to question the goals.
That’s the trap hardwired into the contemporary faith in progress. Once you buy into the notion that the specific choices made by industrial societies over the last three centuries or so are something more than the projects that happened to win out in the struggle for wealth and power, once you let yourself believe that there’s a teleology to it all—that there’s some objectively definable goal called “progress” that all these choices did a better or worse job of furthering—you’ve just made it much harder to ask where this thing called “progress” is going. The word “progress,” remember, means going further in the same direction, and it’s precisely questions about the direction that industrial society is going that most need to be asked.
I’d like to suggest, in fact, that going further in the direction we’ve been going isn’t a particularly bright idea just now.  It isn’t even necessary to point to the more obviously self-destructive dimensions of business as usual. Look at any trend that affects your life right now, however global or local that trend may be, and extrapolate it out in a straight line indefinitely; that’s what going further in the same direction means. If that appeals to you, dear reader, then you’re certainly welcome to it.  I have to say it doesn’t do much for me.
It’s only from within the folk mythology of progress that we have no choice but to accept the endless prolongation of current trends. Right now, as individuals, we can choose to shrug and walk away from the latest hypermodern trash, and do something else instead. Later on, on the far side of the crisis of our time, it may be possible to take the same logic further, and make deliberate technological regress a recognized policy option for organizations, communities, and whole nations—but that will depend on whether individuals do the thing first, and demonstrate to everyone else that it’s a workable option. In next week’s post, we’ll talk more about where that strategy might lead.

The Butlerian Carnival

Wed, 2015-02-11 16:29
Over the last week or so, I’ve heard from a remarkable number of people who feel that a major crisis is in the offing. The people in question don’t know each other, many of them have even less contact with the mass media than I do, and the sense they’ve tried to express to me is inchoate enough that they’ve been left fumbling for words, but they all end up reaching for the same metaphors: that something in the air just now seems reminiscent of the American colonies in 1775, France in 1789, America in 1860, Europe in 1914, or the world in 1939: a sense of being poised on the brink of convulsive change, with the sound of gunfire and marching boots coming ever more clearly from the dimly seen abyss ahead.
It’s not an unreasonable feeling, all things considered. In Washington DC, Obama’s flunkies are beating the war drums over Ukraine, threatening to send shipments of allegedly “defensive” weapons to join the mercenaries and military advisors we’ve already not-so-covertly got over there. Russian officials have responded to American saber-rattling by stating flatly that a US decision to arm Kiev will be the signal for all-out war. The current Ukrainian regime, installed by a US-sponsored coup and backed by NATO, means to Russia precisely what a hostile Canadian government installed by a Chinese-sponsored coup and backed by the People’s Liberation Army would mean to the United States; if Obama’s trademark cluelessness leads him to ignore that far from minor point and decide that the Russians are bluffing, we could be facing a European war within weeks.
Head south and west from the fighting around Donetsk, and another flashpoint is heating up toward an explosion of its own just now. Yes, that would be Greece, where the new Syriza government has refused to back down from the promises that got it into office: promises that center on the rejection of the so-called “austerity” policies that have all but destroyed the Greek economy since they were imposed in 2009.  This shouldn’t be news to anyone; those same policies, though they’ve been praised to the skies by neoliberal economists for decades now as a guaranteed ticket to prosperity, have had precisely the opposite effect in every single country where they’ve been put in place.
Despite that track record of unbroken failure, the EU—in particular, Germany, which has benefited handsomely from the gutting of southern European economies—continues to insist that Greece must accept what amounts to a perpetual state of debt peonage. The Greek defense minister noted in response in a recent speech that if Europe isn’t willing to cut a deal, other nations might well do so. He’s quite correct; it’s probably a safe bet that cold-eyed men in Moscow and Beijing are busy right now figuring out how best to step through the window of opportunity the EU is flinging open for them. If they do so—well, I’ll leave it to my readers to consider how the US is likely to respond to the threat of Russian air and naval bases in Greece, which would be capable of projecting power anywhere in the eastern and central Mediterranean basin. Here again, war is a likely outcome; I hope that the Greek government is braced for an attempt at regime change.
That is to say, the decline and fall of industrial civilization is proceeding in the normal way, at pretty much the normal pace. The thermodynamic foundations tipped over into decline first, as stocks of cheap abundant fossil fuels depleted steadily and the gap had to be filled by costly and much less abundant replacements, driving down net energy; the economy went next, as more and more real wealth had to be pulled out of all other economic activities to keep the energy supply more or less steady, until demand destruction cut in and made that increasingly frantic effort moot; now a global political and military superstructure dependent on cheap abundant fossil fuels, and on the economic arrangement that all of that surplus energy made possible, is cracking at the seams.
One feature of times like these is that the number of people who can have an influence on the immediate outcome declines steadily as crisis approaches. In the years leading up to 1914, for example, a vast number of people contributed to the rising spiral of conflict between the aging British Empire and its German rival, but the closer war came, the narrower the circle of decision-makers became, until a handful of politicians in Germany, France, and Britain had the fate of Europe in their hands. A few more bad decisions, and the situation was no longer under anybody’s control; thereafter, the only option left was to let the juggernaut of the First World War roll mindlessly onward to its conclusion.
In the same way, as recently as the 1980s, many people in the United States and elsewhere had some influence on how the industrial age would end; unfortunately most of them backed politicians who cashed in the resources that could have built a better future on one last round of absurd extravagance, and a whole landscape of possibilities went by the boards. Step by step, as the United States backed itself further and further into a morass of short-term gimmicks with ghastly long-term consequences, the number of people who have had any influence on the trajectory we’re on has narrowed steadily, and as we approach what may turn out to be the defining crisis of our time, a handful of politicians in a handful of capitals are left to make the last decisions that can shape the situation in any way at all, before the tanks begin to roll and the fighter-bombers rise up from their runways.
Out here on the fringes of the collective conversation of our time, where archdruids lurk and heresies get uttered, the opportunity to shape events as they happen is a very rare thing. Our role, rather, is to set agendas for the future, to take ideas that are unthinkable in the mainstream today and prepare them for their future role as the conventional wisdom of eras that haven’t dawned yet. Every phrase on the lips of today’s practical men of affairs, after all, was once a crazy notion taken seriously only by the lunatic fringe—yes, that includes democracy, free-market capitalism, and all the other shibboleths of our age. 
With that in mind, while we wait to see whether today’s practical men of affairs stumble into war the way they did in 1914, I propose to shift gears and talk about something else—something that may seem whimsical, even pointless, in the light of the grim martial realities just discussed. It’s neither whimsical nor pointless, as it happens, but the implications may take a little while to dawn even on those of my readers who’ve been following the last few years of discussions most closely. Let’s begin with a handful of data points.
Item: Britain’s largest bookseller recently noted that sales of the Kindle e-book reader have dropped like a rock in recent months, while sales of old-fashioned printed books are up. Here in the more gizmocentric USA, e-books retain more of their erstwhile popularity, but the bloom is off the rose; among the young and hip, it’s not hard at all to find people who got rid of their book collections in a rush of enthusiasm when e-books came out, regretted the action after it was too late, and now are slowly restocking their bookshelves while their e-book readers collect cobwebs or, at best, find use as a convenience for travel and the like.
Item: more generally, a good many of the hottest new trends in popular culture aren’t new trends at all—they’re old trends revived, in many cases, by people who weren’t even alive to see them the first time around. Kurt B. Reighley’s lively guide The United States of Americana was the first, and remains the best, introduction to the phenomenon, one that embraces everything from burlesque shows and homebrewed bitters to backyard chickens and the revival of Victorian martial arts. One pervasive thread that runs through the wild diversity of this emerging subculture is the simple recognition that many of these older things are better, in straightforwardly measurable senses, than their shiny modern mass-marketed not-quite-equivalents.
Item: within that subculture, a small but steadily growing number of people have taken the principle to its logical extreme and adopted the lifestyles and furnishings of an earlier decade wholesale in their personal lives. The 1950s are a common target, and so far as I know, adopters of 1950s culture are the furthest along the process of turning into a community, but other decades are increasingly finding the same kind of welcome among those less than impressed by what today’s society has on offer. Meanwhile, the reenactment scene has expanded spectacularly in recent years from the standard hearty fare of Civil War regiments and the neo-medievalism of the Society for Creative Anachronism to embrace almost any historical period you care to name. These aren’t merely dress-up games; go to a buckskinner’s rendezvous or an outdoor SCA event, for example, and you’re as likely as not to see handspinners turning wool into yarn with drop spindles, a blacksmith or two laboring over a portable forge, and the like.
Other examples of the same broad phenomenon could be added to the list, but these will do for now. I’m well aware, of course, that most people—even most of my readers—will have dismissed the things just listed as bizarre personal eccentricities, right up there with the goldfish-swallowing and flagpole-sitting of an earlier era. I’d encourage those of my readers who had that reaction to stop, take a second look, and tease out the mental automatisms that make that dismissal so automatic a part of today’s conventional wisdom. Once that’s done, a third look might well be in order, because the phenomenon sketched out here marks a shift of immense importance for our future.
For well over two centuries now, since it first emerged as the crackpot belief system of a handful of intellectuals on the outer fringes of their culture, the modern ideology of progress has taken it as given that new things were by definition better than whatever they replaced.  That assumption stands at the heart of contemporary industrial civilization’s childlike trust in the irreversible cumulative march of progress toward a future among the stars. Finding ways to defend that belief even when it obviously wasn’t true—when the latest, shiniest products of progress turned out to be worse in every meaningful sense than the older products they elbowed out of the way—was among the great growth industries of the 20th century; even so, there were plenty of cases where progress really did seem to measure up to its billing. Given the steady increases of energy per capita in the world’s industrial nations over the last century or so, that was a predictable outcome.
The difficulty, of course, is that the number of cases where new things really are better than what they replace has been shrinking steadily in recent decades, while the number of cases where old products are quite simply better than their current equivalents—easier to use, more effective, more comfortable, less prone to break, less burdened with unwanted side effects and awkward features, and so on—has been steadily rising. Back behind the myth of progress, like the little man behind the curtain in The Wizard of Oz, stand two unpalatable and usually unmentioned realities. The first is that profits, not progress, determine which products get marketed and which get roundfiled; the second is that making a cheaper, shoddier product and using advertising gimmicks to sell it anyway has been the standard marketing strategy across a vast range of American businesses for years now.
More generally, believers in progress used to take it for granted that progress would sooner or later bring about a world where everyone would live exciting, fulfilling lives brimful of miracle products and marvelous experiences. You still hear that sort of talk from the faithful now and then these days, but it’s coming to sound a lot like all that talk about the glorious worker’s paradise of the future did right around the time the Iron Curtain came down for good. In both cases, the future that was promised didn’t have much in common with the one that actually showed up. The one we got doesn’t have some of the nastier features of the one the former Soviet Union and its satellites produced—well, not yet, at least—but the glorious consumer’s paradise described in such lavish terms a few decades back got lost on the way to the spaceport, and what we got instead was a bleak landscape of decaying infrastructure, abandoned factories, prostituted media, and steadily declining standards of living for everyone outside the narrowing circle of the privileged, with the remnants of our once-vital democratic institutions hanging above it all like rotting scarecrows silhouetted against a darkening sky.
In place of those exciting, fulfilling lives mentioned above, furthermore, we got the monotony and stress of long commutes, cubicle farms, and would-you-like-fries-with-that for the slowly shrinking fraction of our population who can find a job at all. The Onion, with its usual flair for packaging unpalatable realities in the form of deadpan humor, nailed it a few days ago with a faux health-news article announcing that the best thing office workers could do for their health is stand up at their desks, leave the office, and never go back. Joke or not, it’s not bad advice; if you have a full-time job in today’s America, the average medieval peasant had a less stressful job environment and more days off than you do; he also kept a larger fraction of the product of his labor than you’ll ever see.
Then, of course, if you’re like most Americans, you’ll numb yourself once you get home by flopping down on the sofa and spending most of your remaining waking hours staring at little colored pictures on a glass screen. It’s remarkable how many people get confused about what this action really entails. They insist that they’re experiencing distant places, traveling in worlds of pure imagination, and so on through the whole litany of self-glorifying drivel the mass media likes to employ in its own praise. Let us please be real: when you watch a program about the Amazon rain forest, you’re not experiencing the Amazon rain forest; you’re experiencing colored pictures on a screen, and you’re only getting as much of the experience as fits through the narrow lens of a video camera and the even narrower filter of the production process. The difference between experiencing something and watching it on TV or the internet, that is to say, is precisely the same as the difference between making love and watching pornography; in each case, the latter is a very poor substitute for the real thing.
For most people in today’s America, in other words, the closest approach to the glorious consumer’s paradise of the future they can expect to get is eight hours a day, five days a week of mindless, monotonous work under the constant pressure of management efficiency experts, if they’re lucky enough to get a job at all, with anything up to a couple of additional hours commuting and any off-book hours the employer happens to choose to demand from them thrown into the deal, in order to get a paycheck that buys a little less each month—inflation is under control, the government insists, but prices somehow keep going up—of products that get more cheaply made, more likely to be riddled with defects, and more likely to pose a serious threat to the health and well-being of their users, with every passing year. Then they can go home and numb their nervous systems with those little colored pictures on the screen, showing them bland little snippets of experiences they will never have, wedged in there between the advertising.
That’s the world that progress has made. That’s the shining future that resulted from all those centuries of scientific research and technological tinkering, all the genius and hard work and sacrifice that have gone into the project of progress. Of course there’s more to the consequences of progress than that; progress has saved quite a few children from infectious diseases, and laced the environment with so many toxic wastes that childhood cancer, all but unheard of in 1850, is a routine event today; it’s made impressive contributions to human welfare, while flooding the atmosphere with greenhouse gases that will soon make far more impressive contributions to human suffering and death—well, I could go on along these lines for quite a while. True believers in the ideology of perpetual progress like to insist that all the good things ought to be credited to progress while all the bad things ought to be blamed on something else, but that’s not so plausible an article of faith as it once was, and it bids fair to become a great deal less common as the downsides of progress become more and more difficult to ignore.
The data points I noted earlier in this week’s post, I’ve come to believe, are symptoms of that change, the first stirrings of wind that tell of the storm to come. People searching for a better way of living than the one our society offers these days are turning to the actual past, rather than to some imaginary future, in that quest. That’s the immense shift I mentioned earlier. What makes it even more momentous is that by and large, it’s not being done in the sort of grim Puritanical spirit of humorless renunciation that today’s popular culture expects from those who want something other than what the consumer economy has on offer. It’s being done, rather, in a spirit of celebration.
One of my readers responded to my post two weeks ago on deliberate technological regress by suggesting that I was proposing a Butlerian jihad of sorts. (Those of my readers who don’t get the reference should pick up a copy of Frank Herbert’s iconic SF novel Dune and read it.) I demurred, for two reasons. First, the Butlerian jihad in Herbert’s novel was a revolt against computer technology, and I see no need for that; once the falling cost of human labor intersects the rising cost of energy and technology, and it becomes cheaper to hire file clerks and accountants than to maintain the gargantuan industrial machine that keeps computer technology available, computers will go away, or linger as a legacy technology for a narrowing range of special purposes until the hardware finally burns out.
The second reason, though, is the more important. I’m not a fan of jihads, or of holy wars of any flavor; history shows all too well that when you mix politics and violence with religion, any actual religious content vanishes away, leaving its castoff garments to cover the naked rule of force and fraud. If you want people to embrace a new way of looking at things, furthermore, violence, threats, and abusive language don’t work, and it’s even less effective to offer that new way as a ticket to virtuous misery, along the lines of the Puritanical spirit noted above. That’s why so much of the green-lifestyle propaganda of the last thirty years has done so little good—so much of it has been pitched as a way to suffer self-righteously for the good of Gaia, and while that approach appeals to a certain number of wannabe martyrs, that’s not a large enough fraction of the population to matter.
The people who are ditching their Kindles and savoring books as physical objects, brewing their own beer and resurrecting other old arts and crafts, reformatting their lives in the modes of a past decade, or spending their spare time reconnecting with the customs and technologies of an earlier time—these people aren’t doing any of those things out of some passion for self-denial. They’re doing them because these things bring them delights that the shoddy mass-produced lifestyles of the consumer economy can’t match. What these first stirrings suggest to me is that the way forward isn’t a Butlerian jihad, but a Butlerian carnival—a sensuous celebration of the living world outside the cubicle farms and the glass screens, which will inevitably draw most of its raw materials from eras, technologies, and customs of the past, which don’t require the extravagant energy and resource inputs that the modern consumer economy demands, and so will be better suited to a future defined by scarce energy and resources.
The Butlerian carnival isn’t the only way to approach the deliberate technological regression we need to carry out in the decades ahead, but it’s an important one. In upcoming posts, I’ll talk more about how this and other avenues to the same goal might be used to get through the mess immediately ahead, and start laying foundations for a future on the far side of the crises of our time.

As Night Closes In

Wed, 2015-02-04 17:47
I was saddened to learn a few days ago, via a phone call from a fellow author, that William R. Catton Jr. died early last month, just short of his 89th birthday. Some of my readers will have no idea who he was; others may dimly recall that I’ve mentioned him and his most important book, Overshoot, repeatedly in these essays. Those who’ve taken the time to read the book just named may be wondering why none of the sites in the peak oil blogosphere has put up an obituary, or even noted the man’s passing. I don’t happen to know the answer to that last question, though I have my suspicions.
I encountered Overshoot for the first time in a college bookstore in Bellingham, Washington in 1983. Red letters on a stark yellow spine spelled out the title, a word I already knew from my classes in ecology and systems theory; I pulled it off the shelf, and found the future staring me in the face. This is what’s on the front cover below the title:
carrying capacity: maximum permanently supportable load.
cornucopian myth: euphoric belief in limitless resources.
drawdown: stealing resources from the future.
cargoism: delusion that technology will always save us from
overshoot: growth beyond an area’s carrying capacity, leading to
crash: die-off.
If you want to know where I got the core ideas I’ve been exploring in these essays for the last eight-going-on-nine years, in other words, now you know. I still have that copy of Overshoot; it’s sitting on the desk in front of me right now, reminding me yet again just how many chances we had to turn away from the bleak future that’s closing in around us now, like the night at the end of a long day.
Plenty of books in the 1970s and early 1980s applied the lessons of ecology to the future of industrial civilization and picked up at least part of the bad news that results. Overshoot was arguably the best of the lot, but it was pretty much guaranteed to land even deeper in the memory hole than the others. The difficulty was that Catton’s book didn’t pander to the standard mythologies that still beset any attempt to make sense of the predicament we’ve made for ourselves; it provided no encouragement to what he called cargoism, the claim that technological progress will inevitably allow us to have our planet and eat it too, without falling off the other side of the balance into the sort of apocalyptic daydreams that Hollywood loves to make into bad movies. Instead, in calm, crisp, thoughtful prose, he explained how industrial civilization was cutting its own throat, how far past the point of no return we’d already gone, and what had to be done in order to salvage anything from the approaching wreck.
As I noted in a post here in 2011, I had the chance to meet Catton at an ASPO conference, and tried to give him some idea of how much his book had meant to me. I did my best not to act like a fourteen-year-old fan meeting a rock star, but I’m by no means sure that I succeeded. We talked for fifteen minutes over dinner; he was very gracious; then things moved on, each of us left the conference to carry on with our lives, and now he’s gone. As the old song says, that’s the way it goes.
There’s much more that could be said about William Catton, but that task should probably be left for someone who knew the man as a teacher, a scholar, and a human being. I didn’t; except for that one fifteen-minute conversation, I knew him solely as the mind behind one of the books that helped me make sense of the world, and then kept me going on the long desert journey through the Reagan era, when most of those who claimed to be environmentalists over the previous decade cashed in their ideals and waved around the cornucopian myth as their excuse for that act. Thus I’m simply going to urge all of my readers who haven’t yet read Overshoot to do so as soon as possible, even if they have to crawl on their bare hands and knees over abandoned fracking equipment to get a copy. Having said that, I’d like to go on to the sort of tribute I think he would have appreciated most: an attempt to take certain of his ideas a little further than he did.
The core of Overshoot, which is also the core of the entire world of appropriate technology and green alternatives that got shot through the head and shoved into an unmarked grave in the Reagan years, is the recognition that the principles of ecology apply to industrial society just as much as they do to other communities of living things. It’s odd, all things considered, that this is such a controversial proposal. Most of us have no trouble grasping the fact that the law of gravity affects human beings the same way it affects rocks; most of us understand that other laws of nature really do apply to us; but quite a few of us seem to be incapable of extending that same sensible reasoning to one particular set of laws, the ones that govern how communities of living things relate to their environments.
If people treated gravity the way they treat ecology, you could visit a news website any day of the week and read someone insisting with a straight face that while it’s true that rocks fall down when dropped, human beings don’t—no, no, they fall straight up into the sky, and anyone who thinks otherwise is so obviously wrong that there’s no point even discussing the matter. That degree of absurdity appears every single day in the American media, and in ordinary conversations as well, whenever ecological issues come up. Suggest that a finite planet must by definition contain a finite amount of fossil fuels, that dumping billions of tons of gaseous trash into the air every single year for centuries might change the way that the atmosphere retains heat, or that the law of diminishing returns might apply to technology the way it applies to everything else, and you can pretty much count on being shouted down by those who, for all practical purposes, might as well believe that the world is flat.
Still, as part of the ongoing voyage into the unspeakable in which this blog is currently engaged, I’d like to propose that, in fact, human societies are as subject to the laws of ecology as they are to every other dimension of natural law. That act of intellectual heresy implies certain conclusions that are acutely unwelcome in most circles just now; still, as my regular readers will have noticed long since, that’s just one of the services this blog offers.
Let’s start with the basics. Every ecosystem, in thermodynamic terms, is a process by which relatively concentrated energy is dispersed into diffuse background heat. Here on Earth, at least, the concentrated energy mostly comes from the Sun, in the form of solar radiation—there are a few ecosystems, in deep oceans and underground, that get their energy from chemical reactions driven by the Earth’s internal heat instead. Ilya Prigogine showed some decades back that the flow of energy through a system of this sort tends to increase the complexity of the system; Jeremy England, an MIT physicist, has recently shown that the same process accounts neatly for the origin of life itself. The steady flow of energy from source to sink is the foundation on which everything else rests.
The complexity of the system, in turn, is limited by the rate at which energy flows through the system, and this in turn depends on the difference in concentration between the energy that enters the system, on the one hand, and the background into which waste heat diffuses when it leaves the system, on the other. That shouldn’t be a difficult concept to grasp. Not only is it basic thermodynamics, it’s basic physics—it’s precisely equivalent, in fact, to pointing out that the rate at which water flows through any section of a stream depends on the difference in height between the place where the water flows into that section and the place where it flows out.
Simple as it is, it’s a point that an astonishing number of people—including some who are scientifically literate—routinely miss. A while back on this blog, for example, I noted that one of the core reasons you can’t power a modern industrial civilization on solar energy is that sunlight is relatively diffuse as an energy source, compared to the extremely concentrated energy we get from fossil fuels. I still field rants from people insisting that this is utter hogwash, since photons have exactly the same amount of energy they did when they left the Sun, and so the energy they carry is just as concentrated as it was when it left the Sun. You’ll notice, though, that if this were the only variable that mattered, Neptune would be just as warm as Mercury, since each of the photons hitting the one planet packs on average the same energetic punch as those that hit the other.
It’s hard to think of a better example of the blindness to whole systems that’s pandemic in today’s geek culture. Obviously, the difference between the temperatures of Neptune and Mercury isn’t a function of the energy of individual photons hitting the two worlds; it’s a function of differing concentrations of photons—the number of them, let’s say, hitting a square meter of each planet’s surface. This is also one of the two figures that matter when we’re talking about solar energy here on Earth. The other? That’s the background heat into which waste energy disperses when the system, eco- or solar, is done with it. On the broadest scale, that’s deep space, but ecosystems don’t funnel their waste heat straight into orbit, you know. Rather, they diffuse it into the ambient temperature at whatever height above or below sea level, and whatever latitude closer or further from the equator, they happen to be—and since that’s heated by the Sun, too, the difference between input and output concentrations isn’t very substantial.
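The Mercury-and-Neptune comparison can be checked with nothing more than the inverse-square law: the number of photons crossing a square meter falls off with the square of distance from the Sun. A back-of-the-envelope sketch, using the planets’ mean orbital distances:

```python
# Solar flux (photons per square meter) falls off with the square of
# distance from the Sun, per the inverse-square law.
# Mean orbital distances in astronomical units (Earth = 1 AU).
MERCURY_AU = 0.387
NEPTUNE_AU = 30.07

# Flux relative to what Earth receives at 1 AU.
flux_mercury = 1.0 / MERCURY_AU ** 2
flux_neptune = 1.0 / NEPTUNE_AU ** 2

print(f"Mercury receives {flux_mercury:.1f}x Earth's solar flux")
print(f"Neptune receives {flux_neptune:.5f}x Earth's solar flux")
print(f"Mercury/Neptune flux ratio: {flux_mercury / flux_neptune:.0f}")
```

Each photon carries the same punch at either planet; the six-thousandfold difference in how many of them arrive per square meter is what makes Mercury an oven and Neptune a deep freeze.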
Nature has done astonishing things with that very modest difference in concentration. People who insist that photosynthesis is horribly inefficient, and of course we can improve its efficiency, are missing a crucial point: something like half the energy that reaches the leaves of a green plant from the Sun is put to work lifting water up from the roots by an ingenious form of evaporative pumping, in which water sucked out through the leaf pores as vapor draws up more water through a network of tiny tubes in the plant’s stems. Another few per cent goes into the manufacture of sugars by photosynthesis, and a variety of minor processes, such as the chemical reactions that ripen fruit, also depend to some extent on light or heat from the Sun; all told, a green plant is probably about as efficient in its total use of solar energy as the laws of thermodynamics will permit. 
What’s more, the Earth’s ecosystems take the energy that flows through the green engines of plant life and put it to work in an extraordinary diversity of ways. The water pumped into the sky by what botanists call evapotranspiration—that’s the evaporative pumping I mentioned a moment ago—plays critical roles in local, regional, and global water cycles. The production of sugars to store solar energy in chemical form kicks off an even more intricate set of changes, as the plant’s cells are eaten by something, which is eaten by something, and so on through the lively but precise dance of the food web. Eventually all the energy the original plant scooped up from the Sun turns into diffuse waste heat and permeates slowly up through the atmosphere to its ultimate destiny warming some corner of deep space a bit above absolute zero, but by the time it gets there, it’s usually had quite a ride.
That said, there are hard upper limits to the complexity of the ecosystem that these intricate processes can support. You can see that clearly enough by comparing a tropical rain forest to a polar tundra. The two environments may have approximately equal amounts of precipitation over the course of a year; they may have an equally rich or poor supply of nutrients in the soil; even so, the tropical rain forest can easily support fifteen or twenty thousand species of plants and animals, and the tundra will be lucky to support a few hundred. Why? The same reason Mercury is warmer than Neptune: the rate at which photons from the Sun arrive in each place per square meter of surface.
Near the equator, the Sun’s rays fall almost vertically.  Close to the poles, since the Earth is round, the Sun’s rays come in at a sharp angle, and thus are spread out over more surface area. The ambient temperature’s quite a bit warmer in the rain forest than it is on the tundra, but because the vast heat engine we call the atmosphere pumps heat from the equator to the poles, the difference in ambient temperature is not as great as the difference in solar input per square meter. Thus ecosystems near the equator have a greater difference in energy concentration between input and output than those near the poles, and the complexity of the two systems varies accordingly.
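The geometry is simple enough to put rough numbers on. At equinox, the angle between the Sun’s rays and the local vertical is approximately the latitude itself, so the flux reaching a square meter of ground scales with the cosine of latitude. A quick sketch (ignoring atmosphere, seasons, and day length, so the figures are only illustrative):

```python
import math

# Sunlight arriving at an angle from the vertical spreads over more
# ground, so flux per square meter of surface scales with cos(angle).
# At equinox, that angle is roughly the latitude.
for latitude in (0, 30, 60, 80):
    relative_flux = math.cos(math.radians(latitude))
    print(f"{latitude:2d} deg latitude: {relative_flux:.2f}x equatorial flux")
```

By this crude measure, ground at 60 degrees latitude collects half the flux of ground at the equator, and ground at 80 degrees collects less than a fifth, which is the concentration differential the paragraph above describes.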
All this should be common knowledge. Of course it isn’t, because the industrial world’s notions of education consistently ignore what William Catton called “the processes that matter”—that is, the fundamental laws of ecology that frame our existence on this planet—and approach a great many of those subjects that do make it into the curriculum in ways that encourage the most embarrassing sort of ignorance about the natural processes that keep us all alive. Down the road a bit, we’ll be discussing that in much more detail. For now, though, I want to take the points just made and apply them systematically, in much the way Catton did, to the predicament of industrial civilization.
A human society is an ecosystem.  Like any other ecosystem, it depends for its existence on flows of energy, and as with any other ecosystem, the upper limit on its complexity depends ultimately on the difference in concentration between the energy that enters it and the background into which its waste heat disperses. (This last point is a corollary of White’s Law, one of the fundamental principles of human ecology, which holds that a society’s economic development is directly proportional to its consumption of energy per capita.)  Until the beginning of the industrial revolution, that upper limit was not much higher than the upper limit of complexity in other ecosystems, since human ecosystems drew most of their energy from the same source as nonhuman ones: sunlight falling on green plants.  As human societies figured out how to tap other flows of solar energy—windpower to drive windmills and send ships coursing over the seas, water power to turn mills, and so on—that upper limit crept higher, but not dramatically so.
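White’s Law is often summarized as a simple proportionality: development C varies with energy harnessed per capita E times the efficiency T of the technology putting that energy to work, C ~ E × T. The toy function below is only a restatement of that proportionality, with made-up units, not a model drawn from Catton or White:

```python
# A toy restatement of White's Law: development index C is
# proportional to energy per capita (E) times the efficiency of the
# technology using it (T). Units and values here are arbitrary.
def development_index(energy_per_capita, tech_efficiency):
    return energy_per_capita * tech_efficiency

baseline = development_index(energy_per_capita=100.0, tech_efficiency=0.2)
more_energy = development_index(energy_per_capita=200.0, tech_efficiency=0.2)

# Doubling per-capita energy at fixed efficiency doubles the index;
# doubling efficiency at fixed energy would do the same.
print(baseline, more_energy)
```

The point the proportionality makes is the one the paragraph above makes in prose: hold per-capita energy flat, and only gains in efficiency can raise complexity; cut per-capita energy, and complexity has to fall unless efficiency rises to match.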
The discoveries that made it possible to turn fossil fuels into mechanical energy transformed that equation completely. The geological processes that stockpiled half a billion years of sunlight into coal, oil, and natural gas boosted the concentration of the energy inputs available to industrial societies by an almost unimaginable factor, without warming the ambient temperature of the planet more than a few degrees, and the huge differentials in energy concentration that resulted drove an equally unimaginable increase in complexity. Choose any measure of complexity you wish—number of discrete occupational categories, average number of human beings involved in the production, distribution, and consumption of any given good or service, or what have you—and in the wake of the industrial revolution, it soared right off the charts. Thermodynamically, that’s exactly what you’d expect.
The difference in energy concentration between input and output, it bears repeating, defines the upper limit of complexity. Other variables determine whether or not the system in question will achieve that upper limit. In the ecosystems we call human societies, knowledge is one of those other variables. If you have a highly concentrated energy source and don’t yet know how to use it efficiently, your society isn’t going to become as complex as it otherwise could. Over the three centuries of industrialization, as a result, the production of useful knowledge was a winning strategy, since it allowed industrial societies to rise steadily toward the upper limit of complexity defined by the concentration differential. The limit was never reached—the law of diminishing returns saw to that—and so, inevitably, industrial societies ended up believing that knowledge all by itself was capable of increasing the complexity of the human ecosystem. Since there’s no upper limit to knowledge, in turn, that belief system drove what Catton called the cornucopian myth, the delusion that there would always be enough resources if only the stock of knowledge increased quickly enough.
That belief only seemed to work, though, as long as the concentration differential between energy inputs and the background remained very high. Once easily accessible fossil fuels started to become scarce, and more and more energy and other resources had to be invested in the extraction of what remained, problems started to crop up. Tar sands and oil shales in their natural form are not as concentrated an energy source as light sweet crude—once they’re refined, sure, the differences are minimal, but a whole system analysis of energy concentration has to start at the moment each energy source enters the system. Take a cubic yard of tar sand fresh from the pit mine, with the sand still in it, or a cubic yard of oil shale with the oil still trapped in the rock, and you’ve simply got less energy per unit volume than you do if you’ve got a cubic yard of light sweet crude fresh from the well, or even a cubic yard of good permeable sandstone with light sweet crude oozing out of every pore.
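The whole-system accounting described above can be made concrete with a deliberately simple net-energy sketch. Every number below is a made-up placeholder chosen for illustration, not a measured value for any real deposit:

```python
# Illustrative only: the figures are placeholders, not measured data.
# The point is that energy per unit volume at the moment a resource
# enters the system, minus the energy spent extracting and upgrading
# it, is what is actually available to support complexity.

def net_energy(gross_energy_gj, extraction_energy_gj):
    """Energy left over after paying the energy cost of extraction."""
    return gross_energy_gj - extraction_energy_gj

# Hypothetical gross energy content and extraction cost per unit
# volume, in gigajoules.
conventional_crude = net_energy(gross_energy_gj=35.0, extraction_energy_gj=2.0)
tar_sand = net_energy(gross_energy_gj=6.0, extraction_energy_gj=3.0)

print(f"Net energy, conventional crude: {conventional_crude:.0f} GJ/unit")
print(f"Net energy, tar sand:           {tar_sand:.0f} GJ/unit")
```

Refining narrows the difference between the finished fuels, as the text notes, but the net figure at the point of extraction is the one that sets the upper limit of complexity downstream.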
It’s an article of faith in contemporary culture that such differences don’t matter, but that’s just another aspect of our cornucopian myth. The energy needed to get the sand out of the tar sands or the oil out of the shale oil has to come from somewhere, and that energy, in turn, is not available for other uses. The result, however you slice it conceptually, is that the upper limit of complexity begins moving down. That sounds abstract, but it adds up to a great deal of very concrete misery, because as already noted, the complexity of a society determines such things as the number of different occupational specialties it can support, the number of employees who are involved in the production and distribution of a given good or service, and so on. There’s a useful phrase for a sustained contraction in the usual measures of complexity in a human ecosystem: “economic depression.”
The economic troubles that are shaking the industrial world more and more often these days, in other words, are symptoms of a disastrous mismatch between the level of complexity that our remaining concentration differential can support, and the level of complexity that our preferred ideologies insist we ought to have. As those two things collide, there’s no question which of them is going to win. Adding to our total stock of knowledge won’t change that result, since knowledge is a necessary condition for economic expansion but not a sufficient one: if the upper limit of complexity set by the laws of thermodynamics drops below the level that your knowledge base would otherwise support, further additions to the knowledge base simply mean that there will be a growing number of things that people know how to do in theory, but that nobody has the resources to do in practice.
Knowledge, in other words, is not a magic wand, a surrogate messiah, or a source of miracles. It can open the way to exploiting energy more efficiently than otherwise, and it can figure out how to use energy resources that were not previously being used at all, but it can’t conjure energy out of thin air. Even if the energy resources are there, for that matter, if other factors prevent them from being used, the knowledge of how they might be used offers no consolation—quite the contrary.
That latter point, I think, sums up the tragedy of William Catton’s career. He knew, and could explain with great clarity, why industrialism would bring about its own downfall, and what could be done to salvage something from its wreck. That knowledge, however, was not enough to make things happen; only a few people ever listened, most of them promptly plugged their ears and started chanting “La, la, la, I can’t hear you” once Reagan made that fashionable, and the actions that might have spared all of us a vast amount of misery never happened. When I spoke to him in 2011, he was perfectly aware that his life’s work had done essentially nothing to turn industrial society aside from its rush toward the abyss. That’s got to be a bitter thing to contemplate in your final hours, and I hope his thoughts were on something else last month as the night closed in at last.

The One Way Forward

Wed, 2015-01-28 16:19
All things considered, 2015 just isn’t shaping up to be a good year for believers in business as usual. Since last week’s post here on The Archdruid Report, the anti-austerity party Syriza has swept the Greek elections, to the enthusiastic cheers of similar parties all over Europe and the discomfiture of the Brussels hierarchy. The latter have no one to blame for this turn of events but themselves; for more than a decade now, EU policies have effectively put sheltering banks and bondholders from the healthy discipline of the market ahead of all other considerations, including the economic survival of entire nations. It should be no surprise to anyone that this wasn’t an approach with a long shelf life.
Meanwhile, the fracking bust continues unabated. The number of drilling rigs at work in American oilfields continues to drop vertically from week to week, layoffs in the nation’s various oil patches are picking up speed, and the price of oil remains down at levels that make further fracking a welcome mat for the local bankruptcy judge. Those media pundits who are still talking the fracking industry’s book keep insisting that the dropping price of oil proves that they were right and those dratted heretics who talk of peak oil must be wrong, but somehow those pundits never get around to explaining why iron ore, copper, and most other major commodities are dropping in price even faster than crude oil, nor why demand for petroleum products here in the US has been declining steadily as well.
The fact of the matter is that an industrial economy built to run on cheap conventional oil can’t run on expensive oil for long without running itself into the ground. Since 2008, the world’s industrial nations have tried to make up the difference by flooding their economies with cheap credit, in the hope that this would somehow make up for the sharply increased amounts of real wealth that have had to be diverted from other purposes into the struggle to keep liquid fuels flowing at their peak levels. Now, though, the laws of economics have called their bluff; the wheels are coming off one national economy after another, and the price of oil (and all those other commodities) has dropped to levels that won’t cover the costs of fracked oil, tar sands, and the like, because all those frantic attempts to externalize the costs of energy production just meant that the whole global economy took the hit.
Now of course this isn’t how governments and the media are spinning the emerging crisis. For that matter, there’s no shortage of people outside the corridors of power, or for that matter of punditry, who ignore the general collapse of commodity prices, fixate on oil outside of the broader context of resource depletion in general, and insist that the change in the price of oil must be an act of economic warfare, or what have you. It’s a logic that readers of this blog will have seen deployed many times in the past: whatever happens, it must have been decided and carried out by human beings. An astonishing number of people these days seem unable to imagine the possibility that such wholly impersonal factors as the laws of economics, geology, and thermodynamics could make things happen all by themselves.
The problem we face now is precisely that the unimaginable is now our reality. For just that little bit too long, too many people have insisted that we didn’t need to worry about the absurdity of pursuing limitless growth on a finite and fragile planet, that “they’ll think of something,” or that chattering on internet forums about this or that or the other piece of technological vaporware was doing something concrete about our species’ imminent collision with the limits to growth. For just that little bit too long, not enough people were willing to do anything that mattered, and now impersonal factors have climbed into the driver’s seat, having mugged all seven billion of us and shoved us into the trunk.
As I noted in last week’s post, that puts hard limits on what can be done in the short term. In all probability, at this stage of the game, each of us will be meeting the oncoming wave of crisis with whatever preparations we’ve made, however substantial or insubstantial those happen to be. I’m aware that a certain subset of my readers are unhappy with that suggestion, but that can’t be helped; the future is under no obligation to wait patiently while we get ready for it. A few years back, when I posted an essay here whose title sums up the strategy I’ve been proposing, I probably should have put more stress on the most important word in that slogan: now. Still, that’s gone wherever might-have-beens spend their time. 
That doesn’t mean the world is about to end. It means that in all probability, beginning at some point this year and continuing for several years after that, most of my readers will be busy coping with the multiple impacts of a thumping economic crisis on their own lives and those of their families, friends, communities, and employers, at a time when political systems over much of the industrial world have frozen up into gridlock, the simmering wars in the Middle East and much of the Third World seem more than usually likely to boil over, and the twilight of the Pax Americana is pushing both the US government and its enemies into an ever greater degree of brinksmanship. Exactly how that’s going to play out is anyone’s guess, but no matter what happens, it’s unlikely to be pretty.
While we get ready for the first shocks to hit, though, it's worth talking a little bit about what comes afterwards. No matter how long a train of financial dominoes the collapse of the fracking bubble sets toppling, the last one will fall eventually, and within a few years things will have found a "new normal," however far down the slope of contraction that turns out to be. No matter how many proxy wars, coups d'état, covert actions, and manufactured insurgencies get launched by the United States or its global rivals in their struggle for supremacy, most of the places touched by that conflict will see a few years at most of actual warfare or the equivalent, with periods of relative peace before and after. The other driving forces of collapse act in much the same way; collapse is a fractal process, not a linear one.
Thus there’s something on the far side of crisis besides more of the same. The discussion I’d like to start at this point centers on what might be worth doing once the various masses of economic, political, and military rubble stop bouncing. It’s not too early to begin planning for that. If nothing else, it will give readers of this blog something to think about while standing in bread lines or hiding in the basement while riot police and insurgents duke it out in the streets. That benefit aside, the sooner we start thinking about the options that will be available once relative stability returns, the better chance we’ll have of being ready to implement them, in our own lives or on a broader scale.
One of the interesting consequences of crisis, for that matter, is that what was unthinkable before a really substantial crisis may not be unthinkable afterwards. Read Barbara Tuchman’s brilliant The Proud Tower and you’ll see how many of the unquestioned certainties of 1914 were rotting in history’s compost bucket by the time 1945 rolled around, and how many ideas that had been on the outermost fringes before the First World War had become plain common sense after the Second. It’s a common phenomenon, and I propose to get ahead of the curve here by proposing, as raw material for reflection if nothing else, something that’s utterly unthinkable today but may well be a matter of necessity ten or twenty or forty years from now.
What do I have in mind? Intentional technological regression as a matter of public policy.
Imagine, for a moment, that an industrial nation were to downshift its technological infrastructure to roughly what it was in 1950. That would involve a drastic decrease in energy consumption per capita, both directly—people used a lot less energy of all kinds in 1950—and indirectly—goods and services took much less energy to produce then, too. It would involve equally sharp decreases in the per capita consumption of most resources. It would also involve a sharp increase in jobs for the working classes—a great many things currently done by robots were done by human beings in those days, and so there were a great many more paychecks going out of a Friday to pay for the goods and services that ordinary consumers buy. Since a steady flow of paychecks to the working classes is one of the major things that keep an economy stable and thriving, this has certain obvious advantages, but we can leave those alone for now.
Now of course the project just proposed would involve certain changes from the way we do things. Air travel in the 1950s was extremely expensive—the well-to-do in those days were called “the jet set,” because that’s who could afford tickets—and so everyone else had to put up with fast, reliable, energy-efficient railroads when they needed to get from place to place. Computers were rare and expensive, which meant once again that more people got hired to do jobs, and also meant that when you called a utility or a business, your chance of getting a human being who could help you with whatever problem you might have was considerably higher than it is today.
Lacking the internet, people had to make do instead with their choice of scores of AM and shortwave radio stations, thousands of general and specialized print periodicals, and full-service bookstores and local libraries bursting at the seams with books—in America, at least, the 1950s were the golden age of the public library, and most small towns had collections you can’t always find in big cities these days. Oh, and the folks who like looking at pictures of people with their clothes off, and who play a large and usually unmentioned role in paying for the internet today, had to settle for naughty magazines, mail-order houses that shipped their products in plain brown wrappers, and tacky stores in the wrong end of town. (For what it’s worth, this didn’t seem to inconvenience them any.)
As previously noted, I’m quite aware that such a project is utterly unthinkable today, and we’ll get to the superstitious horror that lies behind that reaction in a bit. First, though, let’s talk about the obvious objections. Would it be possible? Of course. Much of it could be done by simple changes in the tax code. Right now, in the United States, a galaxy of perverse regulatory incentives penalize employers for hiring people and reward them for replacing employees with machines. Change those so that spending money on wages, salaries and benefits up to a certain comfortable threshold makes more financial sense for employers than using the money to automate, and you’re halfway there already. 
A revision in trade policy would do most of the rest of what’s needed.  What’s jokingly called “free trade,” despite the faith-based claims of economists, benefits the rich at everyone else’s expense, and would best be replaced by sensible tariffs to support domestic production against the sort of predatory export-driven mercantilism that dominates the global economy these days. Add to that high tariffs on technology imports, and strip any technology beyond the 1950 level of the lavish subsidies that fatten the profit margins of the welfare-queen corporations in the Fortune 500, and you’re basically there.
What makes the concept of technological regression so intriguing, and so workable, is that it doesn’t require anything new to be developed. We already know how 1950 technology worked, what its energy and resource needs were, and what the upsides and downsides of adopting it would be; abundant records and a certain fraction of the population who still remember how it worked make that easy. Thus it would be an easy thing to pencil out exactly what would be needed, what the costs and benefits would be, and how to minimize the former and maximize the latter; the sort of blind guesses and arbitrary assumptions that have to go into deploying a brand new technology need not apply.
So much for the first objection. Would there be downsides to deliberate technological regression? Of course. Every technology and every set of policy options has its downsides.  A common delusion these days claims, in effect, that it’s unfair to take the downsides of new technologies or the corresponding upsides of old ones into consideration when deciding whether to replace an older technology with a newer one. An even more common delusion claims that you’re not supposed to decide at all; once a new technology shows up, you’re supposed to run bleating after it like everyone else, without asking any questions at all.
Current technology has immense downsides. Future technologies are going to have them, too—it’s only in sales brochures and science fiction stories, remember, that any technology is without them. Thus the mere fact that 1950 technology has problematic features, too, is not a valid reason to dismiss technological retrogression. The question that needs to be asked, however unthinkable it might be, is whether, all things considered, it’s wiser to accept the downsides of 1950 technology in order to have a working technological suite that can function on much smaller per capita inputs of energy and resources, and thus stands a much better chance of getting through the age of limits ahead than today’s far more extravagant and brittle technological infrastructure.
It’s probably also necessary to talk about a particular piece of paralogic that comes up reliably any time somebody suggests technological regression: the notion that if you return to an older technology, you have to take the social practices and cultural mores of its heyday as well. I fielded a good many such comments last year when I suggested Victorian-era steam technology powered by solar energy as a form the ecotechnics of the future might take. An astonishing number of people seemed unable to imagine that it was possible to have such a technology without also reintroducing Victorian habits such as child labor and sexual prudery. Silly as that claim is, it has deep roots in the modern imagination.
No doubt, as a result of those deep roots, there will be plenty of people who respond to the proposal just made by insisting that the social practices and cultural mores of 1950 were awful, and claiming that those habits can’t be separated from the technologies I’m discussing. I could point out in response that 1950 didn’t have a single set of social practices and cultural mores; even in the United States, a drive from Greenwich Village to rural Pennsylvania in 1950 would have met with remarkable cultural diversity among people using the same technology. 
The point could be made even more strongly by noting that the same technology was in use that year in Paris, Djakarta, Buenos Aires, Tokyo, Tangiers, Novosibirsk, Guadalajara, and Lagos, and the social practices and cultural mores of 1950s middle America didn’t follow the technology around to these distinctly diverse settings, you know. Pointing that out, though, will likely be wasted breath. To true believers in the religion of progress, the past is the bubbling pit of eternal damnation from which the surrogate messiah of progress is perpetually saving us, and the future is the radiant heaven into whose portals the faithful hope to enter in good time. Most people these days are no more willing to question those dubious classifications than a medieval peasant would be to question the miraculous powers that supposedly emanated from the bones of St. Ethelfrith.
Nothing, but nothing, stirs up shuddering superstitious horror in the minds of the cultural mainstream these days as effectively as the thought of, heaven help us, “going back.” Even if the technology of an earlier day is better suited to a future of energy and resource scarcity than the infrastructure we’ve got now, even if the technology of an earlier day actually does a better job of many things than what we’ve got today, “we can’t go back!” is the anguished cry of the masses. They’ve been so thoroughly bamboozled by the propagandists of progress that they never stop to think that, why, yes, they can, and there are valid reasons why they might even decide that it’s the best option open to them.
There’s a very rich irony in the fact that alternative and avant-garde circles tend to be even more obsessively fixated on the dogma of linear progress than the supposedly more conformist masses. That’s one of the sneakiest features of the myth of progress; when people get dissatisfied with the status quo, the myth convinces them that the only option they’ve got is to do exactly what everyone else is doing, and just take it a little further than anyone else has gotten yet. What starts off as rebellion thus gets coopted into perfect conformity, and society continues to march mindlessly along its current trajectory, like lemmings in a Disney nature film, without ever asking the obvious questions about what might be waiting at the far end.
That’s the thing about progress; all the word means is “continued movement in the same direction.” If the direction was a bad idea to start with, or if it’s passed the point at which it still made sense, continuing to trudge blindly onward into the gathering dark may not be the best idea in the world. Break out of that mental straitjacket, and the range of possible futures broadens out immeasurably.
It may be, for example, that technological regression to the level of 1950 turns out to be impossible to maintain over the long term. If the technologies of 1920 can be supported on the modest energy supply we can count on getting from renewable sources, something like a 1920 technological suite might be maintained over the long term, without further regression. It might turn out instead that something like the solar steampower I mentioned earlier, an ecotechnic equivalent of 1880 technology, might be the most complex technology that can be supported on a renewable basis. It might be the case, for that matter, that something like the technological infrastructure the United States had in 1820, with windmills and water wheels as the prime movers of industry, canalboats as the core domestic transport technology, and most of the population working on small family farms to support very modest towns and cities, is the fallback level that can be sustained indefinitely.
Does that last option seem unbearably depressing? Compare it to another very likely scenario—what will happen if the world’s industrial societies gamble their survival on a great leap forward to some unproven energy source, which doesn’t live up to its billing, and leaves billions of people twisting in the wind without any working technological infrastructure at all—and you may find that it has its good points. If you’ve driven down a dead-end alley and are sitting there with the front grille hard against a brick wall, it bears remembering, shouting “We can’t go back!” isn’t exactly a useful habit. In such a situation—and I’d like to suggest that that’s a fair metaphor for the situation we’re in right now—going back, retracing the route as far back as necessary, is the one way forward.

The Mariner's Rule

Wed, 2015-01-21 16:54
One of the things my readers ask me most often, in response to this blog’s exploration of the ongoing decline and impending fall of modern industrial civilization, is what I suggest people ought to do about it all. It’s a valid question, and it deserves a serious answer.
Now of course not everyone who asks the question is interested in the answers I have to offer. A great many people, for example, are only interested in answers that will allow them to keep on enjoying the absurd extravagance that passed, not too long ago, for an ordinary lifestyle among the industrial world’s privileged classes, and is becoming just a little bit less ordinary with every year that slips by.  To such people I have nothing to say. Those lifestyles were only possible because the world’s industrial nations burnt through half a billion years of stored sunlight in a few short centuries, and gave most of the benefits of that orgy of consumption to a relatively small fraction of their population; now that easily accessible reserves of fossil fuels are running short, the party’s over. 
Yes, I’m quite aware that that’s a controversial statement. I field heated denunciations on a regular basis insisting that it just ain’t so, that solar energy or fission or perpetual motion or something will allow the industrial world’s privileged classes to have their planet and eat it too. Printer’s ink being unfashionable these days, a great many electrons have been inconvenienced on the internet to proclaim that this or that technology must surely allow the comfortable to remain comfortable, no matter what the laws of physics, geology, or economics have to say.  Now of course the only alternative energy sources that have been able to stay in business even in a time of sky-high oil prices are those that can count on gargantuan government subsidies to pay their operating expenses; equally, the alternatives receive an even more gigantic “energy subsidy” from fossil fuels, which make them look much more economical than they otherwise would.  Such reflections carry no weight with those whose sense of entitlement makes living with less unthinkable.
I’m glad to say that there are a fair number of people who’ve gotten past that unproductive attitude, who have grasped the severity of the crisis of our time and are ready to accept unwelcome change in order to secure a livable future for our descendants. They want to know how we can pull modern civilization out of its current power dive and perpetuate it into the centuries ahead. I have no answers for them, either, because that’s not an option at this stage of the game; we’re long past the point at which decline and fall can be avoided, or even ameliorated on any large scale.
A decade ago, a team headed by Robert Hirsch and funded by the Department of Energy released a study outlining what would have to be done in order to transition away from fossil fuels before they transitioned away from us. What they found, to sketch out too briefly the findings of a long and carefully worded study, is that in order to avoid massive disruption, the transition would have to begin twenty years before conventional petroleum production reached its peak and began to decline. There’s a certain irony in the fact that 2005, the year this study was published, was also the year when conventional petroleum production peaked; the transition would thus have had to begin in 1985—right about the time, that is, that the Reagan administration in the US and its clones overseas were scrapping the promising steps toward just such a transition.
A transition that got under way in 2005, in other words, would have been too late, and given the political climate, it probably would have been too little as well. Even so, it would have been a much better outcome than the one we got, in which most of us have spent the last ten years insisting that we don’t have to worry about depleting oilfields because fracking was going to save us all. At this point, thirty years after the point at which we would have had to get started, it’s all very well to talk about some sort of grand transition to sustainability, but the time when such a thing would have been possible came and went decades ago. We could have chosen that path, but we didn’t, and insisting thirty years after the fact that we’ve changed our minds and want a different future than the one we chose isn’t likely to make any kind of difference that matters.
So what options does that leave? In the minds of a great many people, at least in the United States, the choice that apparently comes first to mind involves buying farmland in some isolated rural area and setting up a homestead in the traditional style. Many of the people who talk enthusiastically about this option, to be sure, have never grown anything more demanding than a potted petunia, know nothing about the complex and demanding arts of farming and livestock raising, and aren’t in anything like the sort of robust physical condition needed to handle the unremitting hard work of raising food without benefit of fossil fuels; thus it’s a safe guess that in most of these cases, heading out to the country is simply a comforting daydream that serves to distract attention from the increasingly bleak prospects so many people are facing in the age of unraveling upon us.
There’s a long history behind such daydreams. Since colonial times, the lure of the frontier has played a huge role in the American imagination, providing any number of colorful inkblots onto which fantasies of a better life could be projected. Those of my readers who are old enough to remember the aftermath of the Sixties counterculture, when a great many young people followed that dream to an assortment of hastily created rural communes, will also recall the head-on collision between middle-class fantasies of entitlement and the hard realities of rural subsistence farming that generally resulted. Some of the communes survived, though many more did not; none of the surviving ones that I know of made it without a long and difficult period of readjustment in which romantic notions of easy living in the lap of nature got chucked in favor of a more realistic awareness of just how little in the way of goods and services a bunch of untrained ex-suburbanites can actually produce by their own labor.
In theory, that process of reassessment is still open. In practice, just at the moment, I’m far from sure it’s an option for anyone who’s not already traveled far along that road. The decline and fall of modern industrial civilization, it bears repeating, is not poised somewhere off in the indefinite future, waiting patiently for us to get ready for it before it puts in an appearance; it’s already happening at the usual pace, and the points I’ve raised in posts here over the last few weeks suggest that the downward slope is probably going to get a lot steeper in the near future. As the collapse of the fracking bubble ripples out through the financial sphere, most of us are going to be scrambling to adapt, and the chances of getting everything lined up in time to move to rural property, get the necessary equipment and supplies to start farming, and get past the worst of the learning curve before crunch time arrives are not good.
If you’re already on a rural farm, in other words, by all means pursue the strategy that put you there. If your plans to get the necessary property, equipment, and skills are well advanced at this point, you may still be able to make it, but you’d probably better get a move on. On the other hand, dear reader, if your rural retreat is still off there in the realm of daydreams and good intentions, it’s almost certainly too late to do much about it, and where you are right now is probably where you’ll be when the onrushing waves of crisis come surging up and break over your head.
That being the case, are there any options left other than hiding under the bed and hoping that the end will be relatively painless? As it happens, there are.
The point that has to be understood to make sense of those options is that in the real world, as distinct from Hollywood-style disaster fantasies, the end of a civilization follows the famous rule attributed to William Gibson: “The future is already here, it’s just not evenly distributed yet.”  Put another way, the impacts of decline and fall aren’t uniform; they vary in intensity over space and time, and they impact particular systems of a falling civilization at different times and in different ways.  If you’re in the wrong place at the wrong time, and depend on the wrong systems to support you, your chances aren’t good, but the places, times, and systems that take the brunt of the collapse aren’t random. To some extent, those can be anticipated, and some of them can also be avoided.
Here’s an obvious example. Right now, if your livelihood depends on the fracking industry, the tar sands industry, or any of the subsidiary industries that feed into those, your chances of getting through 2015 with your income intact are pretty minimal.  People in those industries who got to witness earlier booms and busts know this, and a good many of them are paying off their debts, settling any unfinished business they might have, and making sure they can cover a tank of gas or a plane ticket to get back home when the bottom falls out. People in those industries who don’t have that experience to guide them, and are convinced that nothing bad can actually happen to them, are not doing these things, and are likely to end up in a world of hurt when their turn comes.
They’re not the only ones who would benefit right now from taking such steps. A very large part of the US banking and finance industry has been flying high on bloated profits from an assortment of fracking-related scams, ranging from junk bonds through derivatives to exotic financial fauna such as volumetric production payments. Now that the goose that laid the golden eggs is bobbing feet upwards in a pond of used fracking fluid, the good times are coming to a sudden stop, and that means sharply reduced income for those junior bankers, brokers, and salespeople who can keep their jobs, and even more sharply reduced prospects for those who don’t.
They’ve got plenty of company on the chopping block.  The entire retail sector in the US is already in trouble, with big-box stores struggling for survival and shopping malls being abandoned, and the sharp economic downturn we can expect as the fracking bust unfolds will likely turn that decline into freefall, varying in intensity by region and a galaxy of other factors. Those who brace themselves for a hard landing now are a good deal more likely to make it than those who don’t, and those who have the chance to jump to something more stable now would be well advised to make the leap.
That’s one example; here’s another. I’ve written here in some detail about how anthropogenic climate change will wallop North America in the centuries ahead of us. One thing that’s been learned from the last few years of climate vagaries is that North America, at least, is shifting in exactly the way paleoclimatic data would suggest—more or less the same way it did during warm periods over the last ten or twenty million years. The short form is that the Southwest and mountain West are getting baked to a crackly crunch under savage droughts; the eastern Great Plains, Midwest, and most of the South are being hit by a wildly unstable climate, with bone-dry dry years alternating with exceptionally soggy wet ones; while the Appalachians and points eastward have been getting unsteady temperatures but reliable rainfall. Line up your choice of subsistence strategies next to those climate shifts, and if you still have the time and resources to relocate, you have some idea where to go.
All this presumes, of course, that what we’re facing has much more in common with the crises faced by other civilizations on their way to history’s compost heap than it does with the apocalyptic fantasies so often retailed these days as visions of the immediate future. I expect to field a flurry of claims that it just ain’t so, that everything I’ve just said is wasted breath because some vast and terrible whatsit will shortly descend on the whole world and squash us like bugs. I can utter that prediction with perfect confidence, because I’ve been fielding such claims over and over again since long before this blog got started. All the dates by which the world was surely going to end have rolled past without incident, and the inevitable cataclysms have pulled one no-show after another, but the shrill insistence that something of the sort really will happen this time around has shown no sign of letting up. Nor will it, since the unacceptable alternative consists of taking responsibility for doing something about the future.
Now of course I’ve already pointed out that there’s not much that can be done about the future on the largest scale. As the fracking bubble implodes, the global economy shudders, the climate destabilizes, and a dozen other measures of imminent crisis head toward the red zone on the gauge, it’s far too late in the day for much more than crisis management on a local and individual level. Even so, crisis management is a considerably more useful response than sitting on the sofa daydreaming about the grandiose project that’s certain to save us or the grandiose cataclysm that’s certain to annihilate us—though these latter options are admittedly much more comfortable in the short term.
What’s more, there’s no shortage of examples in relatively recent history to guide the sort of crisis management I have in mind. The tsunami of discontinuities that’s rolling toward us out of the deep waters of the future may be larger than the waves that hit the Western world with the coming of the First World War in 1914, the Great Depression in 1929, or the Second World War in 1939, but from the perspective of the individual, the difference isn’t as vast as it might seem. In fact, I’d encourage my readers to visit their local public libraries and pick up books about the lived experience of those earlier traumas. I’d also encourage those with elderly relatives who still remember the Second World War to sit down with them over a couple of cups of whatever beverage seems appropriate, and ask about what it was like on a day-by-day basis to watch their ordinary peacetime world unravel into chaos.
I’ve had the advantage of taking part in such conversations, and I’ve also done a great deal of reading about historical crises that have passed below the horizon of living memory. There are plenty of lessons to be gained from such sources, and one of the most important also used to be standard aboard sailing ships in the days before steam power. Sailors in those days had to go scrambling up the rigging at all hours and in all weathers to set, reef, or furl sails; it was not an easy job—imagine yourself up in the rigging of a tall ship in the middle of a howling storm at night, clinging to tarred ropes and slick wood and trying to get a mass of wet, heavy, wind-whipped canvas to behave, while below you the ship rolls from side to side and swings you out over a raging ocean and back again. If you slip and you’re lucky, you land on deck with a pretty good chance of breaking bones or worse; if you slip and you’re not lucky, you plunge straight down into churning black water and are never seen again.
The rule that sailors learned and followed in those days was simple: “One hand for yourself, one hand for the ship.” Every chore that had to be done up there in the rigging could be done by a gang of sailors who each lent one hand to the effort, so the other could cling for dear life to the nearest rope or ratline. Those tasks that couldn’t be done that way, such as hauling on ropes, took place down on the deck—the rigging was designed with that in mind. There were emergencies where that rule didn’t apply, and even with the rule in place there were sailors who fell from the rigging to their deaths, but as a general principle it worked tolerably well.
I’d like to propose that the same rule might be worth pursuing in the crisis of our age. In the years to come, a great many of us will face the same kind of scramble for survival that so many others faced in the catastrophes of the early 20th century. Some of us won’t make it, and some will have to face the ghastly choice between sheer survival and everything else they value in life. Not everyone, though, will land in one or the other of those categories, and many of those who manage to stay out of them will have the chance to direct time and energy toward the broader picture.
Exactly what projects might fall into that latter category will differ from one person to another, for reasons that are irreducibly personal. I’m sure there are plenty of things that would motivate you to action in desperate times, dear reader, that would leave me cold, and of course the reverse is also true—and in times of crisis, of the kind we’re discussing, it’s personal factors of that sort that make the difference, not abstract considerations of the sort we might debate here. I’ll be discussing a few of the options in upcoming posts, but I’d also encourage readers of this blog to reflect on the question themselves: in the wreck of industrial civilization, what are you willing to make an effort to accomplish, to defend, or to preserve?
In thinking about that, I’d encourage my readers to consider the traumatic years of the early 20th century as a model for what’s approaching us. Those who were alive when the first great wave of dissolution hit in 1914 weren’t facing forty years of continuous cataclysm; as noted here repeatedly, collapse is a fractal process, and unfolds in real time as a sequence of crises of various kinds separated by intervals of relative calm in which some level of recovery is possible. It’s pretty clear that the first round of trouble here in the United States, at least, will be a major economic crisis; at some point not too far down the road, the yawning gap between our senile political class and the impoverished and disaffected masses promises the collapse of politics as usual and a descent into domestic insurgency or one of the other standard patterns by which former democracies destroy themselves; as already noted, there are plenty of other things bearing down on us—but after an interval, things will stabilize again.
Then it’ll be time to sort through the wreckage, see what’s been saved and what can be recovered, and go on from there. First, though, we have a troubled time to get through.

March of the Squirrels

Wed, 2015-01-14 16:03
Prediction is a difficult business at the best of times, but the difficulties seem to change from one era to another. Just now, at least for me, the biggest challenge is staying in front of the headlines. So far, the crash of 2015 is running precisely to spec. Smaller companies in the energy sector are being hammered by the plunging price of oil, while the banking industry insists that it’s not in trouble—those of my readers who recall identical expressions of misplaced confidence on the part of bankers in news stories just before the 2008 real estate crash will know just how seriously to take such claims.
The shiny new distractions disguised as energy breakthroughs I mentioned here two weeks ago have also started to show up. A glossy puff piece touting ocean thermal energy conversion (OTEC), a white-elephant technology which was tested back in the 1970s and shown to be hopelessly uneconomical, shared space in the cornucopian end of the blogosphere over the last week with an equally disingenuous puff piece touting yet another rehash of nuclear fission as the answer to our energy woes. (Like every fission technology, of course, this one will be safe, clean, and affordable until someone actually tries to build it.)
No doubt there will shortly be other promoters scrambling for whatever government subsidies and private investment funds might be available for whatever revolutionary new energy breakthrough (ahem) will take the place of hydrofractured shales as America’s favorite reason to do nothing. I admit to a certain feeling of disappointment, though, in the sheer lack of imagination displayed so far in that competition. OTEC and molten-salt fission reactors were already being lauded as America’s energy salvation back when I was in high school: my junior year, I think it was, energy was the topic du jour for the local high school debate league, and we discussed those technologies at length. So did plenty of more qualified people, which is why both of them—and quite a few other superficially plausible technologies—never made it off the drawing board.
Something else came in for discussion that same year, and it’s a story with more than a little relevance to the current situation. A team from another school in the south Seattle suburbs had a brainstorm, did some frantic research right before a big debate tournament, and showed up with data claiming to prove that legions of squirrels running in squirrel cages, powering little generators, could produce America’s electricity. Since no one else happened to have thought of that gimmick, none of the other teams had evidence to refute them, and they swept the tournament. By the next tournament, of course, everyone else had crunched the numbers and proceeded to stomp the squirrel promoters, but for years to come the phrase “squirrel case” saw use in local debate circles as the standard term for a crackpot proposal backed with seemingly plausible data.
The OTEC plants and molten-salt reactors currently being hawked via the media are squirrel cases in exactly the same sense; they sound plausible as long as you don’t actually crunch the numbers and see whether they’re economically and thermodynamically viable. The same thing was true of the fracking bubble that’s messily imploding around us right now, not to mention the ethanol and biodiesel projects, the hydrogen economy, and the various other glittery excuses that have occupied so much useless space in the collective conversation of our time. So, it has to be said, do the more enthusiastic claims being made for renewable energy just now.
Don’t get me wrong, I’m a great fan of renewable energy. When extracting fossil carbon from the earth stops being economically viable—a point that may arrive a good deal sooner than many people expect—renewables are what we’ll have left, and the modest but real energy inputs that can be gotten from renewable sources when they don’t receive energy subsidies from fossil fuels could make things significantly better for our descendants. The fact remains that in the absence of subsidies from fossil fuels, renewables won’t support the absurdly extravagant energy consumption that props up what passes for an ordinary middle class lifestyle in the industrial world these days.
That’s the pterodactyl in the ointment, the awkward detail that most people even in the greenest of green circles don’t want to discuss. Force the issue into a conversation, and one of the more common responses you’ll get is the exasperated outburst “But there has to be something.” Now of course this simply isn’t true; no law of nature, no special providence, no parade of marching squirrels assures us that we can go ahead and use as much energy as we want in the serene assurance that more will always be waiting for us. It’s hard to think of a more absurd delusion, and the fact that a great many people making such claims insist on their superior rationality and pragmatism just adds icing to the cake.
Let’s go ahead and say it in so many words: there doesn’t have to be a replacement for fossil fuels. In point of fact, there’s good reason to think that no such replacement exists anywhere in the small corner of the universe accessible to us, and once fossil fuels are gone, the rest of human history will be spent in a world that doesn’t have the kind of lavish energy resources we’re used to having. Concentrations of energy, like all other natural resources, follow what’s known as the power law, the rule—applicable across an astonishingly broad spectrum of phenomena—that whatever’s ten times as concentrated is approximately ten times as rare. At the dawn of the industrial age, the reserves of fossil fuel in the Earth’s crust were the richest trove of stored energy on the planet, and of course fossil fuel extraction focused on the richest and most easily accessible prizes first, just as quickly as they could be found.
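The power-law relationship just mentioned can be put in numbers. The following toy sketch assumes the simplest functional form, abundance proportional to 1/concentration, with an arbitrary scale constant; it illustrates the rule of thumb rather than describing any actual survey of fossil fuel reserves.

```python
# Toy numeric sketch of the power law cited above: a resource
# concentration ten times as high is roughly ten times as rare.
# The form abundance = scale / concentration and the scale constant
# are assumptions made purely for illustration.

def relative_abundance(concentration, scale=1.0):
    """Relative abundance of deposits at a given concentration grade."""
    return scale / concentration

# Each tenfold jump in concentration cuts abundance tenfold:
for grade in (1, 10, 100):
    print(grade, relative_abundance(grade))
# prints: 1 1.0 / 10 0.1 / 100 0.01
```

The point of the exercise is simply that the richest deposits are, by the nature of the distribution, the scarcest, which is why extraction that starts with the best prizes faces steadily poorer ones thereafter.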
Those are gone now. Since 2005, when conventional petroleum production peaked worldwide, the industrial world has been engaged in what amounts to a frantic game of make-believe, pretending that scraping the bottom of the oil barrel proves that the barrel is still full. Every half-baked scheme for producing liquid fuels got flooded with as much cheap credit as its promoters could squander. Some of those—biodiesel and cellulosic ethanol come to mind—turned out to be money pits so abysmal that even a tide of freshly printed money couldn’t do much more than gurgle on the way down; others—above all, shale fracking and tar sand mining—were able to maintain a pretense of profitability for a while, with government subsidies, junk bonds, loans from clueless banks, and round after round of economic stimulus from central banks over much of the world serving to prop up industries that, in the final analysis, were never economically viable in the first place.
The collapse in the price of oil that began this June put paid to that era of make-believe. The causes of the oil crash are complex, but back of them all, I suggest, is a straightforward bit of economics that almost everyone’s been trying to avoid for a decade now.  To maintain economic production at any given level, the global economy has to produce enough real wealth—not, please note, enough money, but enough actual goods and services—to cover resource extraction, the manufacture and replacement of the whole stock of nonfinancial capital goods, and whatever level of economic waste is considered socially and politically necessary. If the amount of real wealth needed to keep extracting resources at a given rate goes up steeply, the rest of the economy won’t escape the consequences: somewhere or other, something has to give.
The economic history of the last decade is precisely the story of what gave in what order, or to put it another way, how the industrial world threw everything in sight under the bus to keep liquid fuel production around its 2005 peak. Infrastructure was abandoned to malign neglect, the last of the industrial world’s factory jobs got offshored to Third World sweatshops, standards of living for most people dropped steadily—well, you can fill in the blanks as well as I can. Consumption remained relatively high only because central banks flooded the global economy with limitless cheap credit, while the US government filled the gap between soaring government expenditures and flat or shrinking tax receipts by the simple equivalent of having the Fed print enough money each month to cover the federal deficit. All these things were justified by the presupposition that the global economy was just going through a temporary rough patch, and normal growth would return any day now.
But normal growth has not returned. It’s not going to return, either, because it was only “normal” in an era when cheap abundant fossil fuels greased the wheels of every kind of economic activity. As I noted in a blog post here back in 2007, the inevitable consequence of soaring oil prices is what economists call demand destruction: less formally, the process by which people who can’t afford oil stop using it, bringing the price back down. Since what’s driving the price of oil up isn’t merely market factors, but the hard geological realities of depletion, not everyone who got forced out of the market when the price was high can get back into it when the price is low—gas at $2 a gallon doesn’t matter if your job scavenging abandoned houses doesn’t pay enough for you to cover the costs of a car, and let’s not even talk about how much longer the local government can afford to maintain streets in driveable condition.
Demand destruction sounds very abstract. In practice, though, it’s all too brutally concrete: a rising tide of job losses, business failures, slumping standards of living, cutbacks to every kind of government service at every level, and so on down the litany of decline that’s become part of everyday life in the industrial world over the last decade—leaving aside, that is, the privileged few who have been sheltered from those changes so far. Unless I miss my guess, we’re going to see those same changes shift into overdrive in the months and years ahead. The attempt to boost the world out of its deepening slump by flooding the planet with cheap credit has failed; the global economy is choking on a supersized meal of unpayable IOUs and failed investments; stock markets and other venues for the exchange of paper wealth are so thoroughly gimmicked that they’ve become completely detached from the real economy of goods and services, and the real economy is headed south in a hurry.
Those unwelcome realities are going to constrain any attempt by the readers of this blog to follow up on the proposal I made in last week’s post, and take constructive action in the face of the crisis that’s now upon us. The energy situation here in the US could have been helped substantially if conservation measures and homescale renewables had received any kind of significant support from the oh-so-allegedly-green Democratic party, back when it still had enough clout in Congress to matter; the economic situation would be nowhere near as dire if governments and central banks had bitten the bullet and dealt with the crisis of our time in 2008 or thereafter, rather than papering things over with economic policies that assumed that enough money could negate the laws of physics and geology. At this point, it’s much too late for any sort of collective action on either of those fronts—and of course the political will needed to do anything meaningful about either one went missing in action at the end of the 1970s and hasn’t been seen since.
Thus all of us will have to cope with a world in which the cost of energy suffers from drastic and economically devastating swings, and the sort of localized infrastructure that could cushion the impact of those swings wasn’t built in time. All of us will also have to cope with a global economy in disarray, in which bank failures, currency crises, credit shortages, and crisis measures imposed by government fiat will take the place of the familiar workings of a market economy. Those are baked into the cake at this point, and what individuals, families, and community groups will be able to do in the years ahead will be constrained by the limits those transformations impose.
Those of my readers who still have a steady income and a home they expect to be able to keep would still be well advised to doublecheck their insulation and weatherstripping, install solar water heating and other homescale renewable energy technologies, and turn the back lawn into a vegetable garden with room for a chicken coop, if by any chance they haven’t taken these sensible steps already.  A great many of my readers don’t have such options, and at this point, it may be a long time before such options are readily available again. This is crunch time, folks; unless I’m very much mistaken, we’re on the brink of a historical inflection point like the ones in 1789 and 1914, one of the watersheds of time after which nothing will ever be the same again.
There’s still much that can be done in other spheres, and I’ll be discussing some of those things in upcoming posts. In terms of energy and the economy, though, I suspect that for a lot of us, the preparations we’re going to be able to make are the ones we’ve already made, and a great many people whose plans depend on having a stable income and its associated perks and privileges may find themselves scrambling for options when the unraveling of the economy leaves them without one. Those of my readers who have been putting off the big changes that might make them more secure in hard times may be facing the hard decision of making those changes now, in a hurry, or facing the crisis of our age in the location and situation they’re in right now. Those who’ve gone ahead and made the changes—well, you know as well as I do that it’s time to review your plans, doublecheck the details, batten down the hatches and get ready to weather the storm.
One of the entertainments to be expected as the year draws on and the crisis bears down on us all, though, is a profusion of squirrel cases of the sort discussed toward the beginning of this essay. It’s an interesting regularity of history that the closer to disaster a society in decline becomes, the more grandiose, triumphalist, and detached from the grubby realities its fantasies generally get. I’m thinking here of the essay on military affairs from the last years of the Roman world that’s crammed full of hopelessly unworkable war machines, and of the final, gargantuan round of Mayan pyramids built on the eve of the lowland classic collapse. The habit of doubling down in the face of self-induced catastrophe seems to be deeply engrained in the human psyche, and I don’t doubt for a moment that we’ll see some world-class examples of the phenomenon in the years immediately ahead.
That said, the squirrel cases mentioned earlier—the OTEC and molten-salt fission proposals—suffer from a disappointing lack of imagination. If our society is going to indulge in delusional daydreams as it topples over the edge of crisis, couldn’t we at least see some proposals that haven’t been rehashed since I was in high school?  I can only think of one such daydream that has the hallucinatory quality our current circumstances deserve; yes, that would be the proposal, being made quite seriously in the future-oriented media just now, that we can solve all our energy problems by mining helium-3 on the Moon and shipping it to Earth to fuel fusion power plants we have absolutely no idea how to build yet. As faith-based cheerleading for vaporware, which is of course what those claims are, they set a very high standard—but it’s a standard that will doubtless be reached and exceeded in due time.
That said, I think the media may need some help launching the march of the squirrels just mentioned, and the readers of this blog proved a good long time ago that they have more than enough imagination to meet that pressing need.
Therefore I’m delighted to announce a new contest here on The Archdruid Report, the Great Squirrel Case Challenge of 2015. The goal is to come up with the most absurd new energy technology you can think of, and write either the giddily dishonest corporate press release or the absurdly sycophantic media article announcing it to the world. If you or a friend can Photoshop an image or two of your proposed nonsolution to the world’s energy needs, that’s all the better. Post your press release or media article on your blog if you have one; if you don’t, you can get one for free from Blogspot or Wordpress. Post a link to your piece in the comments section of this blog.
Entries must be posted here by February 28, 2015.  Two winners—one picked by me, the other by vote of the registered members of the Green Wizards forum—will receive signed complimentary copies of my forthcoming book After Progress. I can’t speak for the forum, which will doubtless have its own criteria, but I’ll be looking for a winsome combination of sheer absurdity with the sort of glossy corporate presentation that frames so many absurd statements these days. (Hint: it’s not against the rules to imitate real press releases and media articles.)
As for the wonderful new energy breakthrough you’ll be lauding so uncritically, why, that’s up to you. Biodiesel plants using investment bankers as their primary feedstock? A vast crank hooked to the Moon, running a global system of belts and pulleys? An undertaking of great energy profit, to misquote the famous ad from the South Sea Bubble, but no one to know what it is? Let your imagination run wild; no matter how giddy you get, as the failure of the fracking bubble becomes impossible to ignore, the mass media and a great many of our fellow hominids will go much further along the track of the marching squirrels than you will.
********************
In not unrelated news, I’m delighted to report that the second volume of stories to come out of this blog’s 2014 Space Bats challenge is now available in ebook formats, and will shortly be out in print as well. After Oil 2: The Years of Crisis features a dozen original short stories set in the near future, as industrial civilization slams face first into the limits to growth. Those of my readers who followed the original contest already know that this is a first-rate collection of deindustrial SF; as for the rest of you—why, you’re in for a treat. Click here to order a copy.

A Camp Amid the Ruins

Wed, 2015-01-07 18:35
Well, the Fates were apparently listening last week. As I write this, stock markets around the world are lurching through what might just be the opening moves of the Crash of 2015, whipsawed by further plunges in the price of oil and a range of other bad economic news; amid a flurry of layoffs and dropping rig counts, the first bankruptcy in the fracking industry has been announced, with more on their way; gunfire in Paris serves up a brutal reminder that the rising spiral of political violence I traced in last week’s post is by no means limited to North American soil.  The cheerleaders of business as usual in the media are still insisting at the top of their lungs that America’s new era of energy independence is still on its way; those of my readers who recall the final days of the housing bubble that burst in 2008, or the tech-stock bubble that popped in 2000, will recognize a familiar tone in the bluster.
It’s entirely possible, to be sure, that central banks and governments will be able to jerry-rig another round of temporary supports for the fraying architecture of the global economy, and postpone a crash—or at least drag out the agony a bit longer. It’s equally possible that other dimensions of the crisis of our age can be forestalled or postponed by drastic actions here and now.  That said, whether the process is fast or slow, whether the crunch hits now or a bit further down the road, the form of technic society I’ve termed abundance industrialism is on its way out through history’s exit turnstile, and an entire world of institutions and activities familiar to all of us is going with it.
It doesn’t require any particular genius or prescience to grasp this, merely the willingness to recognize that if something is unsustainable, sooner or later it won’t be sustained. Of course that’s the sticking point, because what can’t be sustained at this point is the collection of wildly extravagant energy- and resource-intensive habits that used to pass for a normal lifestyle in the world’s industrial nations, and has recently become just a little less normal than it used to be. Those lifestyles, and most of what goes with them, only existed in the first place because a handful of the world’s nations burned through half a billion years of fossil sunlight in a few short centuries, and stripped the planet of most of its other concentrated resource stocks into the bargain.
That’s the unpalatable reality of the industrial era. Despite the rhetoric of universal betterment that was brandished about so enthusiastically by the propagandists of the industrial order, there were never enough of any of the necessary resources to make that possible for more than a small fraction of the world’s population, or for more than a handful of generations. Nearly all the members of our species who lived outside the industrial nations, and a tolerably large number who resided within them, were expected to carry most of the costs of reckless resource extraction and ecosystem disruption while receiving few if any of the benefits. They’ll have plenty of company shortly: abundance industrialism is winding down, but its consequences are not, and people around the world for centuries and millennia to come will have to deal with the depleted and damaged planet our actions have left them.
That’s a bitter pill to swallow, and the likely aftermath of the industrial age won’t do anything to improve the taste. Over the last six months or so, I’ve drawn on the downside trajectories of other failed civilizations to sketch out how that aftermath will probably play out here in North America: the disintegration of familiar political and economic structures, the rise of warband culture, the collapse of public order, and the failure of cultural continuity, all against a backdrop of rapid and unpredictable climate change, rising seas, and the appearance of chemical and radiological dead zones created by some of industrial civilization’s more clueless habits. It’s an ugly picture, and the only excuse I have for that unwelcome fact is that falling civilizations look like that.
The question that remains, though, is what we’re going to do about it all.
I should say up front that by “we” I don’t mean some suitably photogenic collection of Hollywood heroes and heroines who just happen to have limitless resources and a bag of improbable inventions at their disposal. I don’t mean a US government that has somehow shaken off the senility that affects all great powers in their last days and is prepared to fling everything it has into the quest for a sustainable future. Nor do I mean a coterie of gray-skinned aliens from Zeta Reticuli, square-jawed rapists out of Ayn Rand novels, or some other source of allegedly superior beings who can be counted upon to come swaggering onto the scene to bail us out of the consequences of our own stupidity. They aren’t part of this conversation; the only people who are, just now, are the writer and the readers of this blog.
Within those limits, the question I’ve posed may seem preposterous. I grant that for a phenomenon that practically defines the far edges of the internet—a venue for lengthy and ornately written essays about wildly unpopular subjects by a clergyman from a small and distinctly eccentric fringe religion—The Archdruid Report has a preposterously large readership, and one that somehow manages to find room for a remarkably diverse and talented range of people, bridging some of the ideological and social barriers that divide  industrial society into so many armed and uncommunicative camps. Even so, the regular readership of this blog could probably all sit down at once in a football stadium and still leave room for the hot dog vendors. Am I seriously suggesting that this modest and disorganized a group can somehow rise up and take meaningful action in the face of so vast a process as the fall of a civilization?
One of the things that gives that question an ironic flavor is that quite a few people are making what amounts to the same claim in even more grandiose terms than mine. I’m thinking here of the various proposals for a Great Transition of one kind or another being hawked at various points along the social and political spectrum these days. I suspect we’re going to be hearing a lot more from those in the months and years immediately ahead, as the collapse of the fracking bubble forces people to find some other excuse for insisting that they can have their planet and eat it too.
Part of the motivation behind the grand plans just mentioned is straightforwardly financial. One part of what drove the fracking bubble along the classic trajectory—up with the rocket, down with the stick—was a panicked conviction on the part of a great many people that some way had to be found to keep industrial society’s fuel tanks somewhere on the near side of that unwelcome letter E. Another part of it, though, was the recognition on the part of a somewhat smaller but more pragmatic group of people that the panicked conviction in question could be turned into a sales pitch. Fracking wasn’t the only thing that got put to work in the time-honored process of proving Ben Franklin’s proverb about a fool and his money; fuel ethanol, biodiesel, and large-scale wind power also had their promoters, and sucked up their share of government subsidies and private investment.
Now that fracking is falling by the wayside, there’ll likely be a wild scramble to replace it in the public eye as the wave of the energy future. The nuclear industry will doubtless be in there: nuclear power is one of the most durable subsidy dumpsters in modern economic life, and the industry has had to become highly skilled at slurping from the government teat, since nuclear power isn’t economically viable otherwise. It’s worth recalling that no nation on earth has been able to create or maintain a nuclear power program without massive ongoing government subsidies. No doubt we’ll get plenty of cheerleading for fusion, satellite-based solar power, and other bits of high-end vaporware, too.
Still, I suspect the next big energy bubble is probably going to come from the green end of things. Over the last few years, there’s been no shortage of claims that renewable resources can pick right up where fossil fuels leave off and keep the lifestyles of today’s privileged middle classes intact. Those claims tend to be long on enthusiasm and cooked numbers and short on meaningful assessment, but then that same habit didn’t slow the fracking boom any; we can expect to see a renewed flurry of claims that solar power must be sustainable because the sticker price has gone down, and similar logical non sequiturs. (By the same logic, the internet must be sustainable if you can pay your monthly ISP bill by selling cute kitten photos on eBay.  In both cases, the sprawling and almost entirely fossil-fueled infrastructure of mines, factories, supply chains, power grids, and the like, has been left out of the equation, as though those don’t have to be accounted for: typical of the blindness to whole systems that pervades so much of contemporary culture.)
It’s not enough for an energy technology to be green, in other words; it also has to work. It’s probably safe to assume that that point is going to be finessed over and over again, in a galaxy of inventive ways, as the fracking bubble goes wherever popped financial bubbles go when they die. The point that next to nobody wants to confront is the one made toward the beginning of this week’s post: if something is unsustainable, sooner or later it won’t be sustained—and what’s unsustainable in this case isn’t simply fossil fuel production and consumption, it’s the lifestyles that were made possible by the immensely abundant and highly concentrated energy supply we got from fossil fuels.
You can’t be part of the solution if your lifestyle is part of the problem. I know that those words are guaranteed to make the environmental equivalent of limousine liberals gasp and clutch their pearls or their Gucci ties, take your pick, but there it is; it really is as simple as that. There are at least two reasons why that maxim needs to be taken seriously. On the one hand, if you’re clinging to an unsustainable lifestyle in the teeth of increasingly strong economic and environmental headwinds, you’re not likely to be able to spare the money, the free time, or any of the other resources you would need to contribute to a solution; on the other, if you’re emotionally and financially invested in keeping an unsustainable lifestyle, you’re likely to put preserving that lifestyle ahead of things that arguably matter more, like leaving a livable planet for future generations.
Is the act of letting go of unsustainable lifestyles the only thing that needs to be done? Of course not, and in the posts immediately ahead I plan on talking at length about some of the other options. I’d like to suggest, though, that it’s the touchstone or, if you will, the boundary that divides those choices that might actually do some good from those that are pretty much guaranteed to do no good at all. That’s useful when considering the choices before us as individuals; it’s at least as useful, if not more so, when considering the collective options we’ll be facing in the months and years ahead, among them the flurry of campaigns, movements, and organizations that are already gearing up to exploit the crisis of our time in one way or another—and with one agenda or another.
An acronym I introduced a while back in these posts might well be worth revisiting here: LESS, which stands for “Less Energy, Stuff, and Stimulation.” That’s a convenient summary of the changes that have to be made to move from today’s unsustainable lifestyles to ways of living that will be viable when today’s habits of absurd extravagance are fading memories. It’s worth taking a moment to unpack the acronym a little further, and see what it implies.
“Less energy” might seem self-evident, but there’s more involved here than just turning off unneeded lights and weatherstripping your windows and doors—though those are admittedly good places to start. A huge fraction of the energy consumed by a modern industrial society gets used indirectly to produce, supply, and transport goods and services; an allegedly “green” technological device that’s made from petroleum-based plastics and exotic metals taken from an open-pit mine in a Third World country, then shipped halfway around the planet to the air-conditioned shopping mall where you bought it, can easily have a carbon footprint substantially bigger than some simpler item that does the same thing in a less immediately efficient way. The blindness to whole systems mentioned earlier has to be overcome in order to make any kind of meaningful sense of energy issues: a point I’ll be discussing further in an upcoming post here.
“Less stuff” is equally straightforward on the surface, equally subtle in its ramifications. Now of course it’s hardly irrelevant that ours is the first civilization in the history of the planet to have to create an entire industry of storage facilities to store the personal possessions that won’t fit into history’s biggest homes. That said, “stuff” includes a great deal more than the contents of your closets and storage lockers. It also includes infrastructure—the almost unimaginably vast assortment of technological systems on which the privileged classes of the industrial world rely for most of the activities of their daily lives. That infrastructure was only made possible by the deluge of cheap abundant energy our species briefly accessed from fossil fuels; as what’s left of the world’s fossil fuel supply moves deeper into depletion, the infrastructure that it created has been caught in an accelerating spiral of deferred maintenance and malign neglect; the less dependent you are on what remains, the less vulnerable you are to further systems degradation, and the more of what’s left can go to those who actually need it.
“Less stimulation” may seem like the least important part of the acronym, but in many ways it’s the most crucial point of all. These days most people in the industrial world flood their nervous systems with a torrent of electronic noise. Much of this is quite openly intended to manipulate their thoughts and feelings by economic and political interests; a great deal more has that effect, if only by drowning out any channel of communication that doesn’t conform to the increasingly narrow intellectual tunnel vision of late industrial society. If you’ve ever noticed how much of what passes for thinking these days amounts to the mindless regurgitation of sound bites from the media, dear reader, that’s why. What comes through the media—any media—is inevitably prechewed and predigested according to someone else’s agenda; those who are interested in thinking their own thoughts and making their own decisions, rather than bleating in perfect unison with the rest of the herd, might want to keep this in mind.
It probably needs to be said that very few of us are in a position to go whole hog with LESS—though it’s also relevant that some of us, and quite possibly a great many of us, will end up doing so willy-nilly if the economic contraction at the end of the fracking bubble turns out to be as serious as some current figures suggest. Outside of that grim possibility, “less” doesn’t have to mean “none at all”—certainly not at first; for those who aren’t caught in the crash, at least, there may yet be time to make a gradual transition toward a future of scarce energy and scarce resources. Still, I’d like to suggest that any proposed response to the crisis of our time that doesn’t start with LESS simply isn’t serious.
As already noted, I expect to see a great many nonserious proposals in the months and years ahead. Those who put maintaining their comfortable lifestyles ahead of other goals will doubtless have no trouble coming up with enthusiastic rhetoric and canned numbers to support their case; certainly the promoters and cheerleaders of the soon-to-be-late fracking bubble had no difficulty at all on that score. Not too far in the future, something or other will have been anointed as the shiny new technological wonder that will save us all, or more precisely, that will give the privileged classes of the industrial world a new set of excuses for clinging to some semblance of their current lifestyles for a little while longer. Mention the growing list of things that have previously occupied that hallowed but inevitably temporary status, and you can count on either busy silence or a flustered explanation why it really is different this time.
There may not be that many of us who get past the nonserious proposals, ask the necessary but unwelcome questions about the technosavior du jour, and embrace LESS while there’s still time to do so a step at a time. I’m convinced, though, that those who manage these things are going to be the ones who make a difference in the shape the future will have on the far side of the crisis years ahead. Let go of the futile struggle to sustain the unsustainable, take the time and money and other resources that might be wasted in that cause and do something less foredoomed with them, and there’s a lot that can still be done, even in the confused and calamitous time that’s breaking over us right now. In the posts immediately ahead, as already mentioned, I’ll discuss some of the options; no doubt many of my readers will be able to think of options of their own, for that matter.
I’ve noted before more than once that the collapse of industrial society isn’t something located off in the nearer or further future; it’s something that got under way a good many years ago, has been accelerating around us for decades, and is simply hitting one of the rougher patches of the normal process of decline and fall just now. Most of the nonserious proposals just referred to start from the insistence that that can’t happen. Comforting in the short term, that insistence is a rich source of disaster and misery from any longer perspective, and the sooner each of us gets over it and starts to survey the wreckage around us, the better. Then we can make camp in the ruins, light a fire, get some soup heating in a salvaged iron pot, and begin to talk about where we can go from here.