AODA Blog

Star's Reach: A Novel of the Deindustrial Future

Sat, 2014-04-19 19:18
I'm delighted to report that my deindustrial novel Star's Reach, which appeared in episodes as a blog between 2009 and late last year, is now available in print and ebook formats from Founders House Publishing. I've suggested here more than once that narratives are among the most important tools we have for understanding and shaping the future; from that perspective, Star's Reach is a contribution to imagining a future that isn't locked inside the Hobson's choice between business as usual and overnight catastrophe that's frozen into place in the collective imagination of our time. If a story of adventure and alien contact in 25th-century neomedieval America appeals to you, you might want to give it a read!

The End of Employment

Wed, 2014-04-16 16:51
Nothing is easier, as the Long Descent begins to pick up speed around us, than giving in to despair—and nothing is more pointless. Those of us who are alive today are faced with the hugely demanding task of coping with the consequences of industrial civilization’s decline and fall, and saving as many as possible of the best achievements of the last few centuries so that they can cushion the descent and enrich the human societies of the far future.  That won’t be easy; so?  The same challenge has been faced many times before, and quite often it’s been faced with relative success.
The circumstances of the present case are in some ways more difficult than past equivalents, to be sure, but the tools and the knowledge base available to cope with them are almost incomparably greater. All in all, factoring in the greater challenges and the greater resources, it’s probably fair to suggest that the challenge of our time is about on a par with other eras of decline and fall.  The only question that still remains to be settled is how many of the people who are awake to the imminence of crisis will rise to the challenge, and how many will fail to do so.
The suicide of peak oil writer Mike Ruppert two days ago puts a bit of additional emphasis on that last point. I never met Ruppert, though we corresponded back in the days when his “From The Wilderness” website was one of the few places on the internet that paid any attention at all to peak oil, and I don’t claim to know what personal demons drove him to put a bullet through his brain. Over the last eight years, though, as the project of this blog has brought me into contact with more and more people who are grappling with the predicament of our time, I’ve met a great many people whose plans for dealing with a postpeak world amount to much the same thing.  Some of them are quite forthright about it, which at least has the virtue of honesty.  Rather more of them conceal the starkness of that choice behind a variety of convenient evasions, the insistence that we’re all going to die soon anyway being far and away the most popular of these just now.
I admit to a certain macabre curiosity about how that will play out in the years ahead. I’ve suspected for a while now, for example, that the baby boomers will manage one final mediagenic fad on the way out, and the generation that marked its childhood with coonskin caps and hula hoops and its puberty with love beads and Beatlemania will finish with a fad for suicide parties, in which attendees reminisce to the sound of the tunes they loved in high school, then wash down pills with vodka and help each other tie plastic bags over their heads. Still, I wonder how many people will have second thoughts once every other option has gone whistling down the wind, and fling themselves into an assortment of futile attempts to have their cake when they’ve already eaten it right down to the bare plate. We may see some truly bizarre religious movements, and some truly destructive political ones, before those who go around today insisting that they don’t want to live in a deindustrial world finally get their wish.
There are, of course, plenty of other options. The best choice for most of us, as I’ve noted here in previous posts, follows a strategy I’ve described wryly as “collapse first and avoid the rush:”  getting ahead of the curve of decline, in other words, and downshifting to a much less extravagant lifestyle while there’s still time to pick up the skills and tools needed to do it competently. Despite the strident insistence from defenders of the status quo that anything less than business as usual amounts to heading straight back to the caves, it’s entirely possible to have a decent and tolerably comfortable life on a tiny fraction of the energy and resource base that middle class Americans think they can’t possibly do without. Mind you, you have to know how to do it, and that’s not the sort of knowledge you can pick up from a manual, which is why it’s crucial to start now and get through the learning curve while you still have the income and the resources to cushion the impact of the inevitable mistakes.
This is more or less what I’ve been saying for eight years now. The difficulty at this stage in the process, though, is that a growing number of Americans are running out of time. I don’t think it’s escaped the notice of many people in this country that despite all the cheerleading from government officials, despite all the reassurances from dignified and clueless economists, despite all those reams of doctored statistics gobbled down whole by the watchdogs-turned-lapdogs of the media and spewed forth undigested onto the evening news, the US economy is not getting better.  Outside a few privileged sectors, times are hard and getting harder; more and more Americans are slipping into the bleak category of the long-term unemployed, and a great many of those who can still find employment work at part-time positions for sweatshop wages with no benefits at all.
Despite all the same cheerleading, reassurances, and doctored statistics, furthermore, the US economy is not going to get better: not for more than brief intervals by any measure, and not at all if “better”  means returning to some equivalent of America’s late 20th century boomtime. Those days are over, and they will not return. That harsh reality is having an immediate impact on some of my readers already, and that impact will only spread as time goes on. For those who have already been caught by the economic downdrafts, it’s arguably too late to collapse first and avoid the rush; willy-nilly, they’re already collapsing as fast as they can, and the rush is picking up speed around them as we speak.
For those who aren’t yet in that situation, the need to make changes while there’s still time to do so is paramount, and a significant number of my readers seem to be aware of this. One measure of that is the number of requests for personal advice I field, which has gone up steeply in recent months. Those requests cover a pretty fair selection of the whole gamut of human situations in a failing civilization, but one question has been coming up more and more often of late: the question of what jobs might be likely to provide steady employment as the industrial economy comes apart.
That’s a point I’ve been mulling over lately, since its implications intersect the whole tangled web in which our economy and society are snared just now. In particular, it assumes that the current way of bringing work together with workers, and turning the potentials of human mind and muscle toward the production of goods and services, is likely to remain in place for the time being, and it’s becoming increasingly clear to me that this won’t be the case.
It’s important to be clear on exactly what’s being discussed here. Human beings have always had to produce goods and services to stay alive and keep their families and communities going; that’s not going to change. In nonindustrial societies, though, most work is performed by individuals who consume the product of their own labor, and most of the rest is sold or bartered directly by the people who produce it to the people who consume it. What sets the industrial world apart is that a third party, the employer, inserts himself into this process, hiring people to produce goods and services and then selling those goods and services to buyers.  That’s employment, in the modern sense of the word; most people think of getting hired by an employer, for a fixed salary or wage, to produce goods and services that the employer then sells to someone else, as the normal and natural state of affairs—but it’s a state of affairs that is already beginning to break down around us, because the surpluses that make that kind of employment economically viable are going away.
Let’s begin with the big picture. In any human society, whether it’s a tribe of hunter-gatherers, an industrial nation-state, or anything else, people apply energy to raw materials to produce goods and services; this is what we mean by the word “economy.” The goods and services that any economy can produce are strictly limited by the energy sources and raw materials that it can access.
A principle that ecologists call Liebig’s law of the minimum is relevant here: the amount of anything  that a given species or ecosystem can produce in a given place and time is limited by whichever resource is in shortest supply. Most people get that when thinking about the nonhuman world; it makes sense that plants can’t use extra sunlight to make up for a shortage of water, and that you can’t treat soil deficient in phosphates by adding extra nitrates. It’s when you apply this same logic to human societies that the mental gears jam up, because we’ve been told so often that one resource can always be substituted for another that most people believe it without a second thought.
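For readers who like to see the logic spelled out, here is a minimal sketch in Python of how Liebig's law works in practice; the crop, the resource names, and every number below are invented for illustration, not drawn from any ecological dataset.

```python
# Toy illustration of Liebig's law of the minimum: output is capped by
# whichever resource runs out first, and surpluses of everything else
# change nothing. All figures are invented for the example.

def liebig_output(available, required_per_unit):
    """Units of output possible, given resources on hand and the amount
    of each resource needed per unit of output."""
    return min(available[r] / required_per_unit[r] for r in required_per_unit)

# Hypothetical crop: resources needed per bushel
needs = {"water": 50.0, "nitrates": 2.0, "phosphates": 1.0}
# What a hypothetical field actually has on hand
field = {"water": 10000.0, "nitrates": 400.0, "phosphates": 30.0}

print(liebig_output(field, needs))  # 30.0 bushels: phosphates set the limit

field["nitrates"] *= 2              # pile on extra nitrates...
print(liebig_output(field, needs))  # ...still 30.0; the minimum still rules
```

Swap concentrated energy in for the phosphates and the same arithmetic applies to a technic society, which is where the discussion below is headed.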
What’s going on here, though, is considerably more subtle than current jargon reflects. Examine most of the cases of resource substitution that find their way into economics textbooks, and you’ll find that what’s happened is that a process of resource extraction that uses less energy on a scarcer material has been replaced by another process that takes more energy but uses more abundant materials. The shift from high-quality iron ores to low-grade taconite that reshaped the iron industry in the 20th century, for example, was possible because ever-increasing amounts of highly concentrated energy could be put into the smelting process without making the resulting iron too expensive for the market.
The point made by this and comparable examples is applicable across the board to what I’ve termed technic societies, that subset of human societies—ours is the first, though probably not the last—in which a large fraction of total energy per capita comes from nonbiological sources and is put to work by way of  machines rather than human or animal muscles.  Far more often than not, in such societies, concentrated energy is the limiting resource. Given an abundant enough supply of concentrated energy at a low enough price, it would be possible to supply a technic society with raw materials by extracting dissolved minerals from seawater or chewing up ordinary rock to get a part per million or so of this or that useful element. Lacking that—and there are good reasons to think that human societies will always be lacking that—access to concentrated energy is where Liebig’s law bites down hard.
Another way to make this same point is to think of how much of any given product a single worker can make in a day using a set of good hand tools, and then to compare that to the quantity of the same thing that the same worker could make using the successive generations of factory equipment, from the steam-driven and belt-fed power tools of the late 19th century straight through to the computerized milling machines and assembly-line robots of today. The difference can be expressed most clearly as a matter of the amount of energy being applied directly and indirectly to the manufacturing process—not merely the energy driving the tools through the manufacturing process, but the energy that goes into manufacturing and maintaining the tools, supporting the infrastructure needed for manufacture and maintenance, and so on through the whole system involved in the manufacturing process.
Maverick economist E.F. Schumacher, whose work has been discussed in this blog many times already, pointed out that the cost per worker of equipping a workplace is one of the many crucial factors that  mainstream economic thought invariably neglects. That cost is usually expressed in financial terms, but underlying the abstract tokens we call money is a real cost in energy, expressed in terms of the goods and services that have to be consumed in the process of equipping and maintaining the workplace. If you have energy to spare, that’s not a problem; if you don’t, on the other hand, you’re actually better off using a less complex technology—what Schumacher called “intermediate technology” and the movement in which I studied green wizardry thirty years ago called “appropriate technology.”
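As a rough, made-up illustration of that point (the figures below are invented for the sake of the example, and they are not Schumacher's), consider what happens to the net return per worker from a heavily equipped workplace versus a simply equipped one as the price of energy climbs:

```python
# Toy comparison of two ways of equipping one worker. "energy_used" lumps
# together the direct energy consumed and the amortized energy embodied in
# the equipment; all numbers are invented for illustration only.

def net_return(output_value, energy_used, energy_price):
    """Value produced per worker per year, minus the energy bill."""
    return output_value - energy_used * energy_price

factory  = {"output_value": 100_000, "energy_used": 900}  # capital-intensive
workshop = {"output_value": 30_000,  "energy_used": 50}   # intermediate tools

for price in (20, 80, 120):
    f = net_return(factory["output_value"], factory["energy_used"], price)
    w = net_return(workshop["output_value"], workshop["energy_used"], price)
    winner = "factory" if f > w else "workshop"
    print(f"energy price {price:>3}: factory {f:>8,}  workshop {w:>7,}  -> {winner}")
```

With energy cheap, the expensive workplace wins handily; once energy is dear enough, the simpler toolkit comes out ahead, which is Schumacher's argument in miniature.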
The cost per worker of equipping a workplace, in turn, also has a political dimension—a point that Schumacher did not neglect, though nearly all other economists pretend that it doesn’t exist. The more costly it is to equip a workplace, the more certain it is that workers won’t be able to set themselves up in business, and the more control the very rich will then have over economic production and the supply of jobs. As Joseph Tainter pointed out in The Collapse of Complex Societies, social complexity correlates precisely with social hierarchy; one of the functions of complexity, in the workplace as elsewhere, is thus to maintain existing social pecking orders.
Schumacher’s arguments, though, focused on the Third World nations of his own time, which had very little manufacturing capacity at all—most of them, remember, had been colonies of European empires, assigned the role of producing raw materials and buying finished products from the imperial center as part of the wealth pump that drove them into grinding poverty while keeping their imperial overlords rich. He concentrated on advising client nations on how to build their own economies and extract themselves from the political grip of their former overlords, who were usually all too eager to sell them high-tech factories that their upper classes inevitably ended up controlling. The situation is considerably more challenging when your economy is geared to immense surpluses of concentrated energy, and the supply of energy begins to run short—and of course that’s the situation we’re in today.
Even if it were just a matter of replacing factory equipment, that would be a huge challenge, because all those expensive machines—not to mention the infrastructure that manufactures them, maintains them, supplies them, and integrates their products into the wider economy—count as sunk costs, subject to what social psychologists call the “Concorde fallacy,” the conviction that it’s less wasteful to keep on throwing money into a failing project than to cut your losses and do something else. The real problem is that it’s not just factory equipment; the entire economy has been structured from the ground up to use colossal amounts of highly concentrated energy, and everything that’s been invested in that economy since the beginning of the modern era thus counts as a sunk cost to one degree or another.
What makes this even more challenging is that very few people in the modern industrial world actually produce goods and services for consumers, much less for themselves, by applying energy to raw materials. The vast majority of today’s employees, and in particular all those who have the wealth and influence that come with high social status, don’t do this.  Executives, brokers, bankers, consultants, analysts, salespeople—well, I could go on for pages: the whole range of what used to be called white-collar jobs exists to support the production of goods and services by the working joes and janes managing all the energy-intensive machinery down there on the shop floor. So does the entire vast maze of the financial industry, and so do the legions of government bureaucrats—local, state, and federal—who manage, regulate, or oversee one or another aspect of economic activity.
All these people are understandably just as interested in keeping their jobs as the working joes and janes down there on the shop floor, and yet the energy surpluses that made it economically viable to perch such an immensely complex infrastructure on top of the production of goods and services for consumers are going away. The result is a frantic struggle on everyone’s part to make sure that the other guy loses his job first. It’s a struggle that all of them will ultimately lose—as the energy surplus needed to support it dwindles away, so will the entire system that’s perched on that high but precarious support—and so, as long as that system remains in place, getting hired by an employer, paid a regular wage or salary, and given work and a workplace to produce goods and services that the employer then sells to someone else, is going to become increasingly rare and increasingly unrewarding. 
That transformation is already well under way. Nobody I know personally who works for an employer in the sense I’ve just outlined is prospering in today’s American economy.  Most of the people I know who are employees in the usual sense of the word are having their benefits slashed, their working conditions worsened, their hours cut, and their pay reduced by one maneuver or another, and the threat of being laid off is constantly hovering over their heads.  The few exceptions are treading water and hoping to escape the same fate. None of this is accidental, and none of it is merely the result of greed on the part of the very rich, though admittedly the culture of executive kleptocracy at the upper end of the American social pyramid is making things a good deal worse than they might otherwise be.
The people I know who are prospering right now are those who produce goods and services for their own use, and provide goods and services directly to other people, without having an employer to provide them with work, a workplace, and a regular wage or salary. Some of these people have to stay under the radar screen of the current legal and regulatory system, since the people who work in that system are trying to preserve their own jobs by making life difficult for those who try to do without their services. Others can do things more openly. All of them have sidestepped as many as possible of the infrastructure services that are supposed to be part of an employee’s working life—for example, they aren’t getting trained at universities, since the US academic industry these days is just another predatory business sector trying to keep itself afloat by running others into the ground, and they aren’t going to banks for working capital for much the same reason. They’re using their own labor, their own wits, and their own personal connections with potential customers, to find a niche in which they can earn the money (or barter for the goods) they need or want.
I’d like to suggest that this is the wave of the future—not least because this is how economic life normally operates in nonindustrial societies, where the vast majority of people in the workforce are directly engaged in the production of goods and services for themselves and their own customers.  The surplus that supports all those people in management, finance, and so on is a luxury that nonindustrial societies don’t have. In the most pragmatic of economic senses, collapsing now and avoiding the rush involves getting out of a dying model of economics before it drags you down, and finding your footing in the emerging informal economy while there’s still time to get past the worst of the learning curve.
Playing by the rules of a dying economy, that is, is not a strategy with a high success rate or a long shelf life. Those of my readers who are still employed in the usual sense of the term may choose to hold onto that increasingly rare status, but it’s not wise for them to assume that such arrangements will last indefinitely; using the available money and other resources to get training, tools, and skills for some other way of getting by would probably be a wise strategy. Those of my readers who have already fallen through the widening cracks of the employment economy will have a harder row to hoe in many cases; for them, the crucial requirement is getting access to food, shelter, and other necessities while figuring out what to do next and getting through any learning curve that might be required.
All these are challenges; still, like the broader challenge of coping with the decline and fall of a civilization, they are challenges that countless other people have met in other places and times. Those who are willing to set aside currently popular fantasies of entitlement and the fashionable pleasures of despair will likely be in a position to do the same thing this time around, too.

The Four Industrial Revolutions

Wed, 2014-04-09 16:34
Last week’s post on the vacuous catchphrases that so often substitute for thought in today’s America referenced only a few examples of the species under discussion. It might someday be educational, or at least entertaining, to write a sequel to H.L. Mencken’s The American Credo, bringing his choice collection of thoughtstoppers up to date with the latest fashionable examples; still, that enticing prospect will have to wait for some later opportunity. In the meantime, those who liked my suggestion of Peak Oil Denial Bingo will doubtless want to know that cards can now be downloaded for free.
What I’d like to do this week is talk about another popular credo, one that plays a very large role in blinding people nowadays to the shape of the future looming up ahead of us all just now. In an interesting display of synchronicity, it came up in a conversation I had while last week’s essay was still being written. A friend and I were talking about the myth of progress, the facile and popular conviction that all human history follows an ever-ascending arc from the caves to the stars; my friend noted how disappointed he’d been with a book about the future that backed away from tomorrow’s challenges into the shelter of a comforting thoughtstopper:  “Technology will always be with us.”
Let’s take a moment to follow the advice I gave in last week’s post and think about what, if anything, that actually means. Taken in the most literal sense, it’s true but trivial. Toolmaking is one of our species’ core evolutionary strategies, and so it’s a safe bet that human beings will have some variety of technology or other as long as our species survives. That requirement could just as easily be satisfied, though, by a flint hand axe as by a laptop computer—and a flint hand axe is presumably not what people who use that particular thoughtstopper have in mind.
Perhaps we might rephrase the credo, then, as “modern technology will always be with us.” That’s also true in a trivial sense, and false in another, equally trivial sense. In the first sense, every generation has its own modern technology; the latest up-to-date flint hand axes were, if you’ll pardon the pun, cutting-edge technology in the time of the Neanderthals.  In the second sense, much of every generation’s modern technology goes away promptly with that generation; whichever way the future goes, much of what counts as modern technology today will soon be no more modern and cutting-edge than eight-track tape players or Victorian magic-lantern projectors. That’s as true if we get a future of continued progress as it is if we get a future of regression and decline.
Perhaps our author means something like “some technology at least as complex as what we have now, and fulfilling most of the same functions, will always be with us.” This is less trivial but it’s quite simply false, as historical parallels show clearly enough. Much of the technology of the Roman era, from wheel-thrown pottery to central heating, was lost in most of the western Empire and had to be brought in from elsewhere centuries later.  In the dark ages that followed the fall of Mycenean Greece, even so simple a trick as the art of writing was lost, while the history of Chinese technology before the modern era is a cycle in which many discoveries made during the heyday of each great dynasty were lost in the dark age that followed its decline and fall, and had to be rediscovered when stability and prosperity returned. For people living in each of these dark ages, technology comparable to what had been in use before the dark age started was emphatically not always with them.
For that matter, who is the “us” that we’re discussing here? Many people right now have no access to the technologies that middle-class Americans take for granted. For all the good that modern technology does them, today’s rural subsistence farmers, laborers in sweatshop factories, and the like might as well be living in some earlier era. I suspect our author is not thinking about such people, though, and the credo thus might be phrased as “some technology at least as complex as what middle-class people in the industrial world have now, providing the same services they have come to expect, will always be available to people of that same class.” Depending on how you define social classes, that’s either true but trivial—if “being middle class” equals “having access to the technology today’s middle classes have,” no middle class people will ever be deprived of such a technology because, by definition, there will be no middle class people once the technology stops being available—or nontrivial but clearly false—plenty of people who think of themselves as middle class Americans right now are losing access to a great deal of technology as economic contraction deprives them of their jobs and incomes and launches them on new careers of downward mobility and radical impoverishment.
Well before the analysis got this far, of course, anyone who’s likely to mutter the credo “Technology will always be with us” will have jumped up and yelled, “Oh for heaven’s sake, you know perfectly well what I mean when I use that word! You know, technology!”—or words to that effect. Now of course I do know exactly what the word means in that context: it’s a vague abstraction with no real conceptual meaning at all, but an ample supply of raw emotional force.  Like other thoughtstoppers of the same kind, it serves as a verbal bludgeon to prevent people from talking or even thinking about the brittle, fractious, ambivalent realities that shape our lives these days. Still, let’s go a little further with the process of analysis, because it leads somewhere that’s far from trivial.
Keep asking a believer in the credo we’re discussing the sort of annoying questions I’ve suggested above, and sooner or later you’re likely to get a redefinition that goes something like this: “The coming of the industrial revolution was a major watershed in human history, and no future society of any importance will ever again be deprived of the possibilities opened up by that revolution.” Whether or not that turns out to be true is a question nobody today can answer, but it’s a claim worth considering, because history shows that enduring shifts of this kind do happen from time to time. The agricultural revolution of c. 9000 BCE and the urban revolution of c. 3500 BCE were both decisive changes in human history.  Even though there were plenty of nonagricultural societies after the first, and plenty of nonurban societies after the second, the possibilities opened up by each revolution were always options thereafter, when and where ecological and social circumstances permitted.
Some 5500 years passed between the agricultural revolution and the urban revolution, and since it’s been right around 5500 years since the urban revolution began, a case could probably be made that we were due for another. Still, let’s take a closer look at the putative third revolution. What exactly was the industrial revolution? What changed, and what future awaits those changes?
That’s a far more subtle question than it might seem at first glance, because the cascade of changes that fit under the very broad label “the industrial revolution” weren’t all of a piece. I’d like to suggest, in fact, that there was not one industrial revolution, but four of them—or, more precisely, three and a half. Lewis Mumford’s important 1934 study Technics and Civilization identified three of those revolutions, though the labels he used for them—the eotechnic, paleotechnic, and neotechnic phases—shoved them into a linear scheme of progress that distorts many of their key features. Instead, I propose to borrow the same habit people use when they talk about the Copernican and Darwinian revolutions, and name the revolutions after individuals who played crucial roles in making them happen.
First of all, then—corresponding to Mumford’s eotechnic phase—is the Baconian revolution, which got under way around 1600. It takes its name from Francis Bacon, who was the first significant European thinker to propose that what he called natural philosophy and we call science ought to be reoriented away from the abstract contemplation of the cosmos, and toward making practical improvements in the technologies of the time. Such improvements were already under way, carried out by a new class of “mechanicks” who had begun to learn by experience that building a faster ship, a sturdier plow, a better spinning wheel, or the like could be a quick route to prosperity, and encouraged by governments eager to cash in new inventions for the more valued coinage of national wealth and military victory.
The Baconian revolution, like those that followed it, brought with it a specific suite of technologies. Square-rigged ships capable of  long deepwater voyages revolutionized international trade and naval warfare; canals and canal boats had a similar impact on domestic transport systems. New information and communication media—newspapers, magazines, and public libraries—were crucial elements of the Baconian technological suite, which also encompassed major improvements in agriculture and in metal and glass manufacture, and significant developments in the use of wind and water power, as well as the first factories using division of labor to allow mass production.
The second revolution—corresponding to Mumford’s paleotechnic phase—was the Wattean revolution, which got started around 1780. This takes its name, of course, from James Watt, whose redesign of the steam engine turned it from a convenience for the mining industry to the throbbing heart of a wholly new technological regime, replacing renewable energy sources with concentrated fossil fuel energy and putting that latter to work in every economically viable setting. The steamship was the new vehicle of international trade, the railroad the corresponding domestic transport system; electricity came in with steam, and so did the telegraph, the major new communications technology of the era, while mass production of steel via the Bessemer process had a massive impact across the economic sphere.
The third revolution—corresponding to Mumford’s neotechnic phase—was the Ottonian revolution, which took off around 1890. I’ve named this revolution after Nikolaus Otto, who invented the four-cycle internal combustion engine in 1876 and kickstarted the process that turned petroleum from a source of lamp fuel to the resource that brought the industrial age to its zenith. In the Ottonian era, international trade shifted to diesel-powered ships, supplemented later on by air travel; the domestic transport system was the automobile; the rise of vacuum-state electronics made radio (including television, which is simply an application of radio technology) the major new communications technology; and the industrial use of organic chemistry, turning petroleum and other fossil fuels into feedstocks for plastics, gave the Ottonian era its most distinctive materials.
The fourth, partial revolution, which hadn’t yet begun when Mumford wrote his book, was the Fermian revolution, which can be dated quite precisely to 1942 and is named after Enrico Fermi, the designer and builder of the first successful nuclear reactor.  The keynote of the Fermian era was the application of subatomic physics, not only in nuclear power but also in solid-state electronic devices such as the transistor and the photovoltaic cell. In the middle years of the 20th century, a great many people took it for granted that the Fermian revolution would follow the same trajectory as its Wattean and Ottonian predecessors: nuclear power would replace diesel power in freighters, electricity would elbow aside gasoline as the power source for domestic transport, and nucleonics would become as important as electronics as a core element in new technologies yet unimagined.
Unfortunately for those expectations, nuclear power turned out to be a technical triumph but an economic flop.  Claims that nuclear power would make electricity too cheap to meter ran face first into the hard fact that no nation anywhere has been able to have a nuclear power industry without huge and ongoing government subsidies, while nuclear-powered ships were relegated to the navies of very rich nations, which didn’t have to turn a profit and so could afford to ignore the higher construction and operating costs. Nucleonics turned out to have certain applications, but nothing like as many or as lucrative as the giddy forecasts of 1950 suggested.  Solid state electronics, on the other hand, turned out to be economically viable, at least in a world with ample fossil fuel supplies, and made the computer and the era’s distinctive communications medium, the internet, economically viable propositions.
The Wattean, Ottonian, and Fermian revolutions thus had a core theme in common. Each of them relied on a previously untapped energy resource—coal, petroleum, and uranium, respectively—and set out to build a suite of technologies to exploit that resource and the forms of energy it made available. The scientific and engineering know-how that was required to manage each power source then became the key toolkit for the technological suite that unfolded from it; from the coal furnace, the Bessemer process for making steel was a logical extension, just as the knowledge of hydrocarbon chemistry needed for petroleum refining became the basis for plastics and the chemical industry, and the same revolution in physics that made nuclear fission reactors possible also launched solid state electronics—it’s not often remembered, for example, that Albert Einstein got his Nobel prize for understanding the process that makes PV cells work, not for the theory of relativity.
Regular readers of this blog will probably already have grasped the core implication of this common theme. The core technologies of the Wattean, Ottonian, and Fermian eras all depend on access to large amounts of specific nonrenewable resources. Fermian technology, for example, demands fissionable material for its reactors and rare earth elements for its electronics, among many other things; Ottonian technology demands petroleum and natural gas, and some other resources; Wattean technology demands coal and iron ore. It’s sometimes possible to substitute one set of materials for another—say, to process coal into liquid fuel—but there’s always a major economic cost involved, even if there’s an ample and inexpensive supply of the substitute that isn’t already needed for some other purpose.
In today’s world, by contrast, the resources needed for all three technological suites are being used at breakneck rates and thus are either already facing depletion or will do so in the near future. When coal has already been mined so heavily that sulfurous, low-energy brown coal—the kind that miners in the 19th century used to discard as waste—has become the standard fuel for coal-fired power plants, for example, it’s a bit late to talk about a coal-to-liquids program to replace any serious fraction of the world’s petroleum consumption: the attempt to do so would send coal prices soaring to economy-wrecking heights.  Richard Heinberg has pointed out in his useful book Peak Everything, for that matter, that a great deal of the coal still remaining in the ground will take more energy to extract than it will produce when burnt, making it an energy sink rather than an energy source.
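A quick worked example may make the “energy sink” point concrete; the seam names and every figure below are invented for illustration, not taken from Heinberg's book.

```python
# Toy net-energy check: a fuel counts as an energy source only when the
# energy it delivers exceeds the energy spent extracting and hauling it.
# Figures are in gigajoules per tonne and are invented for the example.

def eroei(energy_delivered, energy_invested):
    """Energy returned on energy invested; below 1.0 the fuel is a sink."""
    return energy_delivered / energy_invested

seams = {
    "thick shallow seam, 1880s": {"yield": 25.0, "extraction_cost": 1.0},
    "deep low-grade brown coal": {"yield": 9.0,  "extraction_cost": 10.5},
}

for name, seam in seams.items():
    ratio = eroei(seam["yield"], seam["extraction_cost"])
    verdict = "energy source" if ratio >= 1.0 else "energy sink"
    print(f"{name}: EROEI {ratio:.1f} -> {verdict}")
```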
Thus we can expect very large elements of Wattean, Ottonian, and Fermian technologies to stop being economically viable in the years ahead, as depletion drives up resource costs and the knock-on effects of the resulting economic contraction force down demand. That doesn’t mean that every aspect of those technological suites will go away, to be sure. It’s not at all unusual, in the wake of a fallen civilization, to find “orphan technologies” that once functioned as parts of a coherent technological suite, still doing their jobs long after the rest of the suite has fallen out of use. Just as Roman aqueducts kept bringing water to cities in the post-Roman dark ages whose inhabitants had neither the resources nor the knowledge to build anything of the kind, it’s quite likely that (say) hydroelectric facilities in certain locations will stay in use for centuries to come, powering whatever electrical equipment can be maintained or built from local resources, even if the people who tend the dams and use the electricity have long since lost the capacity to build turbines, generators, or dams at all.
Yet there’s another issue involved, because the first of the four industrial revolutions I’ve discussed in this essay—the Baconian revolution—was not dependent on nonrenewable resources. The suite of technologies that unfolded from Francis Bacon’s original project used the same energy sources that everyone in the world’s urban-agricultural societies had been using for more than three thousand years: human and animal muscle, wind, water, and heat from burning biomass. Unlike the revolutions that followed it, to put the same issue in a different but equally relevant way, the Baconian revolution worked within the limits of the energy budget the Earth receives each year from the Sun, instead of drawing down stored sunlight from the Earth’s store of fossil carbon or its much more limited store of fissionable isotopes. The Baconian era simply used that annual solar budget in a more systematic way than previous societies managed, by directing the considerable intellectual skills of the natural philosophers of the day toward practical ends.
Because of their dependence on nonrenewable resources, the three later revolutions were guaranteed all along to be transitory phases. The Baconian revolution need not be, and I think that there’s a noticeable chance that it will not be. By that I mean, to begin with, that the core intellectual leap that made the Baconian revolution possible—the  scientific method—is sufficiently widespread at this point that with a little help, it may well get through the decline and fall of our civilization and become part of the standard toolkit of future civilizations, in much the same way that classical logic survived the wreck of Rome to be taken up by successor civilizations across the breadth of the Old World.
Still, that’s not all I mean to imply here. The specific technological suite that developed in the wake of the Baconian revolution will still be viable in a post-fossil fuel world, wherever the ecological and social circumstances will permit it to exist at all. Deepwater maritime shipping, canal-borne transport across nations and subcontinents, mass production of goods using the division of labor as an organizing principle, extensive use of wind and water power, and widespread literacy and information exchange involving print media, libraries, postal services, and the like, are all options available to societies in the deindustrial world. So are certain other technologies that evolved in the post-Baconian era, but fit neatly within the Baconian model: solar thermal technologies, for example, and those forms of electronics that can be economically manufactured and powered with the limited supplies of concentrated energy a sustainable society will have on hand.
I’ve suggested in previous posts here, and in my book The Ecotechnic Future, that our current industrial society may turn out to be merely the first, most wasteful, and least durable of what might  best be called “technic societies”—that is, human societies that get a large fraction of their total energy supply from sources other than human and animal muscle, and support complex technological suites on that basis. The technologies of the Baconian era, I propose, offer a glimpse of what an emerging ecotechnic society might look like in practice—and a sense of the foundations on which the more complex ecotechnic societies of the future will build.
When the book mentioned at the beginning of this essay claimed that “technology will always be with us,” it’s a safe bet that the author wasn’t thinking of tall ships, canal boats, solar greenhouses, and a low-power global radio net, much less the further advances along the same lines that might well be possible in a post-fossil fuel world. Still, it’s crucial to get outside the delusion that the future must either be a flashier version of the present or a smoldering wasteland full of bleached bones, and start to confront the wider and frankly more interesting possibilities that await our descendants.
***************
Along these same lines, I’d like to remind readers that this blog’s second post-peak oil science fiction contest has less than a month left to run. Those of you who are still working on stories need to get them finished, posted online, and linked to a comment on this blog before May 1 to be eligible for inclusion in the second After Oil anthology. Get ’em in!

Mentats Wanted, Will Train

Wed, 2014-04-02 17:17
The theme of last week’s post here on The Archdruid Report—the strategy of preserving or reviving technologies for the deindustrial future now, before the accelerating curve of decline makes that task more difficult than it already is—can be applied very broadly indeed. Just now, courtesy of the final blowoff of the age of cheap energy, we have relatively easy access to plenty of information about what worked in the past; some other resources are already becoming harder to get, but there’s still time and opportunity to accomplish a great deal.

I’ll be talking about some of the possibilities as we proceed, and with any luck, other people will get to work on projects of their own that I haven’t even thought of. This week, though, I want to take Gustaf Erikson’s logic in a direction that probably would have made the old sea dog scratch his head in puzzlement, and talk about how a certain set of mostly forgotten techniques could be put back into use right now to meet a serious unmet need in contemporary American society.
The unmet need I have in mind is unusually visible just now, courtesy of the recent crisis in the Ukraine. I don’t propose to get into the whys and wherefores of that crisis just now, except to note that since the collapse of the Austro-Hungarian Empire, the small nations of eastern Europe have been grist between the spinning millstones of Russia and whichever great power dominates western Europe. It’s not a comfortable place to be; Timothy Snyder’s terse description of 20th century eastern Europe as “bloodlands” could be applied with equal force to any set of small nations squeezed between empires, and it would take quite a bit of unjustified faith in human goodness to think that the horrors of the last century have been safely consigned to the past.
The issue I want to discuss, rather, has to do with the feckless American response to that crisis. Though I’m not greatly interested in joining the chorus of American antigovernment activists fawning around Vladimir Putin’s feet these days, it’s fair to say that he won this one. Russia’s actions caught the United States and EU off balance, secured the Russian navy’s access to the Black Sea and the Mediterranean, and boosted Putin’s already substantial popularity at home. By contrast, Obama came across as amateurish and, worse, weak.  When Obama announced that the US retaliation would consist of feeble sanctions against a few Russian banks and second-string politicians, the world rolled its eyes, and the Russian Duma passed a resolution scornfully requesting Obama to apply those same sanctions to every one of its members.
As the crisis built, there was a great deal of talk in the media about Europe’s dependence on Russian natural gas, and the substantial influence over European politics that Russia has as a result of that unpalatable fact. It’s a major issue, and unlikely to go away any time soon; around a third of the natural gas that keeps Europeans from shivering in the dark each winter comes from Russian gas fields, and the Russian government has made no bones about the fact that it could just as well sell that gas to somebody to Russia’s south or east instead. It was in this context that American politicians and pundits started insisting at the top of their lungs that the United States had a secret weapon against the Sov—er, Russian threat: exports of abundant natural gas from America, which would replace Russian gas in Europe’s stoves, furnaces, and power plants.
As Richard Heinberg pointed out trenchantly a few days back in a typically spot-on essay, there’s only one small problem with this cozy picture: the United States has no spare natural gas to export.  It’s a net importer of natural gas, as it typically burns over a hundred billion more cubic feet of gas each month than it produces domestically.  What’s more, even according to the traditionally rose-colored forecasts issued by the EIA, it’ll be 2020 at the earliest before the United States has any natural gas to spare for Europe’s needs. Those forecasts, by the way, blithely assume that the spike in gas production driven by the recent fracking bubble will just keep on levitating upwards for the foreseeable future; if this reminds you of the rhetoric surrounding tech stocks in the runup to 2000, housing prices in the runup to 2008, or equivalent phenomena in the history of any other speculative swindle you care to name, let’s just say you’re not alone.
According to those forecasts that start from the annoying fact that the laws of physics and geology do actually apply to us, on the other hand, the fracking boom will be well into bust territory by 2020, and those promised torrents of natural gas that will allegedly free Europe from Russian influence will therefore never materialize at all. At the moment, furthermore, boasting about America’s alleged surplus of natural gas for export is particularly out of place, because US natural gas inventories currently in storage are less than half their five-year average level for this time of year, having dropped precipitously since December. Since all this is public information, we can be quite confident that the Russians are aware of it, and this may well explain some of the air of amused contempt with which Putin and his allies have responded to American attempts to rattle a saber that isn’t there.
Any of the politicians and pundits who participated in that futile exercise could have found out the problems with their claim in maybe two minutes of internet time. Any of the reporters and editors who printed those claims at face value could have done the same thing. I suppose it’s possible that the whole thing was a breathtakingly cynical exercise of Goebbels’ “Big Lie” principle, intended to keep Americans from noticing that Obama’s people armed themselves with popguns for a shootout at the OK Corral. I find this hard to believe, though, because the same kind of thinking—or, more precisely, nonthinking—is so common in America these days.
It’s indicative that my post here two weeks ago brought in a bumper crop of the same kind of illogic. My post took on the popular habit of using the mantra “it’s different this time” to insist that the past has nothing to teach us about the present and the future. Every event, I pointed out, has some features that set it apart from others, and other features that it shares in common with others; pay attention to the common features and you can observe the repeating patterns, which can then be adjusted to take differences into account.  Fixate on the differences and deny the common features, though, and you have no way to test your beliefs—which is great if you want to defend your beliefs against reasonable criticism, but not so useful if you want to make accurate predictions about where we’re headed.
Did the critics of this post—and there were quite a few of them—challenge this argument, or even address it? Not in any of the peak oil websites I visited. What happened instead was that commenters brandished whatever claims about the future are dearest to their hearts and then said, in so many words, “It’s different this time”—as though that somehow answered me. It was quite an impressive example of sheer incantation, the sort of thing we saw not that long ago when Sarah Palin fans were trying to conjure crude oil into America’s depleted oilfields by chanting “Drill, baby, drill” over and over again. I honestly felt as though I’d somehow dozed off at the computer and slipped into a dream in which I was addressing an audience of sheep, who responded by bleating “But it’s different this ti-i-i-i-ime” in perfect unison.
A different mantra sung to the same bleat, so to speak, seems to have been behind the politicians and pundits, and all that nonexistent natural gas they thought was just waiting to be exported to Europe. The thoughtstopping phrase here is “America has abundant reserves of natural gas.” It will doubtless occur to many of my readers that this statement is true, at least for certain values of that nicely vague term “abundant,” just as it’s true that every historical event differs in at least some way from everything that’s happened in the past, and that an accelerated program of drilling can (and in fact did) increase US petroleum production by a certain amount, at least for a while. The fact that each of these statements is trivially true does not make any of them relevant.
That is to say, a remarkably large number of Americans, including the leaders of our country and the movers and shakers of our public opinion, are so inept at the elementary skills of thinking that they can’t tell the difference between mouthing a platitude and having a clue.
I suppose this shouldn’t surprise me as much as it does. For decades now, American public life has been dominated by thoughtstoppers of this kind—short, emotionally charged declarative sentences, some of them trivial, some of them incoherent, none of them relevant and all of them offered up as sound bites by politicians, pundits, and ordinary Americans alike, as though they meant something and proved something. The redoubtable H.L. Mencken, writing at a time when such things were not quite as universal in the American mass mind as they have become since then, called them “credos.” It was an inspired borrowing from the Latin credo, “I believe,” but its relevance extends far beyond the religious sphere.
Just as plenty of believing Americans in Mencken’s time liked to affirm their fervent faith in the doctrines of whatever church they attended without having the vaguest idea of what those doctrines actually meant, a far vaster number of Americans these days—religious, irreligious, antireligious, or concerned with nothing more supernatural than the apparent capacity of Lady Gaga’s endowments to defy the laws of gravity—gladly affirm any number of catchphrases about which they seem never to have entertained a single original thought. Those of my readers who have tried to talk about the future with their family and friends will be particularly familiar with the way this works; I’ve thought more than once of providing my readers with Bingo cards marked with the credos most commonly used to silence discussions of our future—“they’ll think of something,” “technology can solve any problem,” “the world’s going to end soon anyway,” “it’s different this time,” and so on—with some kind of prize for whoever fills theirs up first.
The prevalence of credos, though, is only the most visible end of a culture of acquired stupidity that I’ve discussed here in previous posts, and Erik Lindberg has recently anatomized in a crisp and thoughtful blog post. That habit of cultivated idiocy is a major contributor to the crisis of our age, but a crisis is always an opportunity, and with that in mind, I’d like to propose that it’s time for some of us, at least, to borrow a business model from the future, and start getting prepared for future job openings as mentats.
In Frank Herbert’s iconic SF novel Dune, as many of my readers will be aware, a revolt against computer technology centuries before the story opened led to a galaxywide ban on thinking machines—“Thou shalt not make a machine in the image of a human mind”—and a corresponding focus on developing human capacities instead of replacing them with hardware. The mentats were among the results: human beings trained from childhood to absorb, integrate, and synthesize information. Think of them as the opposite end of human potential from the sort of credo-muttering couch potatoes who seem to make up so much of the American population these days:  ask a mentat if it really is different this time, and after he’s spent thirty seconds or so reviewing the entire published literature on the subject, he’ll give you a crisp first-approximation analysis explaining what’s different, what’s similar, which elements of each category are relevant to the situation, and what your best course of action would be in response.
Now of course the training programs needed to get mentats to this level of function haven’t been invented yet, but the point still stands: people who know how to think, even at a less blinding pace than Herbert’s fictional characters manage, are going to be far better equipped to deal with a troubled future than those who haven’t.  The industrial world has been conducting what amounts to a decades-long experiment to see whether computers can make human beings more intelligent, and the answer at this point is a pretty firm no. In particular, computers tend to empower decision makers without making them noticeably smarter, and the result by and large is that today’s leaders are able to make bad decisions more easily and efficiently than ever before. That is to say, machines can crunch data, but it takes a mind to turn data into information, and a well-trained and well-informed mind to refine information into wisdom.
What makes a revival of the skills of thinking particularly tempting just now is that the bar is set so low. If you know how to follow an argument from its premises to its conclusion, recognize a dozen or so of the most common logical fallacies, and check the credentials of a purported fact, you’ve just left most Americans—including the leaders of our country and the movers and shakers of our public opinion—way back behind you in the dust. To that basic grounding in how to think, add a good general knowledge of history and culture and a few branches of useful knowledge in which you’ve put some systematic study, and you’re so far ahead of the pack that you might as well hang out your shingle as a mentat right away.
Now of course it may be a while before there’s a job market for mentats—in the post-Roman world, it took several centuries for those people who preserved the considerable intellectual toolkit of the classical world to find a profitable economic niche, and that required them to deck themselves out in tall hats with moons and stars on them. In the interval before the market for wizards opens up again, though, there are solid advantages to be gained by the sort of job training I’ve outlined, unfolding from the fact that having mental skills that go beyond muttering credos makes it possible to make predictions about the future that are considerably more accurate than the ones guiding most Americans today.
This has immediate practical value in all sorts of common, everyday situations these days. When all the people you know are rushing to sink every dollar they have in the speculative swindle du jour, for example, you’ll quickly recognize the obvious signs of a bubble in the offing, walk away, and keep your shirt while everyone else is losing theirs. When someone tries to tell you that you needn’t worry about energy costs or shortages because the latest piece of energy vaporware will surely solve all our problems, you’ll be prepared to ignore him and go ahead with insulating your attic, and when someone else insists that the Earth is sure to be vaporized any day now by whatever apocalypse happens to be fashionable that week, you’ll be equally prepared to ignore him and go ahead with digging the new garden bed. 
When the leaders of your country claim that an imaginary natural gas surplus slated to arrive six years from now will surely make Putin blink today, for that matter, you’ll draw the logical conclusion, and get ready for the economic and political impacts of another body blow to what’s left of America’s faltering global power and reputation. It may also occur to you—indeed, it may have done so already—that the handwaving about countering Russia is merely an excuse for building the infrastructure needed to export American natural gas to higher-paying global markets, which will send domestic gas prices soaring to stratospheric levels in the years ahead; this recognition might well inspire you to put a few extra inches of insulation up there in the attic, and get a backup heat source that doesn’t depend either on gas or on gas-fired grid electricity, so those soaring prices don’t have the chance to clobber you.
If these far from inconsiderable benefits tempt you, dear reader, I’d like to offer you an exercise as the very first step in your mentat training.  The exercise is this: the next time you catch someone (or, better yet, yourself) uttering a familiar thoughtstopper about the future—“It’s different this time,” “They’ll think of something,” “There are no limits to what human beings can achieve,” “The United States has an abundant supply of natural gas,” or any of the other entries in the long and weary list of contemporary American credos—stop right there and think about it. Is the statement true? Is it relevant? Does it address the point under discussion?  Does the evidence that supports it, if any does, outweigh the evidence against it? Does it mean what the speaker thinks it means? Does it mean anything at all?
There’s much more involved than this in learning how to think, of course, and down the road I propose to write a series of posts on the subject, using as raw material for exercises more of the popular idiocies behind which America tries to hide from the future. I would encourage all the readers of this blog to give this exercise a try, though. In an age of accelerating decline, the habit of letting arbitrary catchphrases replace actual thinking is a luxury that nobody can really afford, and those who cling to such things too tightly can expect to be blindsided by a future that has no interest in playing along with even the most fashionable credos.
*******************

In not unrelated news, I’m pleased to report that the School of Economic Science will be hosting a five-week course in London on Economics, Energy and Environment, beginning April 29 of this year, based in part on ideas from my book The Wealth of Nature. The course will finish up with a conference on June 1 at which, ahem, I’ll be one of the speakers. Details are at www.eeecourse.org.

Captain Erikson's Equation

Wed, 2014-03-26 17:15
I have yet to hear anyone in the peak oil blogosphere mention the name of Captain Gustaf Erikson of the Åland Islands and his fleet of windjammers.  For all I know, he’s been completely forgotten now, his name and accomplishments packed away in the same dustbin of forgotten history as solar steam-engine pioneer Augustin Mouchot, his near contemporary. If so, it’s high time that his footsteps sounded again on the quarterdeck of our collective imagination, because his story—and the core insight that committed him to his lifelong struggle—both have plenty to teach about the realities framing the future of technology in the wake of today’s era of fossil-fueled abundance.
Erikson, born in 1872, grew up in a seafaring family and went to sea as a ship’s boy at the age of nine. At 19 he was the skipper of a coastal freighter working the Baltic and North Sea ports; two years later he shipped out as mate on a windjammer for deepwater runs to Chile and Australia, and eight years after that he was captain again, sailing three- and four-masted cargo ships to the far reaches of the planet. A bad fall from the rigging in 1913 left his right leg crippled, and he left the sea to become a shipowner instead, buying the first of what would become the 20th century’s last major fleet of windpowered commercial cargo vessels.
It’s too rarely remembered these days that the arrival of steam power didn’t make commercial sailing vessels obsolete across the board. The ability to chug along at eight knots or so without benefit of wind was a major advantage in some contexts—naval vessels and passenger transport, for example—but coal was never cheap, and the long stretches between coaling stations on some of the world’s most important trade routes meant that a significant fraction of a steamship’s total tonnage had to be devoted to coal, cutting into the capacity to haul paying cargoes. For bulk cargoes over long distances, in particular, sailing ships were a good deal more economical all through the second half of the 19th century, and some runs remained a paying proposition for sail well into the 20th.
That was the niche that the windjammers of the era exploited. They were huge—up to 400 feet from stem to stern—square-rigged, steel-hulled ships, fitted out with more than an acre of canvas and miles of steel-wire rigging.  They could be crewed by a few dozen sailors, and hauled prodigious cargoes:  up to 8,000 tons of Australian grain, Chilean nitrate—or, for that matter, coal; it was among the ironies of the age that the coaling stations that allowed steamships to refuel on long voyages were very often kept stocked by tall ships, which could do the job more economically than steamships themselves could. The markets where wind could outbid steam were lucrative enough that at the beginning of the 20th century, there were still thousands of working windjammers hauling cargoes across the world’s oceans.
That didn’t change until bunker oil refined from petroleum ousted coal as the standard fuel for powered ships. Petroleum products carry much more energy per pound than even the best grade of coal, and the better grades of coal were beginning to run short and rise accordingly in price well before the heyday of the windjammers was over. A diesel-powered vessel had to refuel less often, devote less of its tonnage to fuel, and cost much less to operate than its coal-fired equivalent. That’s why Winston Churchill, as head of Britain’s Admiralty, ordered the entire British Navy converted from coal to oil in the years just before the First World War, and why coal-burning steamships became hard to find anywhere on the seven seas once the petroleum revolution took place. That’s also why most windjammers went out of use around the same time; they could compete against coal, but not against dirt-cheap diesel fuel.
Gustaf Erikson went into business as a shipowner just as that transformation was getting under way. The rush to diesel power allowed him to buy up windjammers at a fraction of their former price—his first ship, a 1,500-ton bark, cost him less than $10,000, and the pride of his fleet, the four-masted Herzogin Cecilie, set him back only $20,000.  A tight rein on operating expenses and a careful eye on which routes were profitable kept his firm solidly in the black. The bread and butter of his business came from shipping wheat from southern Australia to Europe; Erikson’s fleet and the few other windjammers still in the running would leave European ports in the northern hemisphere’s autumn and sail for Spencer Gulf on Australia’s southern coast, load up with thousands of tons of wheat, and then race each other home, arriving in the spring—a good skipper with a good crew could make the return trip in less than 100 days, hitting speeds upwards of 15 knots when the winds were right.
There was money to be made that way, but Erikson’s commitment to the windjammers wasn’t just a matter of profit. A sentimental attachment to tall ships was arguably part of the equation, but there was another factor as well. In his latter years, Erikson was fond of telling anyone who would listen that a new golden age for sailing ships was on the horizon:  sooner or later, he insisted, the world’s supply of coal and oil would run out, steam and diesel engines would become so many lumps of metal fit only for salvage, and those who still knew how to haul freight across the ocean with only the wind for power would have the seas, and the world’s cargoes, all to themselves.
Those few books that mention Erikson at all like to portray him as the last holdout of a departed age, a man born after his time. On the contrary, he was born before his time, and lived too soon. When he died in 1947, the industrial world’s first round of energy crises was still a quarter century away, and only a few lonely prophets had begun to grasp the absurdity of trying to build an enduring civilization on the ever-accelerating consumption of a finite and irreplaceable fuel supply. He had hoped that his sons would keep the windjammers running, and finish the task of getting the traditions and technology of the tall ships through the age of fossil fuels and into the hands of the seafarers of the future. I’m sorry to say that that didn’t happen; the profits to be made from modern freighters were too tempting, and once the old man was gone, his heirs sold off the windjammers and replaced them with diesel-powered craft.
Erikson’s story is worth remembering, though, and not simply because he was an early prophet of what we now call peak oil. He was also one of the very first people in our age to see past the mythology of technological progress that dominated the collective imagination of his time and ours, and glimpse the potentials of one of the core strategies this blog has been advocating for the last eight years.
We can use the example that would have been dearest to his heart, the old technology of windpowered maritime cargo transport, to explore those potentials. To begin with, it’s crucial to remember that the only thing that made tall ships obsolete as a transport technology was cheap abundant petroleum. The age of coal-powered steamships left plenty of market niches in which windjammers were economically more viable than steamers.  The difference, as already noted, was a matter of energy density—that’s the technical term for how much energy you get out of each pound of fuel; the best grades of coal have only about half the energy density of petroleum distillates, and as you go down the scale of coal grades, energy density drops steadily.  The brown coal that’s commonly used for fuel these days provides, per pound, rather less than a quarter the heat energy you get from a comparable weight of bunker oil.
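For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python of what those energy densities mean for the tonnage a ship has to give over to fuel. The voyage energy budget and the heating value for bunker oil are ballpark assumptions added for illustration, and the coal figures simply follow the ratios just cited.

```python
# Rough fuel-tonnage comparison using the energy-density ratios cited above.
# The voyage energy figure and the bunker-oil heating value are illustrative
# assumptions, not measured data.

VOYAGE_ENERGY_GJ = 40_000           # hypothetical energy budget for one long haul
BUNKER_OIL_GJ_PER_TON = 42          # ballpark heating value for bunker oil

fuels = {
    "bunker oil": BUNKER_OIL_GJ_PER_TON,
    "best coal": BUNKER_OIL_GJ_PER_TON * 0.5,    # about half, per the essay
    "brown coal": BUNKER_OIL_GJ_PER_TON * 0.22,  # under a quarter, per the essay
    "wind (sail)": None,                         # no onboard fuel at all
}

for name, gj_per_ton in fuels.items():
    if gj_per_ton is None:
        print(f"{name:12s}: 0 tons of fuel displacing cargo")
    else:
        tons = VOYAGE_ENERGY_GJ / gj_per_ton
        print(f"{name:12s}: about {tons:,.0f} tons of fuel displacing cargo")
```

The lower the grade of fuel, the more of the hull fills up with fuel instead of paying cargo, which is exactly the squeeze that kept sail competitive against coal.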
As the world’s petroleum reserves keep sliding down the remorseless curve of depletion, in turn, the price of bunker oil—like that of all other petroleum products—will continue to move raggedly upward. If Erikson’s tall ships were still in service, it’s quite possible that they would already be expanding their market share; as it is, it’s going to be a while yet before rising fuel costs will make it economical for shipping firms to start investing in the construction of a new generation of windjammers.  Nonetheless, as the price of bunker oil keeps rising, it’s eventually going to cross the line at which sail becomes the more profitable option, and when that happens, those firms that invest in tall ships will profit at the expense of their old-fashioned, oil-burning rivals.
Yes, I’m aware that this last claim flies in the face of one of the most pervasive superstitions of our time, the faith-based insistence that whatever technology we happen to use today must always and forever be better, in every sense but a purely sentimental one, than whatever technology it replaced. The fact remains that what made diesel-powered maritime transport standard across the world’s oceans was not some abstract superiority of bunker oil over wind and canvas, but the simple reality that for a  while, during the heyday of cheap abundant petroleum, diesel-powered freighters were more profitable to operate than any of the other options.  It was always a matter of economics, and as petroleum depletion tilts the playing field the other way, the economics will change accordingly.
All else being equal, if a shipping company can make larger profits moving cargoes by sailing ships than by diesel freighters, coal-burning steamships, or some other option, the sailing ships will get the business and the other options will be left to rust in port. It really is that simple. The point at which sailing vessels become economically viable, in turn, is determined partly by fuel prices and partly by the cost of building and outfitting a new generation of sailing ships. Erikson’s plan was to do an end run around the second half of that equation, by keeping a working fleet of windjammers in operation on niche routes until rising fuel prices made it profitable to expand into other markets. Since that didn’t happen, the lag time will be significantly longer, and bunker fuel may have to price itself entirely out of certain markets—causing significant disruptions to maritime trade and to national and regional economies—before it makes economic sense to start building windjammers again.
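The economics sketched in the last two paragraphs boil down to a break-even comparison, and a toy calculation makes the shape of it clear. Every figure below is hypothetical; the point is the structure of the comparison, which tips toward sail once fuel prices rise far enough.

```python
# Toy break-even comparison between a diesel freighter and a windjammer on the
# same route. Every figure is hypothetical; only the structure matters.

def voyage_profit(revenue, fuel_tons, fuel_price, other_costs):
    """Profit for one voyage: revenue minus fuel and all other operating costs."""
    return revenue - fuel_tons * fuel_price - other_costs

REVENUE = 1_000_000        # assumed freight revenue per voyage, in dollars
DIESEL_FUEL_TONS = 900     # assumed bunker fuel burned per voyage
DIESEL_OTHER = 350_000     # assumed crew, port, maintenance, and capital costs
SAIL_OTHER = 700_000       # assumed larger crew and slower voyages, but no fuel

for fuel_price in (300, 600, 900, 1_200):    # dollars per ton of bunker fuel
    diesel = voyage_profit(REVENUE, DIESEL_FUEL_TONS, fuel_price, DIESEL_OTHER)
    sail = voyage_profit(REVENUE, 0, fuel_price, SAIL_OTHER)
    better = "sail" if sail > diesel else "diesel"
    print(f"fuel at ${fuel_price}/ton: diesel ${diesel:,}, sail ${sail:,} -> {better}")
```

With these made-up numbers the crossover falls somewhere between the second and third fuel price; in the real world the crossover point also has to cover the cost of building and crewing the new windjammers, which is exactly the hurdle Erikson hoped to spare his successors.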
It’s a source of wry amusement to me that when the prospect of sail transport gets raised, even in the greenest of peak oil circles, the immediate reaction from most people is to try to find some way to smuggle engines back onto the tall ships. Here again, though, the issue that matters is economics, not our current superstitious reverence for loud metal objects. There were plenty of ships in the 19th century that combined steam engines and sails in various combinations, and plenty of ships in the early 20th century that combined diesel engines and sails the same way.  Windjammers powered by sails alone were more economical than either of these for long-range bulk transport, because engines and their fuel supplies cost money, they take up tonnage that can otherwise be used for paying cargo, and their fuel costs cut substantially into profits as well.
For that matter, I’ve speculated in posts here about the possibility that Augustin Mouchot’s solar steam engines, or something like them, could be used as a backup power source for the windjammers of the deindustrial future. It’s interesting to note that the use of renewable energy sources for shipping in Erikson’s time wasn’t limited to the motive power provided by sails; coastal freighters of the kind Erikson skippered when he was nineteen were called “onkers” in Baltic Sea slang, because their windmill-powered deck pumps made a repetitive “onk-urrr, onk-urrr” noise. Still, the same rule applies; enticing as it might be to imagine sailors on a becalmed windjammer hauling the wooden cover off a solar steam generator, expanding the folding reflector, and sending steam down belowdecks to drive a propeller, whether such a technology came into use would depend on whether the cost of buying and installing a solar steam engine, and the lost earning capacity due to hold space being taken up by the engine, was less than the profit to be made by getting to port a few days sooner.
Are there applications where engines are worth having despite their drawbacks? Of course. Unless the price of biodiesel ends up at astronomical levels, or the disruptions ahead along the curve of the Long Descent cause diesel technology to be lost entirely, tugboats will probably have diesel engines for the imaginable future, and so will naval vessels; the number of major naval battles won or lost in the days of sail because the wind blew one way or another will doubtless be on the minds of many as the age of petroleum winds down. Barring a complete collapse in technology, in turn, naval vessels will no doubt still be made of steel—once cannons started firing explosive shells instead of solid shot, wooden ships became deathtraps in naval combat—but most others won’t be; large-scale steel production requires ample supplies of coke, which is produced by roasting coal, and depletion of coal supplies in a postpetroleum future guarantees that steel will be much more expensive compared to other materials than it is today, or than it was during the heyday of the windjammers.
Note that here again, the limits to technology and resource use are far more likely to be economic than technical. In purely technical terms, a maritime nation could put much of its arable land into oil crops and use that to keep its merchant marine fueled with biodiesel. In economic terms, that’s a nonstarter, since the advantages to be gained by it are much smaller than the social and financial costs that would be imposed by the increase in costs for food, animal fodder, and all other agricultural products. In the same way, the technical ability to build an all-steel merchant fleet will likely still exist straight through the deindustrial future; what won’t exist is the ability to do so without facing prompt bankruptcy. That’s what happens when you have to live on the product of each year’s sunlight, rather than drawing down half a billion years of fossil photosynthesis:  there are hard economic limits to how much of anything you can produce, and increasing production of one thing pretty consistently requires cutting production of something else. People in today’s industrial world don’t have to think like that, but their descendants in the deindustrial world will either learn how to do so or perish.
This point deserves careful study, as it’s almost always missed by people trying to think their way through the technological consequences of the deindustrial future. One reader of mine who objected to talk about abandoned technologies in a previous post quoted with approval the claim, made on another website, that if a deindustrial society can make one gallon of biodiesel, it can make as many thousands or millions of gallons as it wants.  Technically, maybe; economically, not a chance.  It’s as though you made $500 a week and someone claimed you could buy as many bottles of $100-a-bottle scotch as you wanted; in any given week, your ability to buy expensive scotch would be limited by your need to meet other expenses such as food and rent, and some purchase plans would be out of reach even if you ignored all those other expenses and spent your entire paycheck at the liquor store. The same rule applies to societies that don’t have the windfall of fossil fuels at their disposal—and once we finish burning through the fossil fuels we can afford to extract, every human society for the rest of our species’ time on earth will be effectively described in those terms.
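Since the scotch analogy is nothing more than budget arithmetic, it can be written out in a few lines. The income and bottle price come from the analogy above; the figure for other expenses is an assumption added for illustration.

```python
# The budget arithmetic behind the scotch analogy, with made-up numbers.
weekly_income = 500
other_expenses = 380          # food, rent, and everything else (assumed)
scotch_price = 100

bottles_within_budget = (weekly_income - other_expenses) // scotch_price
bottles_ignoring_everything_else = weekly_income // scotch_price

print(f"bottles per week after other expenses: {bottles_within_budget}")
print(f"bottles per week spending the whole paycheck: {bottles_ignoring_everything_else}")
# Being able to afford one bottle does not mean you can buy as many as you want;
# the same constraint governs biodiesel in a society living on each year's sunlight.
```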
The one readily available way around the harsh economic impacts of fossil fuel depletion is the one that Gustaf Erikson tried, but did not live to complete—the strategy of keeping an older technology in use, or bringing a defunct technology back into service, while there’s still enough wealth sloshing across the decks of the industrial economy to make it relatively easy to do so.  I’ve suggested above that if his firm had kept the windjammers sailing, scraping out a living on whatever narrow market niche they could find, the rising cost of bunker oil might already have made it profitable to expand into new niches; there wouldn’t have been the additional challenge of finding the money to build new windjammers from the keel up, train crews to sail them, and get ships and crews through the learning curve that’s inevitably a part of bringing an unfamiliar technology on line.
That same principle has been central to quite a few of this blog’s projects. One small example is the encouragement I’ve tried to give to the rediscovery of the slide rule as an effective calculating device. There are still plenty of people alive today who know how to use slide rules, plenty of books that teach how to crunch numbers with a slipstick, and plenty of slide rules around. A century down the line, when slide rules will almost certainly be much more economically viable than pocket calculators, those helpful conditions might not be in place—but if people take up slide rules now for much the same reasons that Erikson kept the tall ships sailing, and make an effort to pass skills and slipsticks on to another generation, no one will have to revive or reinvent a dead technology in order to have quick accurate calculations for practical tasks such as engineering, salvage, and renewable energy technology.
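For those who have never picked up a slipstick, the principle is easy to show: a slide rule multiplies by adding lengths proportional to logarithms, and delivers answers to roughly three significant figures. The sketch below illustrates that principle only; it is not a simulation of any particular instrument.

```python
# The principle behind the slipstick: multiplication becomes addition of
# logarithms, read off to about three significant figures.
import math

def round_sig(x, sig=3):
    """Round to a given number of significant figures, roughly what a slide rule resolves."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

def slide_rule_multiply(a, b):
    """Add the logs (slide the scales), then read the product back off the log scale."""
    return round_sig(10 ** (math.log10(a) + math.log10(b)))

print(slide_rule_multiply(2.3, 4.7))    # prints 10.8 (exact answer 10.81)
print(slide_rule_multiply(6.02, 4.8))   # prints 28.9 (exact answer 28.896)
```

Three significant figures is plenty for most engineering, salvage, and renewable energy work, which is why the slipstick earned its keep for so long.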
The collection of sustainable-living skills I somewhat jocularly termed “green wizardry,” which I learned back in the heyday of the appropriate tech movement in the late 1970s and early 1980s, passed on to the readers of this blog in a series of posts a couple of years ago, and have now explored in book form as well, is another case in point. Some of that knowledge, more of the attitudes that undergirded it, and nearly all the small-scale, hands-on, basement-workshop sensibility of the movement in question has vanished from our collective consciousness in the years since the Reagan-Thatcher counterrevolution foreclosed any hope of a viable future for the industrial world. There are still enough books on appropriate tech gathering dust in used book shops, and enough in the way of living memory among those of us who were there, to make it possible to recover those things; another generation and that hope would have gone out the window.
There are plenty of other possibilities along the same lines. For that matter, it’s by no means unreasonable to plan on investing in technologies that may not be able to survive all the way through the decline and fall of the industrial age, if those technologies can help cushion the way down. Whether or not it will still be possible to manufacture PV cells at the bottom of the deindustrial dark ages, as I’ve been pointing out since the earliest days of this blog, getting them in place now on a home or local community scale is likely to pay off handsomely when grid-based electricity becomes unreliable, as it will.  The modest amounts of electricity you can expect to get from this and other renewable sources can provide critical services (for example, refrigeration and long-distance communication) that will be worth having as the Long Descent unwinds.
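To put a rough number on "modest amounts of electricity," here is a back-of-the-envelope sketch. The panel size, sun hours, losses, and appliance loads are all assumptions chosen for illustration, not a recommended system design.

```python
# Back-of-the-envelope daily energy budget for a small home PV setup.
# All figures are illustrative assumptions, not a system design.

panel_watts = 400            # assumed nameplate capacity of a small array
sun_hours_per_day = 4.0      # assumed average equivalent full-sun hours
system_efficiency = 0.75     # wiring, charge controller, battery round-trip losses

daily_wh_available = panel_watts * sun_hours_per_day * system_efficiency

loads_wh = {
    "efficient refrigeration": 600,   # assumed small, well-insulated unit
    "radio / long-distance comms": 100,
    "LED lighting": 150,
}
total_load_wh = sum(loads_wh.values())

print(f"available: {daily_wh_available:.0f} Wh/day, needed: {total_load_wh} Wh/day")
balance = daily_wh_available - total_load_wh
print(("surplus" if balance >= 0 else "deficit") + f" of {abs(balance):.0f} Wh/day")
```

Run the numbers this way and the lesson is the one the appropriate tech movement taught decades ago: a modest array will keep a few critical services going, and nothing like a standard American household's consumption.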
That said, all such strategies depend on having enough economic surplus on hand to get useful technologies in place before the darkness closes in. As things stand right now, as many of my readers will have had opportunity to notice already, that surplus is trickling away. Those of us who want to help make a contribution to the future along those lines had better get a move on.

American Delusionalism, or Why History Matters

Wed, 2014-03-19 17:32
One of the things that reliably irritates a certain fraction of this blog’s readers, as I’ve had occasion to comment before, is my habit of using history as a touchstone that can be used to test claims about the future. No matter what the context, no matter how wearily familiar the process under discussion might be, it’s a safe bet that the moment I start talking about historical parallels, somebody or other is going to pop up and insist that it really is different this time.  In a trivial sense, of course, that claim is correct. The tech stock bubble that popped in 2000, the real estate bubble that popped in 2008, and the fracking bubble that’s showing every sign of popping in the uncomfortably near future are all different from each other, and from every other bubble and bust in the history of speculative markets, all the way back to the Dutch tulip mania of 1637. It’s quite true that tech stocks aren’t tulips, and bundled loans backed up by dubious no-doc mortgages aren’t the same as bundled loans backed up by dubious shale leases—well, not exactly the same—but in practice, the many differences of detail are irrelevant compared to the one crucial identity.  Tulips, tech stocks, and bundled loans, along with South Sea Company shares in 1720, investment trusts in 1929, and all the other speculative vehicles in all the other speculative bubbles of the last five centuries, different as they are, all follow the identical trajectory:  up with the rocket, down with the stick.
That is to say, those who insist that it’s different this time are right where it doesn’t matter and wrong where it counts. I’ve come to think of the words “it’s different this time,” in fact, as the nearest thing history has to the warning siren and flashing red light that tells you that something is about to go very, very wrong. When people start saying it, especially when plenty of people with plenty of access to the media start saying it, it’s time to dive for the floor, cover your head with your arms, and wait for the blast to hit.
With that in mind, I’d like to talk a bit about the recent media flurry around the phrase “American exceptionalism,” which has become something of a shibboleth among pseudoconservative talking heads in recent months. Pseudoconservatives? Well, yes; actual conservatives, motivated by the long and by no means undistinguished tradition of conservative thinking launched by Edmund Burke in the late 18th century, are interested in, ahem, conserving things, and conservatives who actually conserve are about as rare these days as liberals who actually liberate. Certainly you won’t find many of either among the strident voices insisting just now that the last scraps of America’s democracy at home and reputation abroad ought to be sacrificed in the service of their squeaky-voiced machismo.
As far as I know, the phrase “American exceptionalism” was originally coined by none other than Josef Stalin—evidence, if any more were needed, that American pseudoconservatives these days, having no ideas of their own, have simply borrowed those of their erstwhile Communist bogeyman and stood them on their heads with a Miltonic “Evil, be thou my good.”  Stalin meant by it the opinion of many Communists in his time that the United States, unlike the industrial nations of Europe, wasn’t yet ripe for the triumphant proletarian revolution predicted (inaccurately) by Marx’s secular theology. Devout Marxist that he was, Stalin rejected this claim with some heat, denouncing it in so many words as “this heresy of American exceptionalism,” and insisting (also inaccurately) that America would get its proletarian revolution on schedule. 
While Stalin may have invented the phrase, the perception that he thus labeled had considerably older roots. In a previous time, though, that perception took a rather different tone than it does today. A great many of the leaders and thinkers of the United States in its early years, and no small number of the foreign observers who watched the American experiment in those days, thought and hoped that the newly founded republic might be able to avoid making the familiar mistakes that had brought so much misery onto the empires of the Old World. Later on, during and immediately after the great debates over American empire at the end of the 19th century, a great many Americans and foreign observers still thought and hoped that the republic might come to its senses in time and back away from the same mistakes that doomed those Old World empires to the misery just mentioned. These days, by contrast, the phrase “American exceptionalism” seems to stand for the conviction that America can and should make every one of those same mistakes, right down to the fine details, and will still somehow be spared the logically inevitable consequences.
The current blind faith in American exceptionalism, in other words, is simply another way of saying “it’s different this time.”  Those who insist that God is on America’s side when America isn’t exactly returning the favor, like those who have less blatantly theological reasons for their belief that this nation’s excrement emits no noticeable odor, are for all practical purposes demanding that America must not, under any circumstances, draw any benefit from the painfully learnt lessons of history.  I suggest that a better name for the belief in question might be "American delusionalism;" it’s hard to see how this bizarre act of faith can do anything other than help drive the American experiment toward a miserable end, but then that’s just one more irony in the fire.
The same conviction that the past has nothing to teach the present is just as common elsewhere in contemporary culture. I’m thinking here, among other things, of the ongoing drumbeat of claims that our species will inevitably be extinct by 2030. As I noted in a previous post here, this is yet another expression of the same dubious logic that generated the 2012 delusion, but much of the rhetoric that surrounds it starts from the insistence that nothing like the current round of greenhouse gas-driven climate change has ever happened before.
That insistence bespeaks an embarrassing lack of knowledge about paleoclimatology. Vast quantities of greenhouse gases being dumped into the atmosphere over a century or two? Check; the usual culprit is vulcanism, specifically the kind of flood-basalt eruption that opens a crack in the earth many miles in length and turns an area the size of a European nation into a lake of lava. The most recent of those, a smallish one, happened about 6 million years ago in the Columbia River basin of eastern Washington and Oregon states.  Further back, in the Aptian, Toarcian, and Cenomanian-Turonian epochs of the Mesozoic, that same process on a much larger scale boosted atmospheric CO2 levels to three times the present figure and triggered what paleoclimatologists call "super-greenhouse events." Did those cause the extinction of all life on earth? Not hardly; as far as the paleontological evidence shows, it didn’t even slow the brontosaurs down.
Oceanic acidification leading to the collapse of calcium-shelled plankton populations? Check; those three super-greenhouse events, along with a great many less drastic climate spikes, did that. The ocean also contains very large numbers of single-celled organisms that don’t have calcium shells, such as blue-green algae, which aren’t particularly sensitive to shifts in the pH level of seawater; when such shifts happen, these other organisms expand to fill the empty niches, and everybody further up the food chain gets used to a change in diet. When the acidification goes away, whatever species of calcium-shelled plankton have managed to survive elbow their way back into their former niches and undergo a burst of evolutionary radiation; this makes life easy for geologists today, who can figure out the age of any rock laid down in an ancient ocean by checking the remains of foraminifers and other calcium-loving plankton against a chart of what existed when.
Sudden climate change recently enough to be experienced by human beings? Check; most people have heard of the end of the last ice age, though you have to read the technical literature or one of a very few popular treatments to get some idea of just how drastically the climate changed, or how fast.  The old saw about a slow, gradual warming over millennia got chucked into the dumpster decades ago, when ice cores from Greenland upset that particular theory. The ratio between different isotopes of oxygen in the ice laid down in different years provides a sensitive measure of the average global temperature at sea level during those same years. According to that measure, at the end of the Younger Dryas period about 11,800 years ago, global temperatures shot up by 20° F. in less than a decade.
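For the curious, the logic of that isotope thermometer can be sketched in a few lines. The linear calibration below is a deliberately simplified stand-in; the slope and baseline are illustrative assumptions, and actual paleothermometry involves a good deal more than this.

```python
# Toy version of the oxygen-isotope thermometer described above. It assumes a
# simple linear relation between d18O and local temperature; the slope and
# baseline are illustrative stand-ins, not a published calibration.

SLOPE_PERMIL_PER_DEGC = 0.67   # assumed: change in d18O (per mil) per degree C
BASELINE_D18O = -35.0          # assumed reference value for recent ice, per mil

def implied_warming_degC(d18o_permil):
    """Temperature difference from the baseline implied by a d18O reading."""
    return (d18o_permil - BASELINE_D18O) / SLOPE_PERMIL_PER_DEGC

# Under this toy calibration, a jump of about 7 per mil between closely spaced
# ice layers would imply roughly 10 degrees C (about 19 F) of local warming.
print(f"{implied_warming_degC(-28.0):.1f} C warmer than the baseline")
```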
Now of course that didn’t mean that temperatures shot up that far evenly, all over the world.  What seems to have happened is that the tropics barely warmed at all, the southern end of the planet warmed mildly, and the northern end experienced a drastic heat wave that tipped the great continental ice sheets of the era into rapid collapse and sent sea levels soaring upwards. Those of my readers who have been paying attention to recent scientific publications about Greenland and the Arctic Ocean now have very good reason to worry, because the current round of climate change has most strongly affected the northern end of the planet, too, and scientists have begun to notice historically unprecedented changes in the Greenland ice cap. In an upcoming post I plan on discussing at some length what those particular historical parallels promise for our future, and it’s not pretty.
Oh, and the aftermath of the post-Younger Dryas temperature spike was a period several thousand years long when global temperatures were considerably higher than they are today. The Holocene Hypsithermal, as it’s called, saw global temperatures peak around 7°F. higher than they are today—about the level, that is, that’s already baked into the cake as a result of anthropogenic emissions of greenhouse gases.  It was not a particularly pleasant time. Most of western North America was desert, baked to a crackly crunch by drought conditions that make today’s dry years look soggy; much of what’s now, at least in theory, the eastern woodland biome was dryland prairie, while both coasts got rapidly rising seas with a side order of frequent big tsunamis—again, we’ll talk about those in the upcoming post just mentioned. Still, you’ll notice that our species survived the experience.
As those droughts and tsunamis might suggest, the lessons taught by history don’t necessarily amount to "everything will be just fine." The weird inability of the contemporary imagination to find any middle ground between business as usual and sudden total annihilation has its usual effect here, hiding the actual risks of anthropogenic climate change behind a facade of apocalyptic fantasies. Here again, the question "what happened the last time this occurred?" is the most accessible way to avoid that trap, and the insistence that it’s different this time and the evidence of the past can’t be applied to the present and future puts that safeguard out of reach.
For a third example, consider the latest round of claims that a sudden financial collapse driven by current debt loads will crash the global economy once and for all. That sudden collapse has been predicted year after weary year for decades now—do any of my readers, I wonder, remember Dr. Ravi Batra’s The Great Depression of 1990?—and its repeated failure to show up and perform as predicted seems only to strengthen the conviction on the part of believers that this year, like some financial equivalent of the Great Pumpkin, the long-delayed crash will finally put in its appearance and bring the global economy crashing down.
I’m far from sure that they’re right about the imminence of a crash; the economy of high finance these days is so heavily manipulated, and so thoroughly detached from the real economy where real goods and services have to be produced using real energy and resources, that it’s occurred to me more than once that the stock market and the other organs of the financial sphere might keep chugging away in a state of blissful disconnection from the rest of existence for a very long time to come. Still, let’s grant for the moment that the absurd buildup of unpayable debt in the United States and other industrial nations will in fact become the driving force behind a credit collapse, in which drastic deleveraging will erase trillions of dollars in notional wealth. Would such a crash succeed, as a great many people are claiming just now, in bringing the global economy to a sudden and permanent stop?
Here again, the lessons of history provide a clear and straightforward answer to that question, and it’s not one that supports the partisans of the fast-crash theory. Massive credit collapses that erase very large sums of notional wealth and impact the global economy are hardly a new phenomenon, after all. One example—the credit collapse of 1930-1932—is still just within living memory; the financial crises of 1873 and 1893 are well documented, and there are dozens of other examples of nations and whole continents hammered by credit collapses and other forms of drastic economic crisis. Those crises have had plenty of consequences, but one thing that has never happened as a result of any of them is the sort of self-feeding, irrevocable plunge into the abyss that current fast-crash theories require.
The reason for this is that credit is merely one way by which a society manages the distribution of goods and services. That’s all it is. Energy, raw materials, and labor are the factors that have to be present in order to produce goods and services.  Credit simply regulates who gets how much of each of these things, and there have been plenty of societies that have handled that same task without making use of a credit system at all. A credit collapse, in turn, doesn’t make the energy, raw materials, and labor vanish into some fiscal equivalent of a black hole; they’re all still there, in whatever quantities they were before the credit collapse, and all that’s needed is some new way to allocate them to the production of goods and services.
This, in turn, governments promptly provide. In 1933, for example, faced with the most severe credit collapse in American history, Franklin Roosevelt temporarily nationalized the entire US banking system, seized nearly all the privately held gold in the country, unilaterally changed the national debt from "payable in gold" to "payable in Federal Reserve notes" (which amounted to a technical default), and launched a flurry of other emergency measures.  The credit collapse came to a screeching halt, famously, in less than a hundred days. Other nations facing the same crisis took equally drastic measures, with similar results. While that history has apparently been forgotten across large sections of the peak oil blogosphere, it’s a safe bet that none of it has been forgotten in the corridors of power in Washington DC and elsewhere in the world.
More generally, governments have an extremely broad range of powers that can be used, and have been used, in extreme financial emergencies to stop a credit or currency collapse from terminating the real economy. Faced with a severe crisis, governments can slap on wage and price controls, freeze currency exchanges, impose rationing, raise trade barriers, default on their debts, nationalize whole industries, issue new currencies, allocate goods and services by fiat, and impose martial law to make sure the new economic rules are followed to the letter, if necessary, at gunpoint. Again, these aren’t theoretical possibilities; every one of them has actually been used by more than one government faced by a major economic crisis in the last century and a half. Given that track record, it requires a breathtaking leap of faith to assume that if the next round of deleveraging spirals out of control, politicians around the world will simply sit on their hands, saying "Whatever shall we do?" in plaintive voices, while civilization crashes to ruin around them.
What makes that leap of faith all the more curious is that in the runup to the economic crisis of 2008-9, the same claims of imminent, unstoppable financial apocalypse we’re hearing today were being made—in some cases, by the same people who are making them today.  (I treasure a comment I fielded from a popular peak oil blogger at the height of the 2009 crisis, who insisted that the fast crash was upon us and that my predictions about the future were therefore all wrong.) Their logic was flawed then, and it’s just as flawed now, because it dismisses the lessons of history as irrelevant and therefore fails to take into account how the events under discussion play out in the real world.
That’s the problem with the insistence that this time it really is different: it disables the most effective protection we’ve got against the habit of thought that cognitive psychologists call "confirmation bias," the tendency to look for evidence that supports one’s pet theory rather than seeking the evidence that might call it into question. The scientific method itself, in the final analysis, is simply a collection of useful gimmicks that help you sidestep confirmation bias.  That’s why competent scientists, when they come up with a hypothesis to explain something in nature, promptly sit down and try to think up as many ways as possible to disprove the hypothesis.  Those potentials for disproof are the raw materials from which experiments are designed, and only if the hypothesis survives all experimental attempts to disprove it does it take its first step toward scientific respectability.
It’s not exactly easy to run controlled double-blind experiments on entire societies, but historical comparison offers the same sort of counterweight to confirmation bias. Any present or future set of events, however unique it may be in terms of the fine details, has points of similarity with events in the past, and those points of similarity allow the past events to be taken as a guide to the present and future. This works best if you’ve got a series of past events, as different from each other as any one of them is from the present or future situation you’re trying to predict; if you can find common patterns in the whole range of past parallels, it’s usually a safe bet that the same pattern will recur again.
Any time you approach a present or future event, then, you have two choices: you can look for the features that event has in common with other events, despite the differences of detail, or you can focus on the differences and ignore the common features.  The first of those choices, it’s worth noting, allows you to consider both the similarities and the differences.  Once you’ve got the common pattern, it then becomes possible to modify it as needed to take into account the special characteristics of the situation you’re trying to understand or predict: to notice, for example, that the dark age that will follow our civilization will have to contend with nuclear and chemical pollution on top of the more ordinary consequences of decline and fall.
If you start from the assumption that the event you’re trying to predict is unlike anything that’s ever happened before, though, you’ve thrown out your chance of perceiving the common pattern. What happens instead, with monotonous regularity, is that pop-culture narratives such as the sudden overnight collapse beloved of Hollywood screenplay writers smuggle themselves into the picture, and cement themselves in place with the help of confirmation bias. The result is the endless recycling of repeatedly failed predictions that plays so central a role in the collective imagination of our time, and has helped so many people blind themselves to the unwelcome future closing in on us.

The Crocodiles of Reality

Wed, 2014-03-12 20:37
I've suggested in several previous posts that the peak oil debate may be approaching a turning point—one of those shifts in the collective conversation in which topics that have been shut out for years or decades finally succeed in crashing the party, and other topics that have gotten more than their quota of attention during that time get put out to pasture or sent to the glue factory.  I’d like to talk for a moment about some of the reasons I think that’s about to happen, and in the process, give a name to one of the common but generally unmentionable features of contemporary economic life.
We can begin with the fracking bubble, that misbegotten brat fathered by Wall Street’s love of Ponzi schemes on Main Street’s stark terror of facing up to the end of the age of cheap abundant energy. That bubble has at least two significant functions in today’s world. The first function, as discussed in these essays already, is to fill an otherwise vacant niche in the string of giddy speculative delusions that began with the stock market boom and bust of 1987 and is still going strong today. As with previous examples, the promoters of the fracking bubble dangled the prospect of what used to be normal returns on investment in front of the eager and clueless investors with which America seems to be so richly stocked these days.  These then leapt at the bait, and handed their money over to the tender mercies of the same Wall Street investment firms who gave us Pets.com and zero-doc mortgages.
You might think, dear reader, that after a quarter century of this, there might be a shortage of chumps willing to fall for such schemes. Whatever else might be depleting, though, the supply of lambs eager to be led to that particular slaughter seems to be keeping up handily with the demand. We live in what will doubtless be remembered as the Golden Age of financial fraud, an era of stunning fiscal idiocy in which even the most blatant swindles can count on drawing a crowd of suckers begging to have their money taken from them. Millennia from now, the grifters, con men, and bunco artists of civilizations yet unborn will look back in awe at our time, and wish that they, too, might be fortunate enough to live in an era when tens of millions of investors passionately wanted to believe that the laws of economics, thermodynamics, and plain common sense must surely be suspended for their benefit.
To some extent, in other words, the fracking bubble is simply one more reminder that Ben Franklin’s adage about a fool and his money has not lost any of its relevance since the old rascal slipped it into the pages of Poor Richard’s Almanac. Still, there’s more going on here than the ruthless fleecing of the unwary that’s the lifeblood of every healthy market economy. The fracking bubble, as most of my readers will be well aware, has not only served as an excuse for ordinary speculative larceny; it’s also provided a very large number of people with an excuse to scrunch up their eyes, stuff their fingers in their ears, shout "La, la, la, I can’t hear you," and thus keep clinging to the absurd faith that limitless resources really can be extracted from a finite planet.
For the last three or four years, accordingly, the fracking bubble has been the most common item brandished by practitioners of peak oil denial as evidence that petroleum production can too keep on increasing forever, so there!  The very modest additions to global petroleum production that resulted from hydrofracturing shales in North Dakota and Texas got talked up into an imaginary tidal wave of crude oil that would supposedly sweep all before it, and not incidentally restore the United States to its long-vanished status as the world’s premier oil producer. All that made good copy for the bunco artists mentioned earlier, to be sure, but it also fed into the futile attempts at denial that have taken the place of a sane energy policy in most industrial societies.
The problem with this fond fantasy is that the numbers don’t even begin to add up. The latest figures, neatly summarized by Ron Patterson in a recent post, show just how bad the situation has become. Each year, on average, the oil industry has had to increase its investments by 10% over the previous year to get the same amount of oil out of the ground.  Even $100-a-barrel oil prices won’t support that kind of soaring overhead cost for long, and the problem has been made worse by the belated discovery that many of the shale beds ballyhooed in recent years don’t have anything like as much oil as their promoters claimed.  As a result, oil companies around the world are cutting back on capital investment and selling off assets. That’s not the behavior of an industry poised on the brink of a new age of abundance; it’s the behavior of an industry that has just slammed face first into hard supply limits and is backing away groggily from the impact site while trying to stanch the bleeding from deep fiscal cuts.
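That 10%-a-year figure compounds faster than intuition suggests; a few lines of arithmetic make the point, starting from an arbitrary index value rather than any company's actual spending.

```python
# Compounding effect of a 10% yearly rise in the investment needed to get
# the same amount of oil out of the ground. The starting figure is an
# arbitrary index, not a real spending number.

spending_index = 100.0
for year in range(1, 11):
    spending_index *= 1.10
    print(f"year {year:2d}: spending index {spending_index:6.1f} for the same output")

# After a decade the industry is paying roughly 2.6 times as much for the
# same barrels -- the treadmill described in the paragraph above.
```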
As a result, with mathematical certainty, a great many overpriced assets are going to lose most of their paper value in the years ahead of us, a great many businesses that have made their money providing goods and services to the drilling industry are going to downsize sharply or simply go bankrupt, a great many wells that can’t make money even at exorbitant oil prices are going to be shut in or go undrilled in the first place, and a very, very great many people who convinced themselves that they were going to get rich by investing in fracking are going to end up poor. It’s not going to be pretty.  Exactly what effect this is going to have on the price of oil is an interesting question; my guess, though it’s only a guess, is that a couple of years from now the price of oil will spike, possibly to the $250-$300 a barrel range, then crash to $60 a barrel, and slowly recover to $175 or so over a period of several years.
This has a great deal of relevance to the project of this blog.  The last time petroleum production failed to keep pace with potential demand, and the price of oil spiked accordingly, peak oil came in from the fringes and got discussed publicly in the pages of newspapers of record.  That window of opportunity gaped open from 2004 to 2010, roughly speaking, and during that period a great deal got accomplished. That was when peak oil stopped being a concern of the furthest fringe and found an audience in many corners of contemporary alternative culture, when local groups—some under the Transition Town banner, others outside it—began to organize around the imminence of peak oil, and when books on resource depletion and its consequences found a market for the first time since the early 1980s.
Those are significant gains. It’s true, of course, that these achievements didn’t make peak oil go away, or find some gimmick that will keep the lifestyles of the industrial world’s more privileged inmates rolling merrily along for the foreseeable future. What sometimes gets forgotten is that neither of those things was ever possible in the first place. The hard facts of our predicament have not changed a bit:  the age of cheap abundant energy is ending; the economic systems, social structures, and lifestyle habits that were made possible by that temporary condition are accordingly going away, and nothing anyone can do will bring them back again, not now, not ever. 
It’s worth being precise here:  for the rest of the time our species endures, we will have to deal with much more sharply constrained energy supplies than we’ve had handy over the last few centuries. That doesn’t mean that our descendants will be condemned to huddle in caves until the jaws of extinction close around them; I’ve argued at quite some length in one of my books that the endpoint of the mess we’re currently in, centuries from now, will most likely be the emergence of ecotechnic societies—societies that maintain relatively high technology on the modest energy and resource inputs that can be provided by renewable sources. I’ve suggested, there and elsewhere, that there’s quite a bit that can be done here and now to lay the foundations for the ecotechnic societies of the far future. I’ve also tried to point out that there’s quite a bit that can be done here and now to make the unraveling of the age of abundance less traumatic than it will otherwise be.
To my mind, those are worthwhile goals. What makes them difficult is simply that any meaningful attempt to pursue them has to start by accepting that the age of cheap abundant energy is ending, that the lifestyles that age made possible are ending with it, and that wasting all those fossil fuels on what amounts to a drunken binge three centuries long might not have been a very smart idea in the first place. Any one of those would be a bitter pill to take; all three of them together are far more than most people nowadays are willing to swallow, and so it’s not surprising that so much effort over the last few decades has gone into pretending that the squalid excesses of contemporary culture can somehow keep rolling along in the teeth of all the evidence to the contrary.
The frantic attempts to sustain the unsustainable driven by this pretense have done much to make the present day such a halcyon time for swindles of every description. Not all of those, however, have taken aim at the wallets of what we might as well call the lumpen-investmentariat, that class of people who have money to invest and not a clue in their heads that Wall Street might not have their best interests at heart. Some of the most colorful flops of recent years have instead attracted money from a different though equally gullible source: government subsidies for new energy technologies.
Those of my readers who were part of the peak oil scene a decade ago, for example, may remember the days when ethanol made from American corn was going to save us all.  Many of the same claims more recently deployed to inflate the fracking bubble were used to justify what was described, at the time, as America’s burgeoning new ethanol industry, but the target for these exercises was somewhat different. A certain amount of investment money from the clueless did find its way into the hands of ethanol-plant promoters, to be sure, but the financial core of the new industry was a flurry of federal mandates and federal and state subsidies, which in theory existed to lead America to a bright new energy future, and in practice existed to convince the voters that politicians really were doing something about gasoline prices that had just risen to the unheard-of level of $2 a gallon.
You won’t hear much about America’s burgeoning new ethanol industry these days. A substantial fraction of the ethanol plants that were subsidized by governments and lavishly praised by politicians a decade ago are bankrupt and shuttered today, having failed to turn a profit or, in some cases, cover the costs of construction.  The critics who pointed out that the burgeoning new industry made no economic sense, and that making ethanol from corn uses more energy than you get from burning the ethanol, turned out to be dead right, and the critics who dismissed them as naysayers turned out to be dead wrong. Still, the ethanol plants accomplished the same two functions the fracking bubble did later: they sucked a great deal of money into the hands of their promoters, and they helped everyone else pretend for a while that the end of the age of cheap abundant energy wasn’t going to happen after all.
It’s hardly the only example of the phenomenon.  Since I don’t want green-energy proponents to feel unduly picked on, let’s turn to the other side of the energy picture and take a look at nuclear fusion. Since the 1950s, a sizeable body of nuclear physicists have kept themselves gainfully employed and their laboratories stocked and staffed by proclaiming nuclear fusion as the wave of the future. In just another twenty years, we’ve repeatedly been told, clean, safe nuclear fusion plants will be churning out endless supplies of energy, if only the government subsidies keep pouring in. After sixty years of unbroken failure, even politicians are starting to have second thoughts, but the fusion-power industry keeps at it, pursuing a project that, as respected science writer Charles Seife pointed out trenchantly in Sun In A Bottle: The Strange History of Fusion and the Science of Wishful Thinking, has more in common with the quest for perpetual motion than its overeager fans like to think.
Every few years the media carries yet another enthusiastic announcement that some new breakthrough has happened in the quest for fusion power. Now of course it’s worth noting that none of these widely ballyhooed breakthroughs ever amount to a working fusion reactor capable of putting power into the grid, but let’s let that pass for now, because the point I want to make is a different one. As I pointed out in a post here last year, the question that matters about fusion is not whether fusion power is technically feasible, but whether it’s economically viable. That’s not a question anyone in the fusion research industry wants to discuss, and there are good reasons for that.
The ITER project in Europe offers a glimpse at the answer.  ITER is the most complex and also the most expensive machine ever built by human beings—the latest estimate of the total cost has recently gone up from $14 billion to $17 billion, and if past performance is anything to go by, it will have gone up a good deal more before the scheduled completion in 2020.  That stratospheric price tag results from the simple fact that six decades of hard work by physicists around the world, exploring scores of different approaches to fusion, have shown that any less expensive approach won’t produce a sustained fusion reaction. While commercial fusion reactors would doubtless cost less than ITER, it’s already clear that they won’t cost enough less to make fusion power economically viable. Even if ITER succeeds in creating its "sun in a bottle," in other words, the result will be an expensive laboratory curiosity, not a solution to the world’s energy needs.
My more attentive readers will doubtless have noticed that the flaw in the current round of glowing prophecies of a future powered by fusion plants is the same as the flaw in the equally glowing sales pitches for corn-based ethanol fuel plants a decade ago. Turning corn into ethanol, and using the ethanol to fuel cars and trucks, is technically feasible; it just doesn’t happen to be economically viable. In the same way, whether fusion power is technically feasible or not may still be up in the air, but the question of its economic viability is not. The gap between technical capacity and economic reality provides the ecological niche in which both these projects make their home, and a great many other alleged solutions to the energy crisis of our time inhabit that same niche.
I’d like to suggest that it’s high time to put a name to the technological fauna that fill this role in our social ecology, and I even have a name to propose.  I think we should call them "subsidy dumpsters."
A subsidy dumpster, if I may venture on a definition, is an energy technology that looks like a viable option so long as nobody pays attention to the economic realities. Because it’s technically feasible, or at least hasn’t yet been proven to be unfeasible, promoters can brandish enthusiastic estimates of how much energy it will yield if only the government provides adequate funding, and point to laboratory tests of technical feasibility as evidence that so tempting a bait is within reach. The promoters of such schemes can rely on the foam-flecked ravings of economists, who have proven to be so stunningly clueless about energy in recent years, and they can also count on one of the pervasive blind spots in modern thinking: the almost visceral inability of most people these days to think in terms of whole systems. Armed with these advantages, they descend upon politicians, and if energy costs are an irritation to the public—and these days, energy costs are always an irritation to the public—the politicians duly cough up a subsidy so they can claim to be doing something about the energy problem.
Once the subsidy dumpster gets its funding, it goes through however many twists and turns its promoters can manage before economic realities take their inevitable toll. If the dumpster in question has to compete in the marketplace, as fuel ethanol plants did, the normal result is a series of messy bankruptcies as soon as the government money runs short. If it can be shielded from the market, preferably by always being almost ready for commercial deployment but never actually quite getting there—the fusion-research industry has this one down pat, though it’s fair to say that the laws of nature seem to be giving them a great deal of help—the dumpster can keep on being filled with subsidies for as long as the prospect of an imminent breakthrough can be dangled in front of politicians and the public. Since most people these days consistently mistake technical feasibility for economic viability, there’s no shortage of easy marks for this sort of sales pitch.
There are plenty of subsidy dumpsters in the energy field just now. What makes this all the more unfortunate is that quite a few of them are based on technologies that could be used in less self-defeating ways. Solar power, to name only one example, could make a huge dent in America’s energy needs if the available resources went into proven technologies such as solar water heaters; once that sensible approach is replaced by attempts to claim that we can keep the grid powered by paving some substantial fraction of Nevada with solar PV cells, though, we’re in subsidy-dumpster territory, as a recent study of Spain’s much-lauded solar program has shown. Renewable energy is a viable option so long as its sharp limits of concentration and intermittency are kept in mind; ignore those, and pretend that we can keep on living today’s extravagant lifestyles on a basis that won’t support them, and you’ve got a perfect recipe for a subsidy dumpster.
Now it’s only fair to point out that the energy issue is far from the only dimension of modern life that attracts subsidy dumpsters. Name a current crisis here in America—joblessness, urban blight, decaying infrastructure, and the list goes on—and there are plenty of subsidy dumpsters to be found, some empty and rusting like yesterday’s ethanol plants, some soaking up government funds like the ITER project, and many more that are still only a twinkle in the eyes of their eager promoters. Still, I’d like to suggest that subsidy dumpsters in the energy field have a particular importance just now.
The end of the age of cheap abundant energy requires that we stop using anything like as much energy as we’ve been using in recent decades. Any approach to dealing with the crisis of our age that doesn’t start by using much less energy, in other words, simply isn’t serious. The parade of subsidy dumpsters being hawked to politicians these days is merely one more attempt to refuse to take our predicament as seriously as it deserves, and thus serves mostly as a way to make that predicament even worse than it has to be. By and large, to borrow a neatly Pharaonic turn of phrase from one of my longtime readers—tip of the archdruidical hat to Robin Datta—that’s the trouble with spending all your time splashing around in the waters of denial; all that happens, in the final analysis, is that you attract the attention of the crocodiles of reality.

*  *  *
In not unrelated news, I'm pleased to report that my latest book on peak oil and the future of industrial society, Decline and Fall: The End of Empire and the Future of Democracy in 21st Century America, is now in print. Those of my readers who have preordered copies will have them soon; those who haven't...well, what's keeping you? ;-)

The Steampunk Future Revisited

Wed, 2014-03-05 18:54
One of the things I’ve noticed repeatedly, over the nearly eight years I’ve been writing this blog, is that I’m the last person to ask which of these weekly essays is most likely to find an audience or hit a nerve. Posts I think will be met with a shrug of the shoulders stir up a storm of protest, while those I expect to be controversial get calm approval instead. Nor do I find it any easier to guess which posts will have readers once the next week rolls around and a new essay goes up.

My favorite example just now, not least because it’s so close to the far end of the improbability curve, is a post that appeared here back in 2011, discussing Hermann Hesse's novel The Glass Bead Game as a work of deindustrial science fiction. If ever a post of mine seemed destined for oblivion, that was it; next to nobody reads Hesse nowadays, and even in the days when every other college student had a battered paperback copy of Siddhartha or Steppenwolf on hand, not that many people wrestled with the ironic ambiguities of Hesse's last and longest novel. More than three years after that post appeared, though, the site stats here at Blogger show me that there are still people reading it most evenings. Has it gotten onto the recommended-reading list of the League of Journeyers to the East, the mysterious fellowship that features in several Hesse stories? If so, nobody's yet given me the secret handshake.

There are other posts of mine that have gone on to have that sort of persistent afterlife. What interests me just now, though, is that one of my recent posts appears to be doing the same: the essay I posted just a month ago proposing the steampunk subculture as a potential model for future technology on the far side of the Long Descent. While steampunk isn't anything like as obscure as The Glass Bead Game, it's not exactly a massive cultural presence, either, and it interests me that a month after the post appeared, it's still getting read and discussed.
Courtesy of one of my regular readers, it's also appeared in an Australian newsletter for fans of penny farthing bicycles. Those of my readers who don't speak bicyclese may want to know that those are the old-fashioned cycles with a big wheel in front and a small one in back; the old British penny was about the size of a US quarter, the farthing about the size of a US dime, and if you put the two coins side by side you have a pretty fair image of the bicycle in question. I wasn't aware that anyone had revived the penny farthing cycle, and I was glad to hear it: they're much simpler than today's bicycles, requiring neither gears nor chains, and many penny farthing riders these days simply build their own cycles—a capacity well worth learning and preserving.
Mind you, there were plenty of people who took issue with the post, and I want to talk about some of those objections here, because they cast a useful light on the blind spots of the imagination I've been exploring in recent posts. My favorite example is the commenter who insisted with some heat that an advanced technology couldn't be based on the mechanical and pneumatic systems of the Victorian era. As an example, he pointed out that without electronics, there was no way to build an FMRI machine—that's "functional magnetic resonance imaging" for those of my readers who don't speak medicalese, one of the latest pieces of high-priced medical hardware currently bankrupting patients and their families across America.
He's quite correct, of course, but his choice of an example says much more about the limitations of his thinking than it does about anything else. Of course a steampunk-style technology wouldn't produce FMRI machines, or for that matter most of the electronic gimmickry that fills contemporary life in the industrial world, from video games to weather radar.  It would take advantage of the very different possibilities inherent in mechanical and pneumatic technology to do different things. It's only from within the tunnel vision of contemporary culture that it seems as though the only conceivable kind of advanced technology is the kind that happens to produce FMRI machines, video games, and weather radar.  An inhabitant of some alternate world where the petroleum and electronics revolutions never got around to happening, and something like steampunk technology became standard, could insist with equal force that a technology couldn't possibly be called advanced unless it featured funicular-morphoteny machines and photodyne nebulometers.
The same sort of thinking expressed in a slightly different way drove the claim, which appeared repeatedly in the comments page here as well as elsewhere, that a neo-Victorian technology by definition meant Victorian customs such as child labor.  A very large number of people in the contemporary industrial world, that is, can't imagine a future that isn't either just like the present or just like some corner of the past. It should be obvious that a technology using mechanical, hydraulic and pneumatic power transfer can be applied to the needs of many different cultural forms, not merely those that were common in one corner of the late 19th century world. That this is far from obvious shows just how rigidly limited our imagination of the future has become.
That would be a serious difficulty even if we weren't picking up speed down the bumpy slope that leads toward the deindustrial dark ages of the not so distant future.  Given that that's where we are just now, it could very well turn into a fruitful source of disasters.  The economic arrangements that make it possible to build, maintain, and use FMRI machines in American hospitals are already coming apart around us; so are the equivalent arrangements that prop up most other advanced technological systems in today's industrial world.  In the absence of those arrangements, a good many simpler technological systems could be put in their places and used to take up some of the slack.  If enough of us are convinced that without FMRI machines we might as well just bring on the blood-sucking leeches, though, those steps will not be taken.
With this in mind, I want to circle back around to the neo-Victorian technology imagined by steampunk aficionados, and look at it from another angle.
It's not often remembered that paved roads of the modern type were not originally put there for automobiles.  In America, and I believe in other countries as well, the first generation of what were called "Macadamized" roads—the kind with a smooth surface rather than bare bricks or cobblestones—was built in response to lobbying by bicyclists. Here in the United States, the lobbying organization was the League of American Wheelmen. (There were plenty of wheelwomen as well, but the masculine gender still had collective force in the English of that time.) Their advocacy had a recreational side, but there was more to it than that.  A few people—among them the redoubtable William Stanley Jevons—were already pointing out in the 19th century that exponential growth in coal consumption could not be maintained forever; a great many more had begun to work out the practical implications of the soaring population of big cities in America and elsewhere, in terms of such homely but real problems as the disposal of horse manure, and these concerns fed into the emergence of the bicycle as the hot new personal transport technology of the age.
Similar concerns guided the career of a figure who has appeared in these essays more than once already, the brilliant French inventor Augustin Mouchot.  Noting that his native country had very limited coal reserves, and colonial possessions in North Africa with vast amounts of sunlight on offer, Mouchot devoted two decades of pioneering work to harnessing solar energy. His initial efforts focused on solar cookers, stills, and water pumps, and his success with these encouraged him to tackle a challenge no previous inventor had managed: a working solar steam engine. His first successful model was tested in 1866, and the Paris Exhibition of 1878 featured his masterpiece, a huge engine with a sun-tracking conical reflector focusing sunlight on tubes of blackened copper; the solar engine pumped water, cooked food, distilled first-rate brandy, and ran a refrigerator. A similar model exhibited in Paris in 1880 ran a steam-driven printing press, which obligingly turned out 500 copies of Le Journal Solaire.
Two other technologies I've discussed repeatedly in these essays came out of the same era. The first commercial solar water heater hit the market in 1891 and very quickly became a common sight over much of the United States; the colder regions used them in the summertime, the Sun Belt year round, in either case with very substantial savings in energy costs.  The fireless cooker or haybox was another successful and widely adopted technology of the age:  a box full of insulation with a well in the center for a cooking pot, it was the slow cooker of its time, but without the electrical cord.  Bring food to a boil on the stove and then pop the pot into the fireless cooker, and it finishes cooking by residual heat, again with substantial energy savings.
Such projects were on many minds in the last decades of the 19th century and the first decade of the 20th. There was good reason for that; the technology and prosperity of the Victorian era were alike utterly dependent on the extraction and consumption of nonrenewable resources, and for those who had eyes to see, the limits to growth were coming into sight. That’s the thinking that lay behind sociologist Max Weber’s eerie 1905 prediction of the future of the industrial economy:  “This order is now bound to the technical and economic conditions of machine production which today determine the lives of all the individuals who are born into this mechanism, not only those directly concerned with economic acquisition, with irresistible force. Perhaps it will so determine them until the last ton of fossilized coal is burnt.”
It so happened that a temporary event pushed those limits back out of sight for three quarters of a century. The invention of the internal combustion engine, which turned gasoline from a waste product of lamp fuel refining to one of the most eagerly sought products of the age, allowed the industrial societies of that time to put off the day of reckoning for a while. It wasn't just that petroleum replaced coal in many applications, though of course this happened; coal production was also propped up by an energy subsidy from petroleum—the machines that mined coal and the trains that shipped it were converted to petroleum, so that energy-rich petroleum could subsidize the extraction of low-grade coal reserves.  If the petroleum revolution had not been an option, the 20th century would have witnessed the sort of scenes we're seeing now: rising energy costs and economic contraction leading to decreasing energy use per capita in leading industrial nations, as an earlier and more gradual Long Descent got under way.
Those of my readers who have been following this blog for a while may be feeling a bit of deja vu at this point, and they're not wrong to do so. We’ve talked here many times about the appropriate-tech movement of the 1970s, which made so many promising first steps toward sustainability before it was crushed by the Reagan-Thatcher counterrevolution and the reckless drawdown of the North Slope and North Sea oil fields. What I'd like to suggest, though, is that the conservation and ecology movement of the 1970s wasn’t the first attempt to face the limits of growth in modern times; it was the second. The first such attempt was in the late 19th century, and Augustin Mouchot, as well as the dozens of other solar and wind pioneers of that time—not to mention bicyclists on penny farthing cycles!—were the original green wizards, the first wave of sustainability pioneers, whose work deserves to be revived as much as that of the 1970s does.
Their work was made temporarily obsolete by the torrent of cheap petroleum energy that arrived around the beginning of the 20th century. One interesting consequence of taking their existence into account is that it’s easy to watch the law of diminishing returns at work in the can-kicking exercises made possible by petroleum. The first wave of petroleum energy pushed back the limits to growth for just over seventy years, from 1900 or so to 1972.  The second did the same trick for around twenty-five years, from 1980 to 2005. The third—well, we're still in it, but it started in 2010 or so and isn’t holding up very well just now.  A few more cycles of the same kind, and the latest loudly ballyhooed new petroleum bonanza that disproves peak oil might keep the media distracted for a week.
As a thought experiment, though, I encourage my readers to imagine what might have followed if that first great distraction never happened—if, let's say, due to some chance mutation among plankton back in the Cambrian period, carbon compounds stashed away in deepwater sediments turned into a waxy, chemically inert goo rather than into petroleum.  The internal combustion engine would still have been invented, but without some immensely abundant source of liquid fuel to burn, it would have become, like the Stirling engine, an elegant curiosity useful only for a few specialized purposes.  As coal reserves depleted, governments, industrial firms, and serious men of affairs doubtless would have become ever more fixated on seizing control of untapped coal mines wherever they could be found, and the twentieth century in this alternate world would likely have been ravaged by wars as destructive as the ones in our world.
At the same time, the pioneering work of Mouchot and his many peers would have become increasingly hard to ignore. Solar power was unquestionably less economical than coal, while there was coal, but as coal reserves dwindled—remember, there would be no huge diesel machines burning oceans of cheap petroleum, so no mountaintop removal mining, nor any of the other extreme coal-extraction methods so common today—pointing a conical mirror toward the Sun would rapidly become the better bet.  As wars and power shifts deprived entire nations of access to what was left of the world's dwindling coal production, the same principle would have applied with even more force.  Solar cookers and stills, solar pumps and engines, wind turbines and other renewable-energy technologies would have been the only viable options.
This alternate world would have had advantages that ours doesn't share. To begin with, energy use per capita in 1900 was a small fraction of current levels even in the most heavily industrialized nations, and whole categories of work currently done directly or indirectly by fossil fuels were still being done by human beings.  Agriculture hadn't been mechanized, so the food supply wouldn't have been at risk; square-rigged sailing vessels were still hauling cargoes on the seas, so as the price of coal soared and steamboats stopped being economical, maritime trade and travel could readily downshift to familiar sail technology.  As the new renewable-energy technologies became more widely distributed and more efficient, getting by with the energy supplied by sun and wind would have become second nature to everybody.
Perhaps, dear reader, you can imagine yourself sitting comfortably this afternoon in a café in this alternate world, about to read my weekly essay. No, it isn’t on a glowing screen; it’s in the pages of a weekly newspaper printed, as of course everything is printed these days, by a solar-powered press. Before you get to my latest piece, you read with some interest that a Brazilian inventor has been awarded the prestigious Mouchot Prize for a solar steam engine that’s far better suited to provide auxiliary power to sailing ships than existing models. You skim over the latest news from the war between Austria and Italy, in which bicycle-mounted Italian troops have broken the siege of Gemona del Friuli, and a report from Iceland, which is rapidly parlaying its abundant supply of volcanic steam into a place as one of the 21st century’s industrial powerhouses.
It’s a cool, clear, perfectly seasonable day—remember, most of the gigatons of carbon we spent the 20th century dumping into the atmosphere stayed buried in this alternate world—and the proprietor of the café is beaming as he watches sunlight streaming through the windows. He knows that every hour of sunlight falling on the solar collectors on the roof is saving him plenty of money in expensive fuel the kitchen won’t have to burn. Outside the café, the sun gleams on a row of bicycles, yours among them: they’re the normal personal transport of the 21st century, after all.  Solar water heaters gleam on every roof, and great conical collectors track the sun atop the factory down the road.  High overhead, a dirigible soars silently past; we’ll assume, for the sake of today’s steampunk sensibility, that lacking the extravagant fuel supplies needed to make airplanes more than an exotic fad, the bugs got worked out of dirigible technology instead.
Back in the café, you begin to read the latest Archdruid Report—and my imagination fails me at this point, because that essay wouldn’t be about the subjects that have filled these posts for most of eight years now. A society of the kind I’ve very roughly sketched out wouldn’t be in the early stages of a long ragged slide into ecological failure, political disintegration, economic breakdown, and population collapse.  It would have made the transition from fossil fuels to renewable energy when its energy consumption per capita was an order of magnitude smaller than ours, and thus would have had a much easier time of it.  Of course a more or less stable planetary climate, and an environment littered with far fewer of the ugly end products of human chemical and nuclear tinkering, would be important advantages as well.
It’s far from impossible that our descendants, some centuries from now, could have a society and a technology something like the one I’ve outlined here, though we have a long rough road to travel before that becomes possible. In the alternate world I’ve sketched, though, that would be no concern of mine. Since ecology would be simple common sense and the unwelcome future waiting for us in this world would have gone wherever might-have-beens spend their time, I’d have many fewer worries about the future, and would probably have to talk about Hermann Hesse’s The Glass Bead Game instead. Maybe then the League of Journeyers to the East would show up to give me the secret handshake!

Fascism and the Future, Part Three: Weimar America

Wed, 2014-02-26 17:41
The discussion on fascism that’s taken up the last two weekly essays here on The Archdruid Report, and will finish up in this week’s post, has gone in directions that will very likely have surprised and dismayed many of my readers.  Some of you, in fact, may even be jumping up and down by this point shouting, “Okay, but what about fascism? We’ve heard more than enough about Depression-era European dictators in funny uniforms, and that’s all very well and good, but what about real fascism, the kind we have in America today?”
  If this is what’s going through your head just now, dear reader, you’re in interesting company.  It’s a curious detail that in the last years of the Weimar Republic, a large number of avant-garde intellectuals and cultural figures were convinced that they already lived in a fascist country. They pointed, as many Americans point today, to the blatant influence of big business on the political process, to civil rights violations perpetrated by the administration in power or by state and local governments, and to the other abuses of power  common to any centralized political system, and they insisted that this amounted to fascism, since their concept of fascism—like the one standard in today’s America—assumed as a matter of course that fascism must by definition defend and support the economic and political status quo.
In point of fact, as Walter Laqueur showed in his capable survey Weimar: A Cultural History, denouncing the Weimar Republic as a fascist regime was quite the lively industry in Germany in the very late 1920s and early 1930s. Unfortunately for those who made this claim, history has a wicked sense of humor.  A good many of the people who liked to insist that Weimar Germany was a fascist state got to find out—in many cases, at the cost of their lives—that there really is a difference between a troubled, dysfunctional, and failing representative democracy and a totalitarian state, and that a movement that promises to overturn a broken status quo, and succeeds in doing so, is perfectly capable of making things much, much worse.
It’s entirely possible that we could end up on the receiving end of a similar dose of history’s gallows humor. To an embarrassing degree, after all, political thought in modern America has degenerated into the kind of reflexive venting of rage George Orwell parodied in 1984 in the Two Minutes Hate. Instead of pouring out their hatred at a cinematic image of marching Eurasian soldiers juxtaposed with the sniveling face of Goldstein, the traitorous leader of the Brotherhood, the inhabitants of our contemporary Oceania have their choice of options neatly stapled to the insides of their brains. For Democrats, the standard target until recently was an image of George W. Bush dressed up as Heinrich Himmler, lighting a bonfire using the Constitution as tinder and then tossing endangered species into the flames; for Republicans right now, it’s usually a picture of Barack Obama dressed up as Ho Chi Minh, having sex with their daughters and then walking off with their gun collections. Either way, the effect is the same.
I wish I were joking. I know people who, during Dubya’s presidency, were incapable of passing a picture of the man without screaming obscenities at it, and I know other people who have the identical kneejerk reaction these days to pictures of the White House’s current inmate.  I’ve commented here before on how our political demonology stands in the way of any response to the converging crises of our time. The same sort of denunciatory frenzy was all the rage, in any sense of that word you care to choose, in Germany during the Weimar Republic—and its most important consequence was that it blinded far too many people to the difference between ordinary political dysfunction and the far grimmer realities that were waiting in the wings.
To explore the way that unfolded, let’s engage in a little thought experiment. Imagine, then, that sometime this spring, when you visit some outdoor public place, you encounter a half dozen young people dressed identically in bright green T-shirts, surplus black BDU trousers, and army-style boots.  They’re clean-cut, bright, and enthusiastic, and they want to interest you in a new political movement called the American Peoples Party. You’re not interested, and walk on by.
A couple of months later you run across another dozen or so of them, just as bright and clean and enthusiastic as the first bunch.  Now the movement is called the National Progressive American Peoples Party, NPAPP for short, and it’s got a twenty-five-point program focused on the troubled economy. You take a flyer, mostly because the young person who hands it to you is kind of cute. The twenty-five points don’t seem especially original, but they make more sense than what either Obama or the Republicans are offering. What’s more, the flyer says that the economy’s a mess and peak oil and climate change are real problems that aren’t going away, and this impresses you.
Over the months to come you see more and more of them, handing out flyers, going door to door to invite people to local caucus meetings, and doing all the other things that political parties used to do back when they were serious about grassroots organizing. A news website you follow shows a picture of the party’s chairman, a man named Fred Halliot;* he’s an earnest-looking guy in his thirties, an Army vet who did three tours in Afghanistan and earned a Silver Star for courage under fire. You glance at his face and then go look at something more interesting.
(*Yes, it’s an anagram. Work it out yourself.)
Meanwhile, the economy’s getting worse in the same slow uneven way it’s been doing for years. Two of your friends lose their jobs, and the price of gasoline spikes up to $5.69 a gallon, plunges, and finds a new stable point again well above $4. Obama insists that the recovery is already here and people just need to be patient and wait for prosperity to get to them. The Republicans insist that the only reason the economy hasn’t recovered yet is that the rich still have to pay taxes. The media are full of cheery stories about how the 2014 holiday season is going to be so big a hit that stores may run out of toys and electronic gewgaws to sell; there are record crowds on Black Friday, or that’s what the TV says, but nobody you know has the spare money to buy much this year. Not until midway through January 2015 does the media admit that the shopping season was a disaster and that two big-box chains have just gone broke.
Through all this, the new party keeps building momentum. As spring comes, Halliot begins a nationwide speaking tour. He travels in a school bus painted in green and black, the NPAPP colors, and emblazoned with a Celtic tree-of-life symbol, the party’s new emblem.  The bus goes from town to town, and the crowds start to build. A handful of media pundits start talking about Halliot and the NPAPP, making wistful noises about how nice it is to see young idealists in politics again; a few others fling denunciations, though they don’t seem to have any clear sense what exactly they’re denouncing.  Both mainstream parties, as well as the Libertarians and the Greens, launch youth organizations with their own t-shirts and slogans, but their lack of anything approaching new ideas or credible responses to the economic mess makes these efforts a waste of time.
The speaking tour ends in Washington DC with a huge rally, and things get out of hand. Exactly what happened is hard to tell afterwards, with wildly different stories coming from the feds, the mass media, the internet, and the NPAPP headquarters in St. Louis. The upshot, though, is that Halliot and two of his chief aides are arrested on federal conspiracy charges.  The trial is a media circus. Halliot gives an impassioned speech justifying his actions on the grounds that the nation and the world are in deep trouble, radical change is needed to keep things from getting much worse, and civil disobedience is justified for that reason.  He gets sentenced to four years in prison, and the other political parties breathe a huge collective sigh of relief, convinced that the NPAPP is a flash in the pan.
They’re wrong. The NPAPP weathers the crisis easily, and publicity from the trial gives Halliot and his party a major boost. Candidates from the new party enter races across the country in the 2016 elections, seizing much of the limelight from the frankly dreary presidential race between Hillary Clinton and Haley Barbour.  When the votes are counted, the new party has more than three hundred city and county positions, forty-three seats in state legislatures, and two seats in the House of Representatives. The major parties try every trick in the book to overturn the results of each race, and succeed mostly in making themselves look corrupt and scared.
Then Halliot gets released from prison, having served only nine months of his sentence.  (Word on the internet has it that the whole point of locking him up was to keep him out of the way during the election—but is that simply a NPAPP talking point?  Nobody’s sure.) It turns out that he put the time to good use, and has written a book, A Struggle for the Soul of America, which hits the bookstalls the same week President Barbour is inaugurated. You leaf through a copy at the public library; it’s not exactly a great work of literature, and it’s written in a folksy, rambling style you find irritating, but it’s full of the kind of political notions that Americans swap over beers and pizza: the kind, in other words, that no mainstream party will touch.
The book has an edge that wasn’t in NPAPP literature before Halliot’s prison term, though.  The government of the parties, he insists, must be replaced by a government of the people, guided by a new values consensus that goes beyond the broken politics of greed and special interests to do what has to be done to cope with the disintegrating economy, the challenge of peak oil, and the impacts of climate change. Time is short, he insists, and half measures aren’t enough to avoid catastrophe; a complete transformation of every aspect of American life, a Great Turning, is the only option left.  Edgy though his language and ideas have become, you note, he’s still the only person in national politics who takes the economic, energy, and climate crises seriously.
The next autumn, as if on cue, the economic troubles go into overdrive.  Petroleum prices spike again—you start commuting via public transit when the price of gasoline breaks $8 a gallon—and a big Wall Street investment bank that had huge derivative bets going the other direction goes messily broke.  Attempts to get a bailout through Congress freeze up in a flurry of partisan bickering. Over the next two months, despite frantic efforts by the Barbour administration, the stock market plunges and the credit markets seize up.  Job losses snowball. Through the fall and winter, NPAPP people are everywhere, leafleting the crowds, staffing impromptu soup kitchens, marching in the streets. You would pay less attention, but by spring you’re out of a job, too.
The following years are a blur of grim headlines, hungry crowds at soup kitchens, and marching crowds in green and black. In the 2018 election,  there are rumors, never proved, of NPAPP squads keeping opposition voters away from the polls in critical districts.  One way or another, though, Halliot’s party seats six senators and 185 representatives in Congress, and takes control of the governments of a dozen states. The three-way split in the House makes it all but impossible to get anything done there, not that the Democrats or Republicans have any idea what to do, and the administration copies its last two predecessors by flailing and fumbling to no noticeable effect. One thing of importance does happen; to get NPAPP support to push a stopgap budget through the House in 2019, President Barbour is forced to grant a full federal pardon to Halliot, removing the last legal barrier to the latter’s presidential ambitions.
Fast forward to the 2020 elections, which are fought out bitterly in a flurry of marches, protests, beatings, riots, and charges and countercharges of vote fraud. When the dust has settled, it turns out that no party has a majority in the electoral college.  The election goes to the House, and since neither of the major parties is willing to vote for the other major party’s candidate, Halliot ends up winning by a whisker-thin majority on the forty-second ballot. He is inaugurated on a bitterly cold day, surrounded by NPAPP banners and greeted by marching files of party faithful in green and black.  He announces that he’s about to call a constitutional convention to replace the government of the parties with a government of the people, get the country back on its feet, and sweep away everything that stands in the way of the Great Turning that will lead America and the world to a bright new future. The crowd roars its approval.
Later that year, the crowds go wilder still when the old constitution is scrapped and the new one enacted. Those with old-fashioned ideas find some aspects of the new constitution objectionable, as it lacks such minor details as checks and balances, not to mention meaningful and enforceable guarantees of due process and civil rights.  The media doesn’t mention that, though, because the “new values consensus” is enforced by Party officials—the capital letter becomes standard usage very quickly—and those who criticized the new constitution too forcefully, well, let’s just say that nobody’s quite sure where they are now, and most people know better than to ask.
And you, dear reader? At what point along that trajectory would you have decided that for all its seeming promise, for all the youth and enthusiasm and earnestness that surround it, the National Socialist German Workers Party and the folksy, charismatic veteran who led it were likely to be worse—potentially much, much worse—than the weary, dreary, dysfunctional mess of a political system they were attempting to replace?  Or would you end up as part of the cheering crowds in that last scene?  You don’t have to tell me the answer, but in the silence of your own mind, take the time to think it through and face the question honestly.
What almost always gets forgotten about the fascist movements of Europe between the wars is just how much promise they seemed to hold, and how many people of good will saw them as the best hope of the future.  Their leaders were young—Hitler was 43 when he became chancellor of Germany, the same age as John F. Kennedy at his inauguration, and Mussolini was only 39 when he became prime minister of Italy—and most of the rank and file of both men’s followers were younger still. Hitler’s party, for example, had a huge success among German college students long before it had a mass following anywhere else. Both parties also drew to a very great extent on the avant-garde culture and popular ideas of their time. How many people even remember nowadays that before the Second World War, the swastika was seen as a pagan symbol of life, redolent of ancient roots and primal vitality, with much the same cultural ambience that the NPAPP’s Celtic tree-of-life emblem might have in America today?
The fascist movements of the 1920s and 1930s were thus closely attuned to the hopes and fears of the masses, far more so than either the mainstream parties or the established radical groups of their respective countries. Unlike the imagined “fascism” of modern radical rhetoric, they were an alternative to business as usual, an alternative that positioned itself squarely in the abandoned center of the political discourse of their eras.  In terms of that discourse, in the context of their own times and places, the talking points of the fascist parties weren’t anything like so extreme as they appear to most people nowadays—and we forget that at our deadly peril.
That’s the thing I tried to duplicate in the thought experiment above, by changing certain details of German national socialism so I could give the National Progressive American Peoples Party a contemporary slant—one that calls up the same reactions its earlier equivalent got in its own place and time. Antisemitism and overt militarism were socially acceptable in Germany between the wars; they aren’t socially acceptable in today’s United States, and so they won’t play a role in a neofascist movement of any importance in the American future. What will play such roles, of course, are the tropes and buzzwords that appeal to Americans today, and those may very well include the tropes and buzzwords that appeal most to you.
There’s a deeper issue I’ve tried to raise here, too.  It’s easy, comfortable, and (for the manufacturers and distributors of partisan pablum) highly profitable to approach every political conflict in the simplistic terms of good versus evil. The habit of seeing political strife in those terms becomes a reliable source of problems when the conflict in question is actually between the good and the perfect—that is, between a flawed but viable option that’s within reach, and a supposedly flawless one that isn’t. The hardest of all political choices, though, comes when the conflict lies between the bad and the much, much worse—as in the example just sketched out, between a crippled, dysfunctional, failing democratic system riddled with graft and abuses of power, on the one hand, and a shiny new tyranny on the other.
It may be that there are no easy answers to that conundrum. Unless Americans can find some way to step back from the obsessive partisan hatreds that bedevil our political life, though, it’s probably a safe bet that there will be no answers at all—not, quite possibly, until the long and ugly list of the world’s totalitarian regimes gets another entry, complete with the usual complement of prison camps and mass graves. As long as the word “fascism” retains its current status as a meaningless snarl word that’s normally flung at the status quo, certainly, that last possibility seems far more likely than any of the alternatives.

Fascism and the Future, Part Two: The Totalitarian Center

Wed, 2014-02-19 16:24
As the first part of this series pointed out last week, there’s an odd mismatch between the modern use of “fascism” as an all-purpose political snarl word, on the one hand, and the mediocrity of the regime that put the term into general use, on the other. All things considered, as tyrants go, Benito Mussolini simply wasn’t that impressive, and while the regime he cobbled together out of a bucket of spare ideological parts had many objectionable features, it cuts a pretty poor figure in the rogue’s gallery of authoritarian states. Let’s face it, as an archetype of tyranny, Italian Fascismo just doesn’t cut it.  
For that matter, it’s far from obvious that there’s enough common ground among the various European totalitarian movements between the wars to justify the use of a single label for them—much less to make that label apply to tyrants and tyrannies around the world and throughout time. Historians in Europe and elsewhere have thus spent a good deal of time in recent decades arguing about whether there’s any such thing as fascism in general, and some very thoughtful writers ended up insisting that there isn’t—that more general words such as “dictatorship” cover the ground quite adequately, and that the word “fascism” properly belongs to Mussolini’s regime and to it alone.
On the other side of the equation were those who argued that a certain kind of authoritarian movement in Europe between the wars was sufficiently distinct from other kinds of tyranny that it deserves its own label. One of those was Ernst Nolte, whose 1968 book Die Krise des liberalen Systems und die faschistischen Bewegungen (The Crisis of the Liberal System and the Fascist Movements) played a central role in launching the debate just mentioned. Nolte was careful enough not to propose a hard and fast definition of fascism, and offered instead a list of six features that any movement had to have to count as fascist. The first three of them are organizational features: a cult of charismatic leadership, a uniformed Party militia, and the goal of totalitarianism.
That last word has been bandied around so freely over the years since then that it’s probably necessary to stop here and discuss what it means. A totalitarian political system is one in which the party in power claims the right to rule every sphere of life: political, religious, artistic, scientific, sexual, and so on through all the normally distinct dimensions of human existence. There are plenty of dictatorships that aren’t totalitarian—in fact, it’s fairly common for dictators to spare themselves a lot of extra work by focusing purely on the political sphere, and letting people do what they want in other spheres of life so long as their activities don’t stray into politics—and there are also totalitarian systems that aren’t dictatorships: there are plenty of religious communities, some of them more or less democratic in terms of governance, that claim totalitarian authority over every aspect of the life of the faithful.
The totalitarian dimension, though, is central to those movements and regimes that count as fascist by Nolte’s criteria, and it’s a crucial distinction. The charismatic leaders and party militias of between-the-wars European fascist parties presented themselves, and in at least some cases honestly saw themselves, as trying to overturn not merely a political system but an entire civilization they believed was rotten to the core. Crusades against “degenerate” art and literature thus weren’t simply the product of the individual vagaries of fascist leaders; they were part and parcel of an attempt to reshape an entire society from the ground up, and the cult of leadership and the party militia very often served mostly as vehicles for the broader totalitarian agenda.
A good deal of the discussion that followed the publication of Nolte’s book focused on whether the three organizational features just discussed were sufficiently unique to fascist movements to serve as touchstones, whether there were more features that might usefully be added to the list, and so on.  The other three features in Nolte’s description, by contrast, were broadly accepted by scholars. This is all the more interesting in that one of them is almost always rejected out of hand on the rare occasions it slips outside the charmed circle where professional historians practice their craft. These three features are the things that fascist movements and regimes consistently rejected. The first is Marxism, the second liberalism, and the third—the hot-button one—is conservatism.
Mention this to anyone in the contemporary American left, and you can expect blank incomprehension. Try to push past that, and if you get anywhere at all you can normally count on seeing the blank look replaced by flat rejection or incandescent rage. It’s one of the standard credos of current political folklore that fascism belongs to the conservative side of the political spectrum.  More specifically, it’s supposed to be the far end of that side of the spectrum, the thing that’s more conservative than the conservatives, just as—to the contemporary American right—Communism is the far end of the left side of the spectrum, the thing that’s more liberal than the liberals.
I mentioned in last week’s post the way that the riotous complexity of political thought in the early 20th century got flattened out into a Hobson’s choice between representative-democracy-plus-capitalism (the ideology of the American empire) and bureaucratic state socialism (the ideology of the Soviet empire) in the course of the Cold War. The same flattening process also affected domestic politics in the United States, though in a somewhat different way. Communism and fascism have long been the most overheated labels in our political culture’s demonology, and Republicans and Democrats eagerly applied these labels to each other.  Since Republicans and Democrats are themselves simply very minor variations on a common theme, it worked well thereafter to apply those labels to anyone who strayed too far from the midpoint between the two.  This allowed the parties to squabble about peripheral issues while maintaining perfect unanimity on core values such as maintaining America’s empire, say, or supporting the systemic imbalances in financial and resource flows that keep that empire in business.
One of the consequences of that strategy was the elimination of conservatism, in anything like the old meaning of that word, from the vocabulary of American politics. The Anglo-American tradition of conservatism—continental Europe has its own somewhat different form—has its roots in the writings of Edmund Burke, whose Reflections on the Revolution in France became a lightning rod for generations of thinkers who found the hubris of the radical Enlightenment too much to swallow. At the risk of oversimplifying a complex tradition, conservatism was based on the recognition that human beings aren’t as smart as they like to think. As a result, when intellectuals convince themselves that they know how to make a perfect human society, they’re wrong, and the consequences of trying to enact their fantasies in the real world normally range from the humiliating to the horrific.
To the conservative mind, the existing order of society has one great advantage that the arbitrary inventions of would-be world-reformers can’t match: it has actually been shown to work in practice. Conservatives thus used to insist that changes to the existing order of society ought to be made only when there was very good reason to think the changes would turn out to be improvements. The besetting vice of old-fashioned conservatism, as generations of radicals loved to point out, was thus that it tended to defend and excuse traditional injustices; among its great virtues was that it defended traditional liberties against the not always covert authoritarianism of would-be reformers.
In America before the Cold War, conservatives thus called for limitations on federal power, denounced the nation’s moves toward global empire, demanded balanced budgets and fiscal prudence, and upheld local and regional cultures and governments against the centralizing reach of Washington DC.  In the South, that reasoning was inevitably used to defend segregation, but it’s a distortion of history to claim that American conservatism was never anything more than a polite label for Jim Crow.  Like every political movement in the real world, it was a complex thing, and combined high ideals and base motives in roughly the same proportions as its rivals.
Whatever its faults or its virtues, though, it died a miserable death during the 20th century, as both parties and most of the competing power centers that form America’s governing classes joined eagerly in the rush to empire, and vied to see who could come up with more excuses for centralizing power in the executive branch of the federal government. As part of that process, the old conservatism was gutted, stuffed, and left to rot in cold storage, except for very occasional moments of pro forma display for the benefit of the dwindling few who hadn’t gotten the memo.
In Europe between 1919 and 1945, though, the European version of old-fashioned conservatism was still a major power, and Nolte was quite correct to say that one of the core themes of fascism was the rejection of conservative ideas. Where conservatives saw themselves as the defenders of the old order of Europe—Christian, aristocratic, agrarian, and committed to local custom and local autonomy—fascists wanted to impose a New Order (one of Hitler’s favorite phrases) in which traditional social hierarchies would dissolve in the orgiastic abandon of “one leader, one party, one people.” Fascists by and large hated and despised the conservatives, and the conservatives returned the compliment; it’s a matter of historical fact that the most diehard resistance Hitler’s regime faced, and the conspiracies that came closest to blowing Hitler himself to smithereens, all came straight out of the hardline aristocratic right wing of German society.
The bitter divide between fascists and conservatives, in fact, goes straight back to the origins of both movements. In a teasingly titled book, Hitler as Philosophe, Lawrence Birken showed in detail that the entire vocabulary of political ideas used by Hitler and the other ideologues of German national socialism came straight out of the same radical side of the Enlightenment that Edmund Burke critiqued so trenchantly.  When Hitler ranted about the will of das Volk, for example, he was simply borrowing Rousseau’s notion of the general will of the people, which both men believed ought to be free from the pettifogging hindrance of mere laws and institutions. Examples could be multiplied almost endlessly, and matched nearly word for word out of Mussolini’s speeches.  Despite the trope that fascism was a reversion to the Middle Ages, Hitler, Mussolini, and their fellow fascists were thoroughly modern figures pursuing some of the most avant-garde, cutting-edge ideas of their time.
Point this out to most people nowadays, though, and you’re likely to get pushback along two lines. The first is the claim that fascism equals racial bigotry, and racial bigotry is a right-wing habit, thus fascism must be a right-wing movement. That argument gets what force it has from the astonishing levels of historical ignorance found in the United States these days, but it’s common, and needs to be addressed.
Old-fashioned conservatism in the United States, as noted above, unquestionably had its racist side. South of the Mason-Dixon line, in particular, talk about local autonomy and resistance to edicts from Washington DC normally included a subtext favoring segregation and other policies meant to disadvantage Americans of African descent.  That’s one consequence of the tangled and bitter history of race in America. It’s conveniently forgotten, however, that well into the twentieth century, the labor movement in the US was as heavily committed to racial exclusion as any collection of Southern good ol’ boys—keeping African-Americans out of the skilled trades, for example, was seen by many labor activists as essential to boosting the wages of white laborers.  With embarrassingly few exceptions, racial prejudice was widely accepted straight across the American political scene until the convulsions of the 1960s finally pushed it into its present state of slow disintegration.
Elsewhere in the world, the notion that racial bigotry is purely a right-wing habit has even less support. Even to the extent that labels such as “left” and “right” apply to the n-dimensional continuum of competing political and economic viewpoints in the pre-Cold War era, racial prejudice, racial tolerance, and relative apathy on the subject were more or less evenly distributed among them. Fascist parties are a good sample of the whole. Some fascist regimes, such as Hitler’s, were violently racist. Others were not—Mussolini’s regime in Italy, for example, was no more racist or antisemitic than the democratic government it replaced, until Germany imposed its race laws on its ally at gunpoint. The easy equation of fascism with racism, and racism with contemporary American (pseudo)conservatism, is yet another example of the way that the complexities of politics and history get flattened out into a caricature in what passes for modern political discourse.
That’s the first standard argument for fascism as a right-wing movement.  The second is the claim that German national socialism was bought and paid for by big business, and therefore all fascism everywhere has to have been a right-wing movement. That’s an extremely common claim; you’ll find it splashed all over the internet, and in plenty of less evanescent media as well, as though it was a matter of proven fact. The only problem with this easy consensus is that it doesn’t happen to be true.
There have been two excellent scholarly studies of the issue, Pool and Pool’s Who Financed Hitler? (1978) and Turner’s German Big Business and the Rise of Hitler (1985). Both studies showed conclusively that the National Socialist German Workers Party got the vast majority of its financing from its own middle-class membership until the last year or two before it took power, and only then came in for handouts from business because most German businesses decided that given a choice between the two rising powers in the final crisis of the Weimar regime—the Nazis and the Communists—they would settle for the Nazis. In point of fact—and this can be found detailed in any social history of Germany between the wars—German big business by and large distrusted Hitler’s party, and bitterly resented the new regime’s policy of Gleichschaltung, “coordination,” which subjected even the largest firms to oversight and regulation by Party officials.
So where did the claim that fascism is always a puppet of big business come from? Like the use of “fascism” as a generic label for regimes liberals don’t like, it’s a third-hand borrowing from the Soviet propaganda of an earlier day. In the political theology of Marxism, remember, everything boils down to the struggle between capitalists and the proletariat, the two contending forces of the Marxist cosmos. Everything and everyone that doesn’t support the interests of the proletariat as defined by Marxist theory is therefore by definition a tool of the capitalist ruling class, and any political movement that opposes Marxism thus has to be composed of capitalist lackeys and running dogs.  QED! 
More broadly, communist parties have generally pitched themselves to the public by insisting that all other political movements work out in practice to a vote for the existing order of society. A useful bit of marketing in any context, it became a necessity once Stalin’s regime demonstrated just how unpleasant a communist regime could be in practice.  Insisting that fascism is simply another name for what we’ve already got, though, had an enduring downside—it convinced a great many people, in the teeth of the evidence, that fascism by definition defends the status quo. The fact that Italian Fascism and German national socialism both rose to power promising radical change in their respective societies and delivered on that promise has been completely erased from the modern political imagination.
For that matter, the flattening out of American political thought into a linear spectrum from “the left” (the Democratic party, and the Communists who are presumed to be lurking in its leftward fringe) to “the right” (the Republican party, and the fascists who are presumed to have a similar hideout in the GOP’s rightward fringe) helps feed the same belief. Once all political thought has been forced onto that Procrustean bed of ideology, after all, if the fascists aren’t hiding out somewhere on the far end of the Republican half of the spectrum, where else could they be?
It’s at this point that we approach the most explosive dimension of the history of fascism, because the unthinking acceptance of the linear model of politics presupposed by that question isn’t merely a problem in some abstract sense. It also obscures some of the most important dimensions of contemporary political life, in the United States and elsewhere.  According to that model, the point in the middle of the spectrum—where “left” and “right” fade into one another—is the common ground of politics, the middle of the road, where most people either are or ought to be.  The further you get from that midpoint, the closer you are to “extremism.”  (Think about that last word for a moment.) What happens, though, if the common ground where the two major parties meet and shake hands is far removed from the actual beliefs and opinions of the majority?
That’s the situation we’re in today in America, of course. Americans may not agree about much, but a remarkably large number of them agree that neither political party is listening to them, or offering policies that Americans in general find appealing or even acceptable. Where the two major parties can reach a consensus—for example, in giving bankers a de facto amnesty for even the most egregious and damaging acts of financial fraud—there’s normally a substantial gap between that consensus and the policies that most Americans support. Where the parties remain at loggerheads, there are normally three positions: the Democratic position, the Republican position, and the position most Americans favor, which never gets brought up in the political arena at all.
That’s one of the pervasive occupational hazards of democratic systems under strain. In Italy before and during the First World War, and in Germany after it, democratic institutions froze up around a series of problems that the political systems in question were unwilling to confront and therefore were unable to address.  Every mainstream political party was committed to maintaining the status quo in the face of a rising spiral of crisis that made it brutally clear that the status quo no longer worked.  One government after another took office, promising to make things better by continuing the same policies that were making things worse, while the opposition breathed fire and brimstone, promising fierce resistance to the party in power on every issue except those that mattered—and so, in both countries, a figure from outside the political mainstream who was willing to break with the failed consensus won the support of enough of the voters to shoulder his way into power.
When fascism succeeds in seizing power, in other words, it’s not a right-wing movement, or for that matter a left-wing one. It seizes the abandoned middle ground of politics, takes up the popular causes that all other parties refuse to touch, and imposes a totalitarianism of the center. That’s the secret of fascism’s popularity—and it’s the reason why an outbreak of full-blown fascism is a real and frightening possibility as America stumbles blindly into an unwelcome future. We’ll talk about that next week.

Fascism and the Future, Part One: Up From Newspeak

Wed, 2014-02-12 17:39
Over the nearly eight years that I’ve been posting these weekly essays on the shape of the deindustrial future, I’ve found that certain questions come up as reliably as daffodils in April or airport food on a rough flight. Some of those fixate on topics I’ve discussed here recently, such as the vaporware du jour that’s allegedly certain to save industrial civilization and the cataclysm du jour that’s just as allegedly certain to annihilate it. Still, I’m glad to say that not all the recurring questions are as useless as these.
  One of these latter deserves a good deal more attention than I’ve given it so far:  whether the Long Descent of industrial society will be troubled by a revival of fascism. It’s a reasonable question to ask, since the fascist movements of the not so distant past were given their shot at power by the political failure and economic implosion of Europe after the First World War, and “political failure and economic implosion” is a tolerably good description of the current state of affairs in the United States and much of Europe these days. For that matter, movements uncomfortably close to the fascist parties of the 1920s and 1930s already exist in a number of European countries.  Those who dismiss them as a political irrelevancy might want to take a closer look at history, for that same mistake was made quite regularly by politicians and pundits most of a century ago, too.
Nonetheless, with one exception—a critique some years back of talk in the peak oil scene about the so-called “feudal-fascist” society the rich were supposedly planning to ram down our throats—I’ve done my best to avoid the issue so far. This isn’t because it’s not important.  It’s because the entire subject is so cluttered with doubletalk and distortions of historical fact that communication on the subject has become all but impossible. It’s going to take an entire post just to shovel away some of the manure that’s piled up in this Augean stable of our collective imagination, and even then I’m confident that many of the people who read this will manage to misunderstand every single word I say.
There’s a massive irony in that situation. When George Orwell wrote his tremendous satire on totalitarian politics, 1984, one of the core themes he explored was the debasement of language for political advantage.  That habit found its lasting emblem in Orwell’s invented language Newspeak, which was deliberately designed to get in the way of clear thinking.  Newspeak remains fictional—well, more or less—but the entire subject of fascism, and indeed the word itself, has gotten tangled up in a net of debased language and incoherent thinking as extreme as anything Orwell put in his novel.
These days, to be more precise, the word “fascism” mostly functions as what S.I. Hayakawa used to call a snarl word—a content-free verbal noise that expresses angry emotions and nothing else. One of my readers last week commented that for all practical purposes, the word “fascism” could be replaced in everyday use with “Oogyboogymanism,” and of course he’s quite correct; Aldous Huxley pointed out many years ago that already in his time, the word “fascism” meant no more than “something of which one ought to disapprove.”  When activists on the leftward end of today’s political spectrum insist that the current US government is a fascist regime, they thus mean exactly what their equivalents on the rightward end of the same spectrum mean when they call the current US government a socialist regime: “I hate you.”  It’s a fine example of the way that political discourse nowadays has largely collapsed into verbal noises linked to heated emotional states, drowning out any more useful form of communication.
The debasement of our political language quite often goes to absurd lengths. Back in the 1990s, for example, when I lived in Seattle, somebody unknown to me went around spraypainting “(expletive) FACISM” on an assortment of walls in a couple of Seattle’s hip neighborhoods. My wife and I used to while away spare time at bus stops discussing just what “facism” might be. (Her theory was that it’s the prejudice that makes businessmen think that employees in front office jobs should be hired for their pretty faces rather than their job skills; mine, recalling the self-righteous declaration of a vegetarian cousin that she would never eat anything with a face, was that it’s the belief that the moral value of a living thing depends on whether it has a face humans recognize as such.) Beyond such amusements, though, lay a real question:  what on earth did the graffitist think he was accomplishing by splashing that phrase around oh-so-liberal Seattle? Did he perhaps think that members of the American Fascist Party who happened to be goose-stepping through town would see the slogan and quail?
To get past such stupidities, it’s going to be necessary to take the time to rise up out of the swamp of Newspeak that surrounds the subject of fascism—to reconnect words with their meanings, and political movements with their historical contexts. Let’s start in the obvious place. What exactly does the word “fascism” mean, and how did it get from there to its current status as a snarl word?
That takes us back to southern Italy in 1893. In that year, a socialist movement among peasant farmers took to rioting and other extralegal actions to try to break the hold of the old feudal gentry on the economy of the region; the armed groups fielded by this movement were called fasci, which might best be translated “group” or “band.” Various other groups in the troubled Italian political scene borrowed the label thereafter, and it was also used for special units of shock troops in the First World War—Fasci di Combattimento, “combat groups,” were the exact equivalent of the Imperial German Army’s Sturmabteilungen, “storm troops.”
After the war, in 1919, an army veteran and former Socialist newspaperman named Benito Mussolini borrowed the label Fasci di Combattimento for his new political movement, about the same time that another veteran on the other side of the Alps was borrowing the term Sturmabteilung for his party’s brown-shirted bullies. The movement quickly morphed into a political party and adapted its name accordingly, becoming the Fascist Party, and the near-total paralysis of the Italian political system allowed Mussolini to seize power with the March on Rome in 1922.  The secondhand ideology Mussolini’s aides cobbled together for their new regime accordingly became known as Fascism—“Groupism,” again, is a decent translation, and yes, it was about as coherent as that sounds. Later on, in an attempt to hijack the prestige of the Roman Empire, Mussolini identified Fascism with another meaning of the word fasci—the bundle of sticks around an axe that Roman lictors carried as an emblem of their authority—and that became the emblem of the Fascist Party in its latter years.
Of all the totalitarian regimes of 20th century Europe, it has to be said, Mussolini’s was far from the most bloodthirsty. The Fascist regime in Italy carried out maybe two thousand political executions in its entire lifespan; Hitler’s regime committed that many political killings, on average, every single day the Twelve-Year Reich was in power, and when it comes to political murder, Hitler was a piker compared to Josef Stalin or Mao Zedong.  For that matter, political killings in some officially democratic regimes exceed Italian Fascism’s total quite handily.  Why, then, is “fascist” the buzzword of choice to this day for anybody who wants to denounce a political system?  More to the point, why do most Americans say “fascist,” mean “Nazi,” and then display the most invincible ignorance about both movements?
There’s a reason for that, and it comes out of the twists of radical politics in 1920s and 1930s Europe.
The founding of the Third International in Moscow in 1919 forced radical parties elsewhere in Europe to take sides for or against the Soviet regime. Those parties that joined the International were expected to obey Moscow’s orders without question, even when those orders clearly had much more to do with Russia’s expansionist foreign policy than they did with the glorious cause of proletarian revolution; at the same time, many idealists still thought the Soviet regime, for all its flaws, was the best hope for the future. The result in most countries was the emergence of competing Marxist parties, a Communist party obedient to Moscow and a Socialist party independent of it.
In the bare-knuckle propaganda brawl that followed, Mussolini’s regime was a godsend to Moscow. Since Mussolini was a former socialist who had abandoned Marx in the course of his rise to power, parties that belonged to the Third International came to use the label “fascist” for those parties that refused to join it; that was their way of claiming that the latter weren’t really socialist, and could be counted on to sell out the proletariat as Mussolini was accused of doing. Later on, when the Soviet Union ended up on the same side of the Second World War as its longtime enemies Britain and the United States, the habit of using “fascist” as an all-purpose term of abuse spread throughout the left in the latter two countries. From there, its current status as a universal snarl word was a very short step.
What made “fascist” so useful long after the collapse of Mussolini’s regime was the sheer emptiness of the word. Even in Italian, “Groupism” doesn’t mean much, and in other languages, it’s just a noise; this facilitated its evolution into an epithet that could be applied to anybody.  The term “Nazi” had most of the same advantages: in most languages, it sounds nasty and doesn’t mean a thing, so it can be flung freely at any target without risk of embarrassment.  The same can’t be said about the actual name of the German political movement headed by Adolf Hitler, which is one reason why next to nobody outside of specialist historical works ever mentions national socialism by its proper name.
That name isn’t simply a buzzword coined by Hitler’s flacks, by the way.  The first national socialist party I’ve been able to trace was founded in 1898 in what’s now the Czech Republic, and the second was launched in France in 1903. National socialism was a recognized position in the political and economic controversies of early 20th century Europe. Fail to grasp that and it’s impossible to make any sense of why fascism appealed to so many people in the bitter years between the wars.  To grasp that, though, it’s necessary to get out from under one of the enduring intellectual burdens of the Cold War.
After 1945, as the United States and the Soviet Union circled each other like rival dogs contending for the same bone, it was in the interest of both sides to prevent anyone from setting up a third option. Some of the nastier details of postwar politics unfolded from that shared interest, and so did certain lasting impacts on political and economic thought. Up to that point, political economy in the western world embraced many schools of thought.  Afterwards, on both sides of the Iron Curtain, the existence of alternatives to representative-democracy-plus-capitalism, on the one hand, and bureaucratic state socialism on the other, became a taboo subject, and remains so in America to this day.
You can gain some sense of what was erased by learning a little bit about the politics in European countries between the wars, when the diversity of ideas was at its height. Then as now, most political parties existed to support the interests of specific social classes, but in those days nobody pretended otherwise. Conservative parties, for example, promoted the interests of the old aristocracy and rural landowners; they supported trade barriers, low property taxes, and an economy biased toward agriculture.  Liberal parties furthered the interests of the bourgeoisie—that is, the urban industrial and managerial classes; they supported free trade, high property taxes, military spending, and colonial expansion, because those were the policies that increased bourgeois wealth and power.
The working classes had their choice of several political movements. There were syndicalist parties, which sought to give workers direct ownership of the firms for which they worked; depending on local taste, that might involve anything from stock ownership programs for employees to cooperatives and other worker-owned enterprises.  Syndicalism was also called corporatism; “corporation” and its cognates in most European languages could refer to any organization with a government charter, including craft guilds and cooperatives.  It was in that sense that Mussolini’s regime, which borrowed some syndicalist elements for its eclectic ideology, liked to refer to itself as a corporatist system. (Those radicals who insist that this meant fascism was a tool of big corporations in the modern sense are thus hopelessly misinformed—a point I’ll cover in much more detail next week.)
There were also socialist parties, which generally sought to place firms under government control; this might amount to anything from government regulation, through stock purchases giving the state a controlling interest in big firms, to outright expropriation and bureaucratic management. Standing apart from the socialist parties were communist parties, which (after 1919) spouted whatever Moscow’s party line happened to be that week; and there were a variety of other, smaller movements—distributism, social credit, and many more—all of which had their own followings and their own proposed answers to the political and economic problems of the day.
The tendency of most of these parties to further the interests of a single class became a matter of concern by the end of the 19th century, and one result was the emergence of parties that pursued, or claimed to pursue, policies of benefit to the entire nation. Many of them tacked the adjective “national” onto their moniker to indicate this shift in orientation. Thus national conservative parties argued that trade barriers and economic policies focused on the agricultural sector would benefit everyone; national liberal parties argued that free trade and colonial expansion were the best option for everyone; national syndicalist parties argued that giving workers a stake in the firms for which they worked would benefit everyone, and so on. There were no national communist parties, because Moscow’s party line didn’t allow it, but there were national bolshevist parties—in Europe between the wars, a bolshevist was someone who supported the Russian Revolution but insisted that Lenin and Stalin had betrayed it in order to impose a personal dictatorship—which argued that violent revolution against the existing order really was in everyone’s best interests.
National socialism was another position along the same lines. National socialist parties argued that business firms should be made subject to government regulation and coordination in order to keep them from acting against the interests of society as a whole, and that the working classes ought to receive a range of government benefits paid for by taxes on corporate income and the well-to-do. Those points were central to the program of the National Socialist German Workers Party from the time it got that name—it was founded as the German Workers Party, and got the rest of the moniker at the urging of a little man with a Charlie Chaplin mustache who became the party’s leader not long after its founding—and those were the policies that the same party enacted when it took power in Germany in 1933.
If those policies sound familiar, dear reader, they should. That’s the other reason why next to nobody outside of specialist historical works mentions national socialism by name: the Western nations that defeated national socialism in Germany promptly adopted its core economic policies, the main source of its mass appeal, to forestall any attempt to revive it in the postwar world. Strictly speaking, in terms of the meaning that the phrase had before the beginning of the Second World War, national socialism is one of the two standard flavors of political economy nowadays. The other is liberalism, and it’s another irony of history that in the United States, the party that hates the word “liberal” is a picture-perfect example of a liberal party, as that term was understood back in the day.
Now of course when people think of the National Socialist German Workers Party nowadays, they don’t think of government regulation of industry and free vacations for factory workers, even though those were significant factors in German public life after 1933.  They think of such other habits of Hitler’s regime as declaring war on most of the world, slaughtering political opponents en masse, and exterminating whole ethnic groups. Those are realities, and they need to be recalled.  It’s crucial, though, to remember that when Germany’s National Socialists were out there canvassing for votes in the years before 1933, they weren’t marching proudly behind banners saying VOTE FOR HITLER SO FIFTY MILLION WILL DIE!  When those same National Socialists trotted out their antisemitic rhetoric, for that matter, they weren’t saying anything the average German found offensive or even unusual; to borrow a highly useful German word, antisemitism in those days was salonfähig, “the kind of thing you can bring into the living room.” (To be fair, it was just as socially acceptable in England, the United States, and the rest of the western world at that same time.)
For that matter, when people talked about fascism in the 1920s and 1930s, unless they were doctrinaire Marxists,  they didn’t use it as a snarl word.  It was the official title of Italy’s ruling party, and a great many people—including people of good will—were impressed by some of the programs enacted by Mussolini’s regime, and hoped to see similar policies put in place in their own countries. Fascism was salonfähig in most industrial countries.  It didn’t lose that status until the Second World War and the Cold War reshaped the political landscape of the western world—and when that happened, the complex reality of early 20th century authoritarian politics vanished behind a vast and distorted shadow that could be, and was, cast subsequently onto anything you care to name.
The downsides to this distortion aren’t limited to a failure of historical understanding.  If a full-blown fascist movement of what was once the standard type were to appear in America today, it’s a safe bet that nobody except a few historians would recognize it for what it is. What’s more, it’s just as safe a bet that many of those people who think they oppose fascism—even, or especially, those who think they’ve achieved something by spraypainting “(expletive) FACISM” on a concrete wall—would be among the first to cheer on such a movement and fall in line behind its banners. How and why that could happen will be the subject of the next two posts.

The Steampunk Future

Wed, 2014-02-05 19:22
For those of us who’ve been watching the course of industrial civilization’s decline and fall, the last few weeks have been a bit of a wild ride.  To begin with, as noted in last week’s post, the specter of peak oil has once again risen from the tomb to which the mass media keeps trying to consign it, and stalks the shadows of contemporary life, scaring the bejesus out of everyone who wants to believe that infinite economic growth on a finite planet isn’t a self-defeating absurdity.
Then, of course, it started seeping out into the media that the big petroleum companies have lost a very large amount of money in recent quarters, and a significant part of those losses was due to their heavy investments in the fracking boom in the United States—you know, the fracking boom that was certain to bring us renewed prosperity and limitless cheap fuel into the foreseeable future?  That turned out to be a speculative bubble, as readers of this blog were warned a year ago. The overseas investors whose misspent funds kept the whole circus going are now bailing out, and the bubble has nowhere to go but down. How far down? That's a very good question that very few people want to answer.
The fracking bubble is not, however, the only thing that's falling. What the financial press likes to call “emerging markets”—I suspect that “submerging markets” might be a better label at the moment—have had a very bad time of late, with stock markets all over the Third World racking up impressive losses, and some nasty downside action has spilled over onto Wall Street, Tokyo and the big European exchanges as well. Meanwhile, the financial world has been roiled by the apparent suicides of four important bankers. If any of them left notes behind, nobody's saying what those notes might contain; speculation, in several senses of that word, abounds.
Thus it's probably worth being aware of the possibility that in the weeks and months ahead, we'll see another crash like the one that hit in 2008-2009: another milestone passed on the road down from the summits of industrial civilization to the deindustrial dark ages of the future. No doubt, if we get such a crash, it'll be accompanied by a flurry of predictions that the whole global economy will come to a sudden stop. There were plenty of predictions along those lines during the 2008-2009 crash; they were wrong then, and they'll be wrong this time, too, but it'll be a few months before that becomes apparent.
In the meantime, while we wait to see whether the market crashes and another round of fast-crash predictions follows suit, I'd like to talk about something many of my readers may find whimsical, even irrelevant. It's neither, but that, too, may not become apparent for a while.
Toward the middle of last month, as regular readers will recall, I posted an essay here suggesting seven sustainable technologies that could be taken up, practiced, and passed down to the societies that will emerge out of the wreckage of ours. One of those was computer-free mathematics, using slide rules and the other tools people used to crunch numbers before they handed over that chunk of their mental capacity to machines. In the discussion that followed, one of my readers—a college professor in the green-technology end of things—commented with some amusement on the horrified response he’d likely get if he suggested to his students that they use a slide rule for their number-crunching activities.
Not at all, I replied; all he needed to do was stand in front of them, brandish the slide rule in front of their beady eyes, and say, “This, my friends, is a steampunk calculator.”
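For readers who've never handled one, the trick behind that steampunk calculator is nothing more exotic than logarithms: the log of a product is the sum of the logs of its factors, so sliding two logarithmic scales past one another turns multiplication into the simple addition of lengths. Here's a minimal sketch in Python of the same idea—the function name and the three-significant-figure rounding are my own illustrative choices, meant only to show why the trick works and roughly what precision a ten-inch rule delivers.

import math

def slide_rule_multiply(a, b, sig_figs=3):
    """Multiply the way a slide rule does: add the logarithms of the
    factors, then convert back, keeping only slide-rule precision."""
    log_sum = math.log10(a) + math.log10(b)   # adding lengths on the scales
    product = 10 ** log_sum
    # a ten-inch rule is good for roughly three significant figures
    exponent = int(math.floor(math.log10(abs(product))))
    return round(product, sig_figs - exponent - 1)

print(slide_rule_multiply(3.7, 42))   # about 155, close enough for shop work

Division runs the same trick in reverse, subtracting one length instead of adding it.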
It occurs to me that those of my readers who don’t track the contemporary avant-garde may have no idea what that next to last word means;  like so many labels these days, it contains too much history to have a transparent meaning. Doubtless, though, all my readers have at least heard of punk rock.  During the 1980s, a mostly forgettable literary movement in science fiction got labeled “cyberpunk;” the first half of the moniker referenced the way it fetishized the behavioral tics of 1980s hacker culture, and the second was given it because it made a great show, as punk rockers did, of being brash and belligerent.  The phrase caught on, and during the next decade or so, every subset of science fiction that hadn’t been around since Heinleins roamed the earth got labeled fill-in-the-blankpunk by somebody or other.
Steampunk got its moniker during those years, and that’s where the “-punk” came from. The “steam” is another matter. There was an alternative-history novel, The Difference Engine by William Gibson and Bruce Sterling, set in a world in which Victorian computer pioneer Charles Babbage launched the cybernetic revolution a century in advance with steam-powered mechanical computers.  There was also a roleplaying game called Space 1889—take a second look at those numbers if you think that has anything to do with the 1970s TV show about Moonbase Alpha—that had Thomas Edison devising a means of spaceflight, and putting the Victorian earth in contact with alternate versions of Mars, Venus and the Moon straight out of Edgar Rice Burroughs-era space fantasy.
Those and a few other sources of inspiration like them got artists, craftspeople, writers, and the like  thinking about what an advanced technology might look like if the revolutions triggered by petroleum and electronics had never happened, and Victorian steam-powered technology had evolved along its own course.  The result is steampunk:  part esthetic pose, part artistic and literary movement, part subculture, part excuse for roleplaying and assorted dress-up games, and part—though I’m far from sure how widespread this latter dimension is, or how conscious—a collection of sweeping questions about some of the most basic presuppositions undergirding modern technology and the modern world.
It’s very nearly an article of faith in contemporary industrial society that any advanced technology—at least until it gets so advanced that it zooms off into pure fantasy—must by definition look much like ours. I’m thinking here of such otherwise impressive works of alternate history as Kim Stanley Robinson’s The Years of Rice and Salt. Novels of this kind portray the scientific and industrial revolution happening somewhere other than western Europe, but inevitably it’s the same scientific and industrial revolution, producing much the same technologies and many of the same social and cultural changes. This reflects the same myopia of the imagination that insists on seeing societies that don’t use industrial technologies as “stuck in the Middle Ages” or “still in the Stone Age,” or what have you:  the insistence that all human history is a straight line of progress that leads unstoppably to us.
Steampunk challenges that on at least two fronts. First, by asking what technology would look like if the petroleum and electronics revolutions had never happened, it undercuts the common triumphalist notion that of course an advanced technology must look like ours, function like ours, and—ahem—support the same poorly concealed economic, political, and cultural agendas hardwired into the technology we currently happen to have. Despite such thoughtful works as John Ellis’ The Social History of the Machine Gun, the role of such agendas in defining what counts for progress remains a taboo subject, and the idea that shifts in historical happenstance might have given rise to wholly different “advanced technologies” rarely finds its way even into the wilder ends of speculative fiction.
If I may be permitted a personal reflection here, this is something I watched during the four years when my novel Star’s Reach was appearing as a monthly blog post. 25th-century Meriga—yes, that’s “America” after four centuries—doesn’t fit anywhere on that imaginary line of progress running from the caves to the stars; it’s got its own cultural forms, its own bricolage of old and new technologies, and its own way of understanding history in which, with some deliberate irony, I assigned today’s industrial civilization most of the same straw-man roles that we assign to the societies of the preindustrial past.
As I wrote the monthly episodes of Star’s Reach, though, I fielded any number of suggestions about what I should do with the story and the setting, and a good many of those amounted to requests that I decrease the distance separating 25th-century Meriga from the modern world, or from some corner of the known past.  Some insisted that some bit of modern technology had to find a place in Merigan society, some urged me to find room somewhere in the 25th-century world for enclaves where a modern industrial society had survived, some objected to a plot twist that required the disproof of a core element of today’s scientific worldview—well, the list is long, and I think my readers will already have gotten the point.
C.S. Lewis was once asked by a reporter whether he thought he’d influenced the writings of his friend J.R.R. Tolkien. If I recall correctly, he said, “Influence Tolkien? You might as well try to influence a bandersnatch.” While I wouldn’t dream of claiming to be Tolkien’s equal as a writer, I share with him—and with bandersnatches, for that matter—a certain resistance to external pressures, and so Meriga succeeded to some extent in keeping its distance from more familiar futures. The manuscript’s now at the publisher, and I hope to have a release date to announce before too long; what kind of reception the book will get when it’s published is another question and, at least to me, an interesting one.
Outside of the realms of imaginative fiction, though, it’s rare to see any mention of the possibility that the technology we ended up with might not be the inevitable outcome of a scientific revolution. The boldest step in that direction I’ve seen so far comes from a school of historians who pointed out that the scientific revolution depended, in a very real sense, on the weather in the English Channel during a few weeks in 1688.  It so happened that the winds in those weeks kept the English fleet stuck in port while William of Orange carried out the last successful invasion (so far) of England by a foreign army. 
As a direct result, the reign of James II gave way to that of William III, and Britain dodged the absolute monarchy, religious intolerance, and technological stasis that Louis XIV was imposing in France just then, a model which most of the rest of Europe promptly copied. Because Britain took a different path—a path defined by limited monarchy, broad religious and intellectual tolerance, and the emergence of a new class of proto-industrial magnates whose wealth was not promptly siphoned off into the existing order, but accumulated the masses of capital needed to build the world’s first industrial economy—the scientific revolution of the late 17th and early 18th century was not simply a flash in the pan. Had James II remained on the throne, it’s argued, none of those things would have happened.
It shows just how thoroughly the mythology of progress has its claws buried in our imaginations that many people respond to that suggestion in an utterly predictable way—by insisting that the scientific and industrial revolutions would surely have taken place somewhere else, and given rise to some close equivalent of today’s technology anyway. (As previously noted, that’s the underlying assumption of the Kim Stanley Robinson novel cited above, and many other works along the same lines.)  At most, those who get past this notion of industrial society’s Manifest Destiny imagine a world in which the industrial revolution never happened:  where, say, European technology peaked around 1700 with waterwheels, windmills, square-rigged ships, and muskets, and Europe went from there to follow the same sort of historical trajectory as the Roman Empire or T’ang-dynasty China.
Further extrapolations along those lines can be left to the writers of alternative history. The point being made by the writers, craftspeople, and fans of steampunk, though, cuts in a different direction. What the partly imaginary neo-Victorian tech of steampunk suggests is that another kind of advanced technology is possible: one that depends on steam and mechanics instead of petroleum and electronics, that accomplishes some of the same things our technology does by different means, and that also does different things—things that our technologies don’t do, and in some cases quite possibly can’t do.
It’s here that steampunk levels its second and arguably more serious challenge against the ideology that sees modern industrial society as the zenith, so far, of the march of progress. While it drew its original inspiration from science fiction and roleplaying games, what shaped steampunk as an esthetic and cultural movement was a sense of the difference between the elegant craftsmanship of the Victorian era and the shoddy plastic junk that fills today’s supposedly more advanced culture. It’s a sense that was already clear to social critics such as Theodore Roszak many decades ago. Here’s Roszak’s cold vision of the future awaiting industrial society, from his must-read book Where the Wasteland Ends:
“Glowing advertisements of undiminished progress will continue to rain down upon us from official quarters; there will always be well-researched predictions of light at the end of every tunnel. There will be dazzling forecasts of limitless affluence; there will even be much real affluence. But nothing will ever quite work the way the salesmen promised; the abundance will be mired in organizational confusion and bureaucratic malaise, constant environmental emergency, off-schedule policy, a chaos of crossed circuits, clogged pipelines, breakdowns in communication, overburdened social services. The data banks will become a jungle of misinformation, the computers will suffer from chronic electropsychosis. The scene will be indefinably sad and shoddy despite the veneer of orthodox optimism. It will be rather like a world’s fair in its final days, when things start to sag and disintegrate behind the futuristic façades, when the rubble begins to accumulate in the corners, the chromium to grow tarnished, the neon lights to burn out, all the switches and buttons to stop working. Everything will take on that vile tackiness which only plastic can assume, the look of things decaying that were never supposed to grow old, or stop gleaming, never to cease being gay and sleek and perfect.”
As prophecies go, you must admit, this one was square on the mark.  Roszak’s nightmare vision has duly become the advanced, progressive, cutting-edge modern society in which we live today.  That’s what the steampunk movement is rejecting in its own way, by pointing out the difference between the handcrafted gorgeousness of an older generation of technology and the “vile tackiness which only plastic can assume” that dominates contemporary products and, indeed, contemporary life. It’s an increasingly widespread recognition, and helps explain why so many people these days are into some form of reenactment.
Whether it’s the new Middle Ages of the Society for Creative Anachronism, the frontier culture of buckskinners and the rendezvous scene, the military-reenactment groups recreating the technologies and ambience of any number of long-ago wars, the primitive-technology enthusiasts getting together to make flint arrowheads and compete at throwing spears with atlatls, or what have you:  has any other society seen so many people turn their backs on the latest modern conveniences to take pleasure in the technologies and habits of earlier times? Behind this interest in bygone technologies, I suggest, lies a concept that’s even more unmentionable in polite company than the one I discussed above: the recognition that most of the time, these days, progress no longer means improvement.
By and large, the latest new, advanced, cutting-edge products of modern industrial society are shoddier, flimsier, and more thickly frosted with bugs, problems, and unwanted side effects than whatever they replaced. It’s becoming painfully clear that we’re no longer progressing toward some shiny Jetsons future, if we ever were, nor are we progressing over a cliff into a bigger and brighter apocalypse than anyone ever had before. Instead, we’re progressing steadily along the downward curve of Roszak’s dystopia of slow failure, into a crumbling and dilapidated world of spiraling dysfunctions hurriedly patched over, of systems that don’t really work any more but are never quite allowed to fail, in which more and more people every year find themselves shut out of a narrowing circle of paper prosperity but in which no public figure ever has the courage to mention that fact.
Set beside that bleak prospect, it’s not surprising that the gritty but honest hands-on technologies and lifeways of earlier times have a significant appeal.  There’s also a distinct sense of security that comes from the discovery that one can actually get by, and even manage some degree of comfort, without having a gargantuan fossil-fueled technostructure on hand to meet one’s every need. What intrigues me about the steampunk movement, though, is that it’s gone beyond that kind of retro-tech to think about a different way in which technology could have developed—and in the process, it’s thrown open the door to a reevaluation of the technologies we’ve got, and thus to the political, economic, and cultural agendas which the technologies we’ve got embody, and thus inevitably further.
Well, that’s part of my interest, at any rate. Another part is based on the recognition that Victorian technology functioned quite effectively on a very small fraction of the energy that today’s industrial societies consume. Estimates vary, but even the most industrialized countries in the world in 1860 got by on something like ten per cent of the energy per capita that’s thrown around in industrial nations today.  The possibility therefore exists that something like a Victorian technology, or even something like the neo-Victorian extrapolations of the steampunk scene, might be viable in a future on the far side of peak oil, when the much more diffuse, intermittent, and limited energy available from renewable sources will be what we have left to work with for the rest of our species’ time on this planet.
For the time being, I want to let that suggestion percolate through the crawlspaces of my readers’ imaginations.  Those who want to pick up a steampunk calculator and start learning how to crunch numbers with it—hint:  it’s easy to learn, useful in practice, and slide rules come cheap these days—may just have a head start on the future, but that’s a theme for a later series of posts. Well before we get to that, it’s important to consider a far less pleasant kind of blast from the past, one that bids fair to play a significant role in the future immediately ahead.
That is to say, it’s time to talk about the role of fascism in the deindustrial future. We’ll begin that discussion next week.

A Bargain with the Archdruid

Wed, 2014-01-29 18:09
My anomalous position as a writer and speaker on the future of industrial society who holds down a day job as an archdruid has its share of drawbacks, no question, but it also has significant advantages.  One of the most important of those is that I don’t have to worry about maintaining a reputation as a serious public figure. That may not sound like an advantage, but believe me, it is one.
Most of the other leading figures in the peak oil scene have at least some claim to respectability, and that pins them down in subtle and not-so-subtle ways. Like it or not, they have to know that being right about peak oil means that they might just pick up the phone one of these days and field an invitation to testify before a Senate subcommittee or a worried panel of long-range planners from the Pentagon. The possibility of being yanked out of their current role as social critics and being called on to tell a failing industrial society how it can save itself has got to hover in front of them in the night now and then. Such reflections tend to inspire a craving for consensus, or at least for neatly labeled positions within the accepted parameters of the peak oil scene.
I can only assume that’s what lies behind the tempest in an oil barrel that’s rocked the peak oil end of the blogosphere in recent weeks, following the publication of an essay by Permaculture guru David Holmgren titled Crash on Demand. Holmgren’s piece was quite a sensible one, suggesting that we’re past the point that a smooth transition to green tech is possible and that some kind of Plan B is therefore needed. It included some passages, though, suggesting that the best way to deal with the future immediately ahead might be to trigger a global financial crash.  If just ten per cent of the world’s population stopped using fossil fuels, he noted, that might be enough to bring the whole system down all at once.
That proposal got a flurry of responses, but only a few—Dmitry Orlov’s, predictably, was one of those few—noted the chasm that yawns between Holmgren’s modest proposal and the world we actually inhabit.  It’s all very well to talk about ten per cent of the population withdrawing from the global economy, but the fact of the matter is that it’ll be a cold day in Beelzebub’s back yard before even ten per cent of self-proclaimed green activists actively embrace such a project, to the extent of making more than the most modest changes in their own lifestyles—and let’s not even talk about how likely it is that anybody at all outside the culturally isolated fringe scene that contains today’s green subcultures will even hear of Holmgren’s call to arms.
Mind you, David Holmgren is a very smart man, and I’m quite sure he’s well aware of all this. An essay by David MacLeod pointed out that the steps Holmgren’s proposed to bring down industrial society are what he’s been encouraging people to do all along.  It occurs to me that he may simply have decided to try another way to get people to do what we all know we need to do anyway: give up the hopelessly unsustainable lifestyles currently provided us by the contemporary industrial system, downsize our desires as well as our carbon footprints, and somehow learn to get by on the kind of energy and resource basis that most other human beings throughout history have considered normal. 
Still, a nod is as good as a wink to a blind horse; as far as I can tell, Holmgren’s essay hasn’t inspired any sudden rush on the part of permaculturists and peak oil activists to ditch their Priuses in the hopes of sticking it to the Man. Instead, the conversation veered off into debates about whether and how “we” (meaning, apparently, the writers and readers of peak oil blogs) could in fact crash the global economy.  There was a flurry of talk about how violence shouldn’t be considered, and that in turn triggered a surge of people babbling earnestly about why the use of violence against the system shouldn’t be ruled out.
It’s probably necessary to say a few words about that here. Effective violence of any kind is a skill, a difficult and demanding one, and effective political violence against an established government is among the most difficult and demanding kinds. I’m sorry if this offends anybody’s sense of entitlement, but it’s not simply a matter of throwing a tantrum so loud that Daddy has to listen to you, you know.  To force a government to do your bidding by means of violence, you have to be more competent at violence than the government is, and the notion that the middle-class intellectuals who do most of the talking in the peak oil scene can outdo the US government in the use of violence would be hilarious if the likely consequences of that delusion weren’t so ghastly. This is not a game for dabblers; people get thrown into prison for decades, dumped into unmarked graves, or vaporized by missiles launched from drones for trying to do what the people in these discussions were chattering about so blandly.
For that matter, I have to wonder how many of the people who were so free with their online talk about violence against the system stopped to remember that every word of those conversations is now in an NSA data file, along with the names and identifying details of everybody involved. The radicals I knew in my younger days had a catchphrase that’s apposite here: “The only people that go around publicly advocating political violence are idiots and agents provocateurs. Which one are you?”
Meanwhile, in that distant realm we call the real world, the hastily patched walls of peak oil denial are once again cracking under the strain of hard reality. The Royal Society—yes, that Royal Society—has just published a volume of its Philosophical Transactions devoted to peak oil; they take it seriously.  Word has also slipped into the media that in December, a select group of American and British military, business, and political figures held a conference on peak oil; they also take it seriously.
Meanwhile, air is leaking out of the fracking bubble as firms lose money, the foreign investors whose wallets have been the main target of the operation are backing away, and the cheerleading of the media is sounding more and more like the attempts to boost housing prices around the beginning of 2008. The latest data point? Longtime peak oil researcher Jean Laherrere, who (let us not forget) successfully predicted the 2005 peak in conventional oil production well in advance, used the same modeling techniques to predict future production from the Bakken Shale. His call? A production peak in the fall of this year, with steep declines after that. He’s the latest to join the chorus of warnings that the fracking bubble is merely one more overblown financial scam moving inexorably toward a massive bust.
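For readers curious what "the same modeling techniques" look like in practice, Laherrere's work is generally associated with Hubbert-style curve fitting, in which cumulative production is assumed to follow a logistic curve, so that annual production rises to a single peak and then declines. The sketch below is mine, not his: the recovery, timing, and steepness figures are placeholders chosen only to show the shape of the method, not to reproduce his Bakken forecast.

import math

def hubbert_annual_production(year, urr, peak_year, steepness):
    """Annual output from a Hubbert (logistic) depletion model.
    urr = ultimately recoverable resource; steepness sets how sharp the
    boom and bust are. All values used here are illustrative only."""
    x = math.exp(-steepness * (year - peak_year))
    # derivative of the logistic cumulative-production curve
    return urr * steepness * x / (1 + x) ** 2

# Purely hypothetical numbers, loosely shaped like a tight-oil boom and bust:
for yr in range(2008, 2021, 2):
    print(yr, round(hubbert_annual_production(yr, urr=3.0, peak_year=2014, steepness=0.6), 2))

The output climbs to its maximum in the peak year and falls away symmetrically afterwards, which is the basic claim behind any forecast of that kind.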
Of course we’ve been here before. Every few years, the mass media starts to talk about peak oil, proponents of business as usual look nervous, and those in the peak oil scene who are inexperienced enough not to remember the last few cycles of the same process start talking about the prospects of imminent victory. (Yes, I made that mistake a while back; I think we all have.) Then the walls of denial get patched up again, the mass media scurries back to some comforting fairy tale about ethanol, wind power, biodiesel, fracking or what have you; the proponents of business as usual go back to their normal blustering, and peak oil activists who got overenthusiastic about predictions of imminent triumph end up with egg on their faces. That’s standard for any social movement trying to bring about an unwelcome but necessary change in society. Each time around the cycle, more people get the message, and a movement smart enough to capitalize on the waves of media interest can grow until it starts having a significant influence on society as a whole.
That final step can arrive on various time scales; a successful movement for change can see its viewpoint filter gradually into the collective conversation, or there can be a sudden break, after which the movement can still be denounced but can no longer be ignored. Glance back through the last few centuries and it’s easy to find examples of either kind, not to mention every point between those two ends of the spectrum. I’m far from sure if there’s a way to tell how peak oil activism will play out, but my hunch is that it may be closer to the sudden-break end of the spectrum than otherwise. What lies behind that hunch isn’t anything so sturdy as a headline or a new study; rather, it’s something subtle—a shift in tone in the denunciations that The Archdruid Report fields each week.
I don’t know if other bloggers share this experience, but I’ve found that internet trolls are a remarkably subtle gauge of the mass imagination. There are some trolls who only show up when a post of mine is about to go viral, and others whose tirades reliably forecast the new themes of peak oil denial three or four months in advance. When some bit of high-tech vaporware is about to be ballyhooed as the miracle that’s going to save us all, or some apocalyptic fantasy is about to become the new reason why it’s okay to keep your middle class lifestyle since we’re all going to die soon anyway, I usually hear about it first from trolls who can’t wait to let me know just how wrong I am. It’s an interesting fringe benefit of a blogger’s job, and it’s alerted me more than once to trends worth watching.
It so happens that in recent weeks, some of the criticisms I’ve fielded have struck a distinctly new note. I still get the classic cornucopians who insist I’m babbling pessimistic nonsense and of course we’ll all be just fine, just as I still get the apocalypse fanboys who insist that I’m ignoring the fact that the doom du jour is sure to annihilate us all, but I’m now seeing a third position—that of course it’s a crisis and we can’t just go on the way we’ve been living, a lot of things will have to change, but if we do X and Y and Z, we can keep some of the benefits of industrial society going, and I’m being too pessimistic when I suggest that no, we can’t. Maybe everyone else in the peak oil scene has been getting these all along, but they’re new to my comments page, and they have a tone that sets them apart from the others.
To be precise, it sounds like bargaining.
I don’t imagine that anyone in the peak oil scene has missed the discussions of Elisabeth Kübler-Ross’ five stages of coming to terms with impending death—denial, anger, bargaining, depression, and acceptance—and their application to the not dissimilar experience of facing up to the death of the industrial age. Many of us can look back on our own transits through the five stages, and I’ve long since lost track of the times I’ve heard people at a peak oil event roll their eyes and mutter the name of one of the stages to whomever is sitting next to them. For the most part, though, it’s been a matter of individuals going through their own confrontations with the death of progress at their own pace.
Maybe this is still what’s happening, but I wonder. For a very long time, even among peak oil activists, the prevailing mood was still one of denial—we can solve this, whether the solution consists of solar panels, thorium reactors, revitalized communities, permacultured forest gardens, supposedly imminent great turnings of one sort or another, or what have you. After the 2008-2009 crash, that shifted to a mood of anger, and furious denunciations of “the 1%” and an assortment of more familiar supervillains became much more common on peak oil forums than they had been. The rise of apocalypse fandom has arguably been driven by this same stage of anger—suicidal fantasies very often get their force from unresolved rage turned inwards, after all, and it’s likely that the habit of projecting daydreams of mass extermination onto the future is rooted in the same murky emotional soil.
If that’s indeed what’s been happening, then bargaining is the next stage.  If so, this is good news, because unlike the two stages before it or the one that follows, the stage of bargaining can have practical benefits. If a dying person hits that stage and decides to give up habits that make her condition worse, for example, the result may be an improved quality of life during her final months; if the bargain includes making big donations to charity, the patient may not benefit much from it but the charity and the people it helps certainly will. People under the stress of impending death try to strike bargains that range all the way from the inspiring to the absurd, though, and whether something constructive comes out of it depends on whether the bargain involves choices that will actually do some good.
If this stage is like the ones the peak oil scene seems to have transited so far, we can expect to see a flurry of earnest blog posts and comments over the next few years seeking reassurance in a manner peculiar to the internet—that is, by proclaiming something as absolute fact, then looking around nervously to see if anyone else agrees. This time, instead of proclaiming that this or that or the other is sure to save us, or out to get us, or certain to kill us all, they’ll be insisting that this or that or the other will be an acceptable sacrifice to the gods of petroleum depletion and climate change, sufficient to persuade those otherwise implacable powers to leave us untouched. The writers will be looking for applause and approval, and if I think their offering might do some good, I’m willing to meet them halfway. In fact, I’ll even suggest things that I’m sure to applaud, so they don’t even have to guess.
First is conservation. That’s the missing piece in most proposals for dealing with peak oil. The chasm into which so many well-intentioned projects have tumbled over the last decade is the hard fact that nothing available to us can support the raw extravagance of energy and resource consumption we’re used to, once cheap abundant fossil fuels aren’t there any more, so—ahem—we have to use less.  Too much talk about using less in recent years, though, has been limited to urging energy and resource abstinence as a badge of moral purity, and—well, let’s just say that abstinence education did about as much good there as it does in any other context.
The things that played the largest role in hammering down US energy consumption in the 1970s energy crisis were unromantic but effective techniques such as insulation, weatherstripping, and the like, all of which allow a smaller amount of energy to do the work previously done by more.  Similar initiatives were tried out in business and industry, with good results; expanding public transit and passenger rail did the same thing in a different context, and so on.  All of these are essential parts of any serious response to the end of cheap energy.  If your proposed bargain makes conservation the core of your response to fossil fuel and resource depletion, in other words, you’ll face no criticism from me.
Second is decentralization.  One of the things that makes potential failures in today’s large-scale industrial infrastructures so threatening is that so many people are dependent on single systems. Too many recent green-energy projects have tried to head further down the same dangerous slope, making whole continents dependent on a handful of pipelines, power grids, or what have you. In an age of declining energy and resource availability, coupled with a rising tide of crises, the way to ensure resilience and stability is to decentralize instead: to make each locality able to meet as many of its own needs as possible, so that troubles in one area don’t automatically propagate to others, and an area that suffers a systems failure can receive help from nearby places where everything still works.
Here again, this involves proven techniques, and extends across a very broad range of human needs. Policies that encourage local victory gardens, truck farms, and other food production became standard practice in the great wars of the 20th century precisely because they took some of the strain off  overburdened economies and food-distribution systems. Home production of goods and services for home use has long played a similar role. For that matter, transferring electrical power and other utilities and the less urgent functions of government to regional and local bodies instead of doing them on the national level will have parallel benefits in an age of retrenchment and crisis. Put decentralization into your bargain, and I’ll applaud enthusiastically.
Third is rehumanization. That’s an unfamiliar word for a concept that will soon be central to meaningful economic policy throughout the developed world. Industrial societies are currently beset with two massive problems:  high energy costs, on the one hand, and high unemployment on the other. Both problems can be solved at a single stroke by replacing energy-hungry machines with human workers. Rehumanizing the economy—hiring people to do jobs rather than installing machines to do them—requires removing and reversing a galaxy of perverse incentives favoring automation at the expense of employment, and this will need to be done while maintaining wages and benefits at levels that won’t push additional costs onto government or the community.
The benefits here aren’t limited to mere energy cost savings. Every economic activity that can be done by human beings rather than machinery is freed from the constant risk of being whipsawed by energy prices, held hostage by resource nationalism, and battered in dozens of other ways by the consequences of energy and resource depletion. That applies to paid employment, but it also applies to the production of goods and services in the household economy, which has also been curtailed by perverse incentives, and needs to be revived and supported by sensible new policies. A rehumanized economy is a resilient economy for another reason, too:  the most effective way to maximize economic stability is to provide ample employment at adequate wages for the workforce, whose paychecks fund the purchases that keep the economy going. Make rehumanization an important part of your plan to save the world and I won’t be the only one cheering.
Those are my proposals, then: conservation, decentralization, rehumanization.  Those readers who are looking for applause for their efforts at collective bargaining with the forces driving industrial society toward its destiny now know how to get it here. I’d like to ask you to step out of the room for the next paragraph, though, as I have a few things to say to those who aren’t at the bargaining stage just now.
(Are they gone?  Good.  Now listen closely while I whisper:  none of the things I’ve just suggested will save industrial civilization. You know that, of course, and so do I.  That said, any steps in the direction of conservation, decentralization, and rehumanization that get taken will make the descent less disruptive and increase the chances that communities, localities, and whole regions may be able to escape the worst impacts of the industrial system’s unraveling. That’s worth doing, and if it takes their panicked efforts to bargain with an implacable fate to get those things under way, I’m good with that.  Got it? Okay, we can call them back into the room.)
Ahem. So there you have it; if you want to bargain with the archdruid, those are the terms I’ll accept. For whatever it’s worth, those are also the policies I’d propose to a Senate subcommittee or a worried panel of long-range planners from the Pentagon if I were asked to testify to some such body. Of course that’s not going to happen; archdruids can draw up proposals on the basis of what might actually work, instead of worrying about the current consensus in or out of the peak oil scene, because nobody considers archdruids to be serious public figures. That may not sound like an advantage, but believe me, it is one.

Return of the Space Bats

Wed, 2014-01-22 21:08
"Some seventeen notable empires rose in the Middle Period of Earth. These were the Afternoon Cultures. All but one are unimportant to this narrative, and there is little need to speak of them save to say that none of them lasted for less than a millennium, none for more than ten; that each extracted such secrets and obtained such comforts as its nature (and the nature of the Universe) enabled it to find; and that each fell back from the Universe in confusion, dwindled, and died.
  “The last of them left its name written in the stars, but no-one who came later could read it. More important, perhaps, it built enduringly despite its failing strength—leaving certain technologies that, for good or ill, retained their properties of operation for well over a thousand years. And more important still, it was the last of the Afternoon Cultures, and was followed by evening, and by Viriconium.”

Those are the opening lines of The Pastel City by M. John Harrison, one of the fixtures of my arguably misspent youth. I pulled it down from my shelf of old paperbacks the other day, for the first time in years, and spent part of an evening rereading it. There were several reasons for that. Partly it was a fit of nostalgia; partly, I’ve finished all but the last minor edits on Star’s Reach, my post-peak oil SF novel, preparatory to placing it with a publisher, and was thinking back on some of the other fictional explorations of the coming deindustrial future I’ve read; partly—well, there’s a practical reason, but we’ll get to that a little later on in this post.
There were any number of books like it back in the day, mass-market SF paperbacks with gaudy covers ringing variations on a handful of familiar themes. You could almost describe The Pastel City as an example of what used to be called the sword and sorcery genre, one of the standard modes of 1970s fantasy, except that there isn’t any sorcery. The weird powers and monstrous enemies against which tegeus-Cromis the swordsman and his motley allies do battle are surviving relics of advanced technology, and the deserts of rust and the windblown ruins through which they pursue their fairly standard heroic quest are the remnants of an industrial society a millennium dead. In a real sense, it belongs to a genre of its own, which I suppose ought to be called postindustrial fantasy.
I first read it in the bleak little apartment where my father lived, in a down-at-the-heels Seattle suburb. Not long before, just as soon as he finished paying her way through college, my mother dumped my father like a sack of old clothes, and followed through on the metaphor by taking him to the cleaners in the divorce settlement. He took what refuge he could find in model airplanes, jigsaw puzzles, and science fiction novels, and I used to catch a bus north from my mother’s house to spend weekends with him. His apartment was a short walk from the library, which was one blessing, and close to a hobby shop where I could fritter away my allowance on spacecraft models, which was another; still, many of the best hours I recall from those days were spent curled up on his secondhand couch, reading the volumes of science fiction he’d bring home from the little bookstore six blocks away.
The Pastel City was one of those. I recall vividly the first time I read it, because it’s the book that first suggested to me that an industrial society could crumble away to ruins without benefit of apocalyptic fireworks and be succeeded by simpler societies. What was more, the last Afternoon Culture was simply backstory, no more immediately relevant to tegeus-Cromis and his friends than the fall of Rome is to you and me, and the wastelands, the marshes gone brackish with metallic salts, the dead and half-drowned city called Thing Fifty, and most of the other relics of that vanished time were just part of the landscape. It was easy for me, as I sat there on the couch, to imagine the wreckage of today’s world forming the background for adventures a thousand years in our own future.
I was curled up on the same couch when I first read Davy by Edgar Pangborn. He’s mostly forgotten these days, but Pangborn’s was a name to conjure with in the more literate end of the science fiction scene of the 1960s and 1970s, and Davy was much of the reason why. It’s a coming-of-age story—shall we be literary and call it a Bildungsroman?—that goes on the same Platonic ideal of a bookshelf as Tom Jones and Huckleberry Finn, except that it takes place about five centuries from now in what’s left of northeastern North America.
Pangborn’s future history is as precise as it is uncomfortably plausible. There was a nuclear war, which killed a lot of people, and an epidemic among the survivors, which killed a lot more. Rising sea levels driven by global warming—yes, Pangborn was already onto that in 1964—flooded the lowlands. Stick in time’s oven and bake for centuries, and you’ve got a world of neofeudal statelets with more or less medieval subsistence economies clinging to the threadbare rhetoric of an earlier day.  Here’s Davy’s description of his world: “Katskil is a kingdom. Nuin is a commonwealth, with a hereditary presidency of absolute powers. Levannon is a kingdom, but governed by a Board of Trade. Lomeda and the other Low Countries are ecclesiastical states, the boss panjandrum being called a Prince Cardinal. Rhode, Vairmant, and Penn are republics; Conicut’s a kingdom; Bershar is mostly a mess. But they’re all great democracies, and I hope this will grow clearer to you one day when the ocean is less wet.”
Those ecclesiastical states aren’t Christian, by the way, but they might as well be. Pangborn was gay, and thus got an extra helping of the hypocrisy and intolerance that characterize American Christianity in its worst moods; he was accordingly an atheist; and his Holy Murcan Church was partly a parody of the mainstream American churches of his time, partly the standard atheist caricature of the medieval Catholic church, and partly a counterblast against Walter M. Miller’s A Canticle for Leibowitz, with its portrayal of Catholicism as a force for good in a future dark age America. Those of my readers who are atheists will enjoy that side of Davy; those who aren’t may find Pangborn’s bouts of religion-bashing annoying, or if they’ve heard the same talking points often enough elsewhere, simply dull. Fortunately there’s plenty more in Davy that makes it worth the read anyway.
Brilliant though it was, Davy still followed the conventions of the then-thriving postapocalyptic genre and didn’t quite manage, as The Pastel City did, to think its way right out from under the myth of progress and stop seeing history as a straight line that always leads back to us. Pangborn thus needed, or thought he needed, the diabolus ex machina of a nuclear war to bring Old Time crashing down. I forgave him that because his nuclear war wasn’t the usual canned Hollywood death-fantasy, just a bunch of cities being blasted to rubble, a lot of corpses, and high rates of sterility and birth defects among the descendants of the survivors, followed by normal decline and recovery.
Pangborn’s own summary is typically succinct: “barbarism, not actually ‘like’ Fifth Century Europe because history can’t repeat itself that way, but just as dark. Here and there enclaves where some of the valuable bits of the old culture survived. In some places, primitive savagery in its varied forms; and monarchies, petty states, baronies, whatever. Then through many centuries, a gradual recovery toward some other peak of some other kind of civilization. Without the resources squandered by the 20th Century.” That’s from Still I Persist In Wondering, an anthology of short stories set in the same future as Davy, as  were two other novels of his, The Judgment of Eve and The Company of Glory. Yes, I read all of them, many times; those of my readers who followed  Star’s Reach and pick up Pangborn’s tales will doubtless catch the deliberate homages I put in my story, and probably some influences I didn’t intend that sneaked in anyway.
The third book I want to mention here didn’t come to my attention until long after the secondhand couch went out of my life, but it came at another bleak time. That was the autumn of 1982, toward the end of my first unsuccessful pass through college, and right about the time it was becoming painfully clear that the great leap toward a sustainable future through appropriate technology, in which I planned on making my career, wasn’t going to happen after all. Those were the years when the Reagan administration’s gutting of grant money for every kind of green initiative was really starting to hit home, and attempts to mobilize any kind of support for those initiatives were slamming face first into the simple fact that most Americans wanted to cling to their cozy lifestyles even if that meant flushing their grandchildren’s future down the drain.
Somewhere in the middle of that autumn, under a hard gray sky, I went for a long and random walk in the grimy end of downtown Bellingham, Washington, down past old canneries and retail buildings that used to serve the working waterfront when there still was one.  Somewhere in there was a shop selling girlie mags—this was before the internet, when those who liked pictures of people with their clothes off had to go buy them from a store—and out in front, for no reason I was ever able to figure out, was a tray of non-erotic used paperbacks for 50 cents each. I had a dollar to spare; I stopped to look through them, and that’s how I found The Masters of Solitude by Marvin Kaye and Parke Godwin.
If The Pastel City is sword and sorcery without the sorcery, The Masters of Solitude might best be described as a post-apocalyptic novel without the apocalypse. It’s a three-handed poker game of a story set maybe two thousand years  from now in the eastern United States. From Karli in the south to Wengen in the north runs Coven country, a tribal realm following a faith and a culture descended from today’s Wicca. Off in what’s now western Pennsylvania are the Kriss, who worship a dead god. To the east, the urban enclave that extends from Boston down to Washington DC is simply the City, its people maintaining high technology with solar power, living for centuries by way of advanced organ-transplant techniques, sealing out the rest with an electronic barrier that shreds minds.
Long ago things were different; everyone in the world of The Masters of Solitude remembers that, however dimly.  Then the land was invaded and conquered by another people, the Jings, who left their name and some of their genetics and then faded from history.  Out of the ordinary chaos of a fallen civilization, the ordinary process of reorganization and cultural coalescence birthed new societies drawing in various ways on the legacies of the past. It’s history the way it actually happens, the normal rise and fall of nations and cultures, and it’s in the course of their ordinary history that the Covens, the Kriss, and the City stumble toward a confrontation that will shatter them all.
There were other books that could go into a list of postindustrial fantasy classics, of course, and I may talk about some of them another day. Still, the question I suspect a fair number of my readers are asking is why any of this matters. Modern industrial civilization is beginning to pick up speed along a trajectory of decline and fall that differs from the ones we’ve just discussed in that it’s not safely confined to the realm of imaginative fiction. Is there any point in reading about imaginary societies that fell back from the Universe, dwindled, and died, when ours is doing that right now?
As it happens, I think there is.
Most of what’s kept people in today’s industrial world from coming to grips with the shape and scale of our predicament is precisely the inability to imagine a future that’s actually different from the present. Francis Fukuyama’s proclamation of the end of history may have been a masterpiece of unintentional comedy—I certainly read it in that light—but it spoke for an attitude that has deep roots all through contemporary culture. Nor is that attitude limited to the cornucopians who can’t imagine any future that isn’t a linear continuation of the present; what is it that gives the contemporary cult of apocalypse fandom its popularity, after all, but a conviction that the only alternative to a future just like the present  is quite precisely no future at all?
It would be pleasant if human beings were so constituted that this odd myopia of the imagination could be overcome by the simple expedient of pointing out all the reasons why it makes no sense, or by noting how consistently predictions made on that basis turn out to be abject flops. Unfortunately, that doesn’t happen to be the case. My regular readers will long since have noticed how easily believers in a business-as-usual future brush aside such issues as though nobody ever mentioned them at all, and keep on insisting that of course we can keep an industrial system running indefinitely because, well, because we can, just you watch!  The only thing I can think of that compares with this is the acrobatic ingenuity with which believers in imminent apocalypse keep on coming up with new reasons why this week’s prediction of mass death must be true when all previous examples have turned out dead wrong.
What underlies both of these curious phenomena, and a great many other oddities of contemporary culture, is simply that the basic building blocks of human thinking aren’t facts or logical relationships, but stories. The narratives we know are the patterns by which we make sense of the world; when the facts or the testimony of logic don’t fit one narrative, and we have a selection of other narratives to hand, we can compare one story to another and find the one that’s the best fit to experience. That process of comparison is at the heart of logic and science, and provides a necessary check on the normal tendency of the human mind to get stuck on a single story even when it stops making sense.
As I pointed out here in the earliest days of this blog, though, that check doesn’t work if you only have one story handy—if, for example, the story of onward and upward progress forever is the only story about the future you know. Then it doesn’t matter how badly the story explains the facts on the ground, or how many gross violations of logic are needed to explain away the mismatches: given a choice between a failed narrative and no narrative at all, most people will cling to the one they have no matter how badly it fits. That’s the game in which both the cornucopians and the apocalypse fans are engaged; the only difference between them, really, is that believers in apocalypse have decided that the way to make the story of progress make sense is to insist that we’re about to reach the part of it that says “The End.”
The one way out of that trap is to learn more stories—not simply rehashes of the same plot with different names slapped on the characters, mind you, but completely different narrative structures that, applied to the same facts and logical relationships, yield different predictions. That’s what I got from the three novels I’ve discussed in this post. All three were fictions, to be sure, but all three were about that nebulous place we call the future, and all three gave me narratives I could compare with the narrative of progress to see which made the better fit to the facts.  I’ve met enough other people who’ve had similar experiences that I’ve come to think of fiction about the future as a powerful tool for getting outside the trap of knowing just one story, and thus coming to grips with the failure of that story and the need to understand the future ahead of us in very different ways.
All of which brings me to the practical dimension of this week’s post.
A few weeks ago, I fielded an email from the proprietor of the small publishing house that released After Oil: SF Visions of a Post-Petroleum Future, the anthology of stories that came out of the peak oil fiction contest I held here in 2011 and 2012.  He was pleased to report that sales have been modest but steady—contributors should expect a statement and royalty check shortly—and asked about whether I was perhaps interested in putting together a second anthology along the same lines. (He also expressed a definite interest in hearing from writers who have novels on peak oil-related themes and are looking for a place to publish them; those of my readers who fall into this category—I know you’re out there—will want to check out the submissions requirements page on the Founders House website.)
I’m certainly game for a second story contest, and for editing a second anthology; given the torrent of creativity that the last contest called forth, I don’t expect to have any trouble fielding an abundance of good stories from this blog’s readers, either, so the contest is on. The requirements are the same as before:
  • Stories should be between 2500 and 7500 words in length;
  • They should be entirely the work of their author or authors, and should not borrow characters or setting from someone else’s work;
  • They should be in English, with correct spelling, grammar and punctuation;
  • They should be stories—narratives with a plot and characters—and not simply a guided tour of some corner of the future as the author imagines it;
  • They should be set in our future, not in an alternate history or on some other planet;
  • They should be works of realistic fiction or science fiction, not magical or supernatural fantasy—that is, the setting and story should follow the laws of nature as those are presently understood;
  • They should deal directly with the impact of peak oil, and the limits to growth in general, on the future; and as before,
  • They must not rely on “alien space bats”—that is, dei ex machina inserted to allow humanity to dodge the consequences of the limits to growth. (Aspiring authors might want to read the whole “Alien Space Bats” post for a more detailed explanation of what I mean here.)
That is to say, the stories that will find a place in the second anthology, like those that populated the first, will feature human beings like you and me, coping with the aftermath of the industrial age in a world that could reasonably be our future, and living lives that are challenging, interesting, and maybe even appealing in that setting. I’d like to make an additional suggestion this time around: don’t settle for your  ordinary, common or garden variety postpetroleum future. Make it plausible, make it logical, but make it different.
The mechanics are also the same as before. Write your story and post it to the internet—if you don’t have a blog, you can get one for free from Blogspot or Wordpress. Post a link to it in the comments section of this blog. Yes, you can write more than one story, and yes, that will increase your chances of making the cut. The contest ends at the beginning of May, so get typing; for reasons I’ll discuss in next week’s post, we may just be entering into a window of opportunity in which new visions of the future could have a significant impact, and it might as well be your vision, dear reader, that helps make that happen.

Seven Sustainable Technologies

Wed, 2014-01-15 20:14
Last week’s post on the contemporary culture of apocalypse fandom was also, more broadly, about the increasingly frantic attempts being made to ignore the future that’s looming ahead of us. Believing that the world as we know it is about to crash into ruin, popular as it is, is only one of several strategies put to work in those attempts. There’s also the claim that we can keep industrial civilization going on renewable energy sources, the claim that a finite planet can somehow contain an infinite supply of cheap fossil fuel—well, those of my readers who know their way around today’s nonconversation about energy and the future will be all too familiar with the thirty-one flavors of denial.
  It’s ironic, though predictable, that these claims have been repeated ever more loudly as the evidence for a less comfortable view of things has mounted up. Most recently, for example, a thorough study of the Spanish solar energy program by Pedro Prieto and Charles A.S. Hall has worked out the net energy of large-scale solar photovoltaic systems on the basis of real-world data. It’s not pleasant reading if you happen to believe that today’s lifestyles can be supported on sunlight; they calculate that the energy return on energy invested (EROEI) of Spain’s solar energy sector works out to 2.48—about a third of the figure suggested by less comprehensive estimates.
The Prieto/Hall study has already come in for criticism, some of it reasonable, some of it less so. A crucial point, though, has been left out of most of the resulting discussions. According to best current estimates, the EROEI needed to sustain an industrial civilization of any kind is somewhere between 10 and 12; according to most other calculations—leaving out the optimistic estimates being circulated by solar promoters as sales pitches—the EROEI of large scale solar photovoltaic systems comes in between 8 and 9. Even if Prieto and Hall are dead wrong, in other words, the energy return from solar PV isn’t high enough to support the kind of industrial system needed to manufacture and maintain solar PV.  If they’re right, or if the actual figure falls between their estimate and those of the optimists, the point’s even harder to dodge.
Similar challenges face every other attempt to turn renewable energy into a replacement for fossil fuels. I’m thinking especially of the study published a few years back that showed, on solid thermodynamic grounds, that the total energy that can be taken from the planet’s winds is a small fraction of what windpower advocates think they can get. The logic here is irrefutable:  there’s a finite amount of energy in wind, and what you extract in one place won’t turn the blades of another wind turbine somewhere else. Thus there’s a hard upper limit to how much energy windpower can put into the grid—and it’s not enough to provide more than a small fraction of the power needed by an industrial civilization; furthermore, estimates of the EROEI of windpower cluster around 9, which again is too little to support a society that can build and maintain wind turbines.
Point such details out to people in the contemporary green movement, and you can count on fielding an angry insistence that there’s got to be some way to run industrial civilization on renewables, since we can’t just keep on burning fossil fuels.  I’m not at all sure how many of the people who make this sort of statement realize just how odd it is. It’s as though they think some good fairy promised them that there would always be enough energy to support their current lifestyles, and the only challenge is figuring out where she hid it. Not so; the question at issue is not how we’re going to keep industrial civilization fueled, but whether we can do it at all, and the answer emerging from the data is not one that they want to hear: nothing—no resource or combination of resources available to humanity at this turning of history’s wheel—can support industrial civilization once we finish using up the half a billion years of fossil sunlight that made industrial civilization briefly possible in the first place.
Green activists are quite right, though, that we can’t just keep on burning fossil fuels.  We can’t just keep on burning fossil fuels because fossil fuels are a finite resource, we’ve already burnt through most of what’s economically viable to extract, and the EROEI of what’s left is dropping steadily as quality declines and costs rise. Back in the day when most petroleum on the market was light sweet crude from shallow onshore wells, its EROEI could be as high as 200; nowadays, a large and growing fraction of liquid fuels comes from deep offshore fields, fracked shales, tar sands, and other energy- and resource-intensive places, so the average for petroleum as a whole is down somewhere around 30 and sinking.
A common bad habit of contemporary thought assumes that gradual changes don’t mean anything until some threshold slips past, at which point things go boom in one way or another. Some processes in the real world happen that way, but it’s far more common for gradual shifts to have gradual impacts all along the trajectory of change. A good case can be made that EROEI decline is one such process.  For more than a decade now, the world’s economies have stumbled from one crisis to another, creaking and groaning through what would likely have been visible contraction if the mass production of paper wealth out of thin air hadn’t been cranked into overdrive to produce the illusion of normality.
Plenty of explanations have been proposed for the current era of economic unraveling, but I’d like to suggest that the most important factor is the overall decline in the “energy profit” that makes modern economies possible at all. EROEI is to a civilization what gross profit is to a business, the source of the surplus that supports the entire enterprise.  As the overall EROEI of industrial civilization contracts, habits that were affordable in an era of abundant profit stop being viable, and decline sets in. Long before that figure drops to the point that an industrial system can no longer be supported at all, most of us will have long since lost access to the products of that system, because every drop of liquid fuel and every scrap of most other industrial resources will long since have been commandeered for critical needs or reserved for the wealthiest and most powerful among us.
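For readers who like to see the arithmetic behind that analogy spelled out, here is a minimal back-of-the-envelope sketch in Python. The EROEI figures are simply the rough numbers quoted above, used as illustrations rather than measurements, and the “net energy per unit invested” framing (EROEI minus one) is just one common way of expressing the energy profit in question; the function and labels are my own.

# Back-of-the-envelope net-energy arithmetic. The EROEI values below are
# the rough figures cited in the text, used purely for illustration.

def net_energy_per_unit_invested(eroei):
    """Energy left over, per unit of energy spent getting it."""
    return eroei - 1.0

cases = {
    "light sweet crude, shallow onshore wells": 200.0,
    "petroleum on average today": 30.0,
    "solar PV, optimistic estimates": 8.5,
    "solar PV, Prieto/Hall estimate": 2.48,
}

for label, eroei in cases.items():
    surplus = net_energy_per_unit_invested(eroei)
    print(f"{label:42s} EROEI {eroei:6.2f}  net energy {surplus:6.2f}")

Run it and the gross-profit analogy is hard to miss: a unit of energy invested in early onshore oil returned well over a hundred times the surplus implied by the Prieto/Hall figure for solar PV, and a civilization’s discretionary “profit” shrinks accordingly as its average EROEI slides downward.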
The twilight of the industrial age, in other words, isn’t somewhere conveniently far off in the future; it’s happening now, in the slow, ragged, uneven, but inexorable manner that’s normal for great historical transformations. Trying to insist that this can’t be happening, that there has to be some way to keep up our extravagant lifestyles when the energetic and material basis of that extravagance is rapidly depleting away from beneath us, may be emotionally comforting but it doesn’t change, or even address, the hard facts of our predicament.  Like the fashionable apocalypticism discussed last week, it simply provides an excuse for inaction at a time when action is necessary but difficult. 
Set aside all those excuses, and the hard question that remains is what to do about it all.
Any answer to that question has to start by taking seriously the limits imposed by our situation, and by choices made in the decades already past. Proposing some grand project to get the entire world ready for the end of the age of abundance, for example, is wasted breath; even if the political will could be found—and it’s been missing in action since 1980 or so—the resources that might have made such a project possible were burned to fuel three decades of unsustainable extravagance. While new systems are being built, remember, the old ones have to stay functional long enough to keep people fed, housed, and supplied with other necessities of life, and we’ve passed the point at which the resources still exist to do both on any large scale. As the Hirsch report pointed out back in 2005, a meaningful response to the peaking of petroleum production had to begin at least twenty years in advance of the peak to avoid catastrophic disruptions; that didn’t happen in time, and there’s no point in pretending otherwise.
Any response to the twilight of the industrial age, in other words, will have to function within the constraints of a society already in the early stages of the Long Descent—a society in which energy and resources are increasingly hard for most people to obtain, in which the infrastructure that supports current lifestyles is becoming ever more brittle and prone to dysfunction, and in which most people will have to contend with the consequences of economic contraction, political turmoil, and social disintegration. As time passes, furthermore, all these pressures can be counted on to increase, and any improvement in conditions that takes place will be temporary.
All this places harsh constraints on any attempt to do anything constructive in response to the end of industrial civilization. Still, there are options available, and I want to talk about one of those here:  an option that could make the decline a little less bitter, the dark age that will follow it a little less dark, and the recovery afterwards a little easier. Compared to grand plans to save the world in a single leap, that may not sound like much—but it certainly beats sitting on one’s backside daydreaming about future societies powered by green vaporware, on the one hand, or imaginary cataclysms that will relieve us of our responsibility toward the future on the other.
It’s only in the imagination of true believers in the invincibility of progress that useful technologies can never be lost. History makes the point with painful clarity:  over and over again, technologies in common use during the peak years of a civilization have been lost during the dark age that followed, and had to be brought in again from some other society or reinvented from scratch once the dark age was over and rebuilding could begin. It’s a commonplace of history, though, that if useful technologies can be preserved during the declining years of a society, they can spread relatively rapidly through the successor states of the dark age period and become core elements of the new civilization that follows. A relatively small number of people can preserve a technology, furthermore, by the simple acts of learning it, practicing it, and passing it on to the next generation.
Not every technology is well suited for this sort of project, though. The more complex a technology is, the more dependent it is on exotic materials or concentrated energy sources, and the more infrastructure it requires, the less the chance that it can be preserved in the face of a society in crisis. Furthermore, if the technology doesn’t provide goods or services that will be useful to people during the era of decline or the dark age that follows, its chances of being preserved at all are not good at a time when resources are too scarce to divert into unproductive uses.
Those are tight constraints, but I’ve identified seven technological suites that can be sustained on a very limited resource base, produce goods or services of value even under dark age conditions, and could contribute mightily to the process of rebuilding if they get through the next five centuries or so.
1. Organic intensive gardening.  I’ve commented before that when future historians look back on the twentieth century, the achievement of ours that they’ll consider most important is the creation of food growing methods that build soil fertility rather than depleting it and are sustainable on a time scale of millennia. The best of the current systems of organic intensive gardening require no resource inputs other than locally available biomass, hand tools, and muscle power, and produce a great deal of food from a relatively small piece of ground. Among the technologies included in this suite, other than the basics of soil enhancement and intensive plant and animal raising, are composting, food storage and preservation, and solar-powered season extenders such as cold frames and greenhouses.
2. Solar thermal technologies.  Most of the attention given to solar energy these days focuses on turning sunlight into electricity, but electricity isn’t actually that useful in terms of meeting basic human needs. Far more useful is heat, and sunlight can be used for heat with vastly greater efficiencies than it can be turned into electrical current. Water heating, space heating, cooking, food preservation, and many other useful activities can all be done by concentrating the rays of the sun or collecting solar heat in an insulated space. Doing these things with sunlight rather than wood heat or some other fuel source will take significant stress off damaged ecosystems while meeting a great many human needs.
3. Sustainable wood heating.  In the Earth’s temperate zones, solar thermal technologies can’t stand alone, and a sustainable way to produce fuel is thus high up on the list of necessities. Coppicing, a process that allows repeated harvesting of fuel wood from the same tree, and other methods of producing flammable biomass without burdening local ecosystems belong to this technological suite; so do rocket stoves and other high-efficiency means of converting wood fuel into heat.
4. Sustainable health care. Health care as it’s practiced in the world’s industrial nations is hopelessly unsustainable, dependent as it is on concentrated energy and resource inputs and planetwide supply chains.  As industrial society disintegrates, current methods of health care will have to be replaced by methods that require much less energy and other resources, and can be put to use by family members and local practitioners. Plenty of work will have to go into identifying practices that belong in this suite, since the entire field is a minefield of conflicting claims issuing from the mainstream medical industry as well as alternative health care; the sooner the winnowing gets under way, the better.
5. Letterpress printing and its related technologies.  One crucial need in an age of decline is the ability to reproduce documents from before things fell apart. Because the monasteries of early medieval Europe had no method of copying faster than monks with pens, much of what survived the fall of Rome was lost during the following centuries as manuscripts rotted faster than they could be copied. In Asia, by contrast, hand-carved woodblock printing allowed documents to be mass produced during the same era; this helps explain why learning, science, and technology recovered more rapidly in post-Tang dynasty China and post-Heian Japan than in the post-Roman West.  Printing presses with movable type were made and used in the Middle Ages, and inkmaking, papermaking, and bookbinding are equally simple, so these are well within the range of craftspeople in the deindustrial dark ages ahead.
6. Low-tech shortwave radio.  The ability to communicate over long distances at a speed faster than a horse can ride is another of the significant achievements of the last two centuries, and deserves to be passed on to the future. While the scientific advances needed to work out the theory of radio required nearly three hundred years of intensive study of physics, the technology itself is simple—an ordinarily enterprising medieval European or Chinese alchemist could easily have put together a working radio transmitter and receiver, along with the metal-acid batteries needed to power them, if he had known how.  The technical knowledge in the amateur radio community, which has begun to get interested in low-tech, low-power methods again after a long flirtation with high-end technologies, could become a springboard to handbuilt radio technologies that could keep going after the end of industrial society.
7. Computer-free mathematics.  Until recently, it didn’t take a computer to crunch the numbers needed to build a bridge, navigate a ship, balance profits against losses, or do any of ten thousand other basic or not-so-basic mathematical operations; slide rules, nomographs, tables of logarithms, or the art of double-entry bookkeeping did the job.  In the future, after computers stop being economically viable to maintain and replace, those same tasks will still need to be done, but the knowledge of how to do them without a computer is at high risk of being lost. If that knowledge can be gotten back into circulation and kept viable as the computer age winds down, a great many tasks that will need to be done in the deindustrial future will be much less problematic.
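As a small illustration of what computer-free mathematics looks like in practice (a sketch of my own, not a procedure drawn from any particular source), here is the trick that tables of logarithms and slide rules both exploit: since log(ab) = log(a) + log(b), a multiplication turns into an addition plus a couple of table lookups. The numbers are arbitrary, and the Python below merely stands in for the printed table a future engineer or navigator would actually use.

import math

# Multiplying with a table of common (base-10) logarithms: look up the logs,
# add them, and take the antilog of the sum. A slide rule performs the same
# addition mechanically, with two logarithmic scales slid past each other.

a, b = 374.0, 58.2

log_a = math.log10(a)              # historically, read from a printed table
log_b = math.log10(b)
product = 10 ** (log_a + log_b)    # antilog of the sum

print(f"log10({a}) = {log_a:.4f}")
print(f"log10({b}) = {log_b:.4f}")
print(f"antilog of the sum = {product:.1f}   (direct multiplication: {a * b:.1f})")

Worked with a four-place printed table, the same procedure gives three or four significant figures, which was enough for most of the bridge-building and navigation of the pre-computer era.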
(It’s probably necessary to repeat here that the reasons our descendants a few generations from now won’t be surfing the internet or using computers at all are economic, not technical. If you want to build and maintain computers, you need an industrial infrastructure that can manufacture integrated circuits and other electronic components, and that requires an extraordinarily complex suite of technologies, sprawling supply chains, and a vast amount of energy—all of which has to be paid for. It’s unlikely that any society in the deindustrial dark ages will have that kind of wealth available; if any does, many other uses for that wealth will make more sense in a deindustrialized world; and in an age when human labor is again much cheaper than mechanical energy, it will be more affordable to hire people to do the routine secretarial, filing, and bookkeeping tasks currently done by computers than to find the resources to support the baroque industrial infrastructure needed to provide computers for those tasks.
(The reason it’s necessary to repeat this here is that whenever I point out that computers won’t be economically viable in a deindustrial world, I field a flurry of outraged comments pretending that I haven’t mentioned economic issues at all, and insisting that computers are so cool that the future can’t possibly do without them. Here again, it’s as though they think a good fairy promised them something—and they aren’t paying attention to all the legends about the way that fairy gifts turn into a handful of dry leaves the next morning. We now return you to your regularly scheduled Archdruid Report.)
Organic gardens, solar and wood heat, effective low-tech health care, printed books, shortwave radios and a facility with slide rules and logarithms:  those aren’t a recipe for the kind of civilization we have today, nor are they a recipe for a kind of civilization that’s existed in the past. It’s precisely the inability to imagine anything else that’s crippled our collective ability to think about the future. One of the lessons of history, as Arnold Toynbee pointed out, is that the decline and fall of every civilization follows the same track down but the journey back up to a new civilization almost always breaks new ground. It would be equally accurate to point out that the decline and fall of a civilization is driven by humanity in the mass, but the way back up is inevitably the work of some small creative minority with its own unique take on things.  The time of that minority is still far in the future, but plenty of things that can be done right now can give the creative minds of the future more options to work with.
Those of my readers who want to do something constructive about the harsh future ahead thus could do worse than to adopt one or more of the technologies I’ve outlined, and make a personal commitment to learning, practicing, preserving, and transmitting that technology into the future.  Those who decide that some technology I haven’t listed deserves the same treatment, and are willing to make an effort to get it into the waiting hands of the future, will get no argument from me.  The important thing is to get off the couch and do something, because the decline is already under way and time is getting short.

2030 is the New 2012

Wed, 2014-01-08 19:22
Last week’s discussion of failed predictions in the peak oil movement inevitably touched on the latest round of claims that the world as we know it is going to come to a full stop sometime very soon. That was inevitable partly because these claims account for a fairly large fraction of the predictions made by peak oil writers each year, and partly because those same claims flop so reliably. Still, there’s another factor, which is that this sort of apocalypse fandom has become increasingly popular of late—as well as increasingly detached from the world the rest of us inhabit.
Late last year, for example, I was contacted by a person who claimed to be a media professional and wanted to consult with me about an apocalypse-themed video he was preparing to make. As I think most of my readers know, I make my living as a writer, editor, and occasional consultant, and so—as one professional to another—my wife, who is also my business manager, sent him back a polite note asking what sort of time commitment he was interested in and how much he was offering to pay. We got back a tirade accusing me of being too cheap to save the world, followed not long thereafter by another email in which he insisted that he couldn’t afford to pay anyone because his project would inevitably be the least popular video in history; after all, he claimed, nobody wants to hear about how the world as we know it is about to crash into ruin.
That was when I sat back on the couch and very nearly laughed myself into hiccups, because there’s nothing Americans like better than a good loud prediction of imminent doom. From Jonathan Edwards’ famous 1741 sermon “Sinners in the Hands of an Angry God” right through to today’s zombie apocalypse craze, a good number of the biggest pop-culture phenomena in American history have focused on the end of the world in one way or another. A first-rate example is the 2012 furore, which turned a nonexistent Mayan prophecy of doom into one of the most successful media cash cows in recent times. I can testify from personal experience that toward the end of the last decade, every publisher I know of with a presence in the New Age market was soliciting 2012-themed books from all their regular authors because that was the hottest market going.
The 2012 prophecy may have been a predictive failure, in other words, but it was a whopping financial success. With that in mind, among other things, I predicted shortly after December 21, 2012 that a new date for the end of everything would soon be selected, and would promptly attract the same enthusiasm as its predecessor. As noted in a post last May, that was one of my more successful predictions; the new date is 2030, and it’s already picking up the same eager attention that made 2012 such a lucrative gimmick.
One of the great innovations of the runup to 2012, which will probably continue to shape apocalyptic fads for some time to come, is that you don’t actually have to propose a specific mechanism of doom; all you need is a date. The architect of the 2012 phenomenon, the late Jose Arguelles, seems to have been the marketing genius who first realized this.  His 1987 book The Mayan Factor, which launched the furore on its way, insisted that something really, truly amazing was going to happen on December 21, 2012, without offering more than vague hints about what that amazing event might be. Those who piled onto the bandwagon he set in motion more than made up for Arguelles’ reticence, coming up with a flurry of predictions about what was going to happen on that day.
It didn’t matter that most of these predictions contradicted one another, and none of them rested on any logic more solid than, hey, we know something amazing is going to happen on that day, so here’s some speculation, with or without cherrypicked data, about what the amazing event might be.  The pileup of predictions, all by itself, made the date itself sound more convincing to a great many people. Far from incidentally, it also offered believers a convenient source of shelter from skepticism:  if a nonbeliever succeeded in disproving a hundred different claims about what was supposed to happen on the big day, a hundred and first claim would inevitably pop up as soon as he turned his back, so that the believers could keep on believing that the world as we know it was indeed going to end as scheduled.
The same logic is already being deployed with equal verve on behalf of a 2030 doomsday. So far, without making any particular effort to find them, I’ve fielded claims that on or by that year, global warming will spin out of control, driving humanity into extinction; oceanic acidification will kill off all the phytoplankton, crashing oxygen levels in the atmosphere and driving humanity into extinction; the supervolcano underneath Yellowstone Park will erupt, plunging the planet into a volcanic winter and driving humanity into extinction; an asteroid will come spinning out of space and drive humanity into extinction, and so on.  I haven’t yet seen anyone proclaim that 2030 will see the Earth swallowed whole by a gigantic space walrus with photon flippers, but no doubt it’s simply a matter of time.
Now of course it’s possible to raise hard questions about each of these claims—well, in fact, it’s more than possible, it’s easy, since none of them rely on more than a few fringe studies on the far end of scientific opinion, if that, and most of them quietly ignore the fact that greenhouse-gas spikes, oceanic acidification, and nearly everything else but the aforementioned space walrus have occurred before in the planet’s history without producing the results that are being expected from them this time around. I’ll be taking the time to raise some of those questions, and offer some answers for them grounded in solid science, in a series of posts I’ll start later this year. Still, fans and promoters of the 2030 fad have nothing to fear from such exercises; like the legendary hydra, a good apocalypse fad can sprout additional heads at will to replace those that are chopped off by critics.
Thus it’s pretty much baked into the cake at this point that 2030 will be the new 2012, and that we can count on another sixteen years of increasingly overheated claims clustering around that date before it, too, slips by and a new date has to be found.  We’ll be discussing the trajectory of the resulting furore from time to time on this blog, if only because there’s a certain wry amusement to be gained from watching people make epic fools of themselves.  Still, the point I want to raise this week is a little different. Granted that apocalypse fandom is an enduring feature of American pop culture, that very few people ever lost money or failed to attract an audience by insisting that the end is nigh, that a huge and well-oiled marketing machine lost its cash cow when 2012 passed without incident and thus has every reason to pile into the next apocalypse fad with redoubled enthusiasm: even so, why should fantasies of imminent doom attract so much larger an audience now than ever before, and play so much more central a role in the contemporary imagination of the future?
There are, as I see it, at least four factors involved.
The first is a habit of collective thought I spent much of last year discussing—the widespread popular conviction, amounting to religious faith, that today’s industrial civilization is an unstoppable juggernaut that will keep rolling onwards forever unless some even more gargantuan catastrophe mashes it flat to the dust. That conviction, as I’ve noted in previous posts, is not confined to those who are cheering the march of progress.  Plenty of people who claim that they hate industrial civilization and all its works are as convinced as any cornucopian that it’s certain to keep moving along its current trajectory, until it finally vanishes on the horizon of whatever grand or dreadful destiny it’s supposed to have this week.
As a heretic and a dissenter from  that secular faith, I’ve repeatedly watched otherwise thoughtful people engage in the most spectacular mental backflips to avoid noticing that perpetual progress and overnight annihilation aren’t the only possible futures for the modern industrial world. What’s more, a great many people seem to be getting more fervent in their faith in progress, not less so, as the onward march of progress just mentioned shows increasing signs of grinding to a halt. That’s a common feature in social psychology; it’s precisely when a popular belief system starts failing to explain everyday experiences that people get most passionate about treating it as unquestionable fact and shouting down those who challenge it. Believing that our civilization and our species will be gone by 2030 feeds into this, since that belief makes it much easier to brush aside the uncomfortable awareness that progress is over and faith in industrial society’s omnipotence has turned out to be utterly misplaced.
That’s one reason why apocalyptic fantasies are so popular these days. A second reason, which I’ve also discussed at some length in this blog, is the role such fantasies have in justifying inaction, when action involves significant personal costs. One of the hard facts of our present predicament is that the steps that have to be taken to get ready for the future bearing down on us all require letting go of the privileges and perquisites that most Americans consider theirs by right. A few years ago, I coined the acronym LESS—Less Energy, Stuff, and Stimulation—to summarize the changes that we’re all going to have to make as things proceed, and began pointing out that any response to our predicament that doesn’t start with using LESS simply isn’t serious.
I’m pleased to say that a certain fraction of my readers have taken that advice seriously, and tackled the uncomfortable job of downsizing their dependence on the absurd amounts of energy, stuff, and artificial stimulation that are involved in an ordinary American lifestyle these days. I’m equally pleased to say that an even larger number of people who don’t read The Archdruid Report and don’t know me from Hu Gadarn’s off ox have gotten to work doing the same thing. Those people are going to be in a much better position not merely to weather the crises ahead, but to help their loved ones, friends and neighbors do the same thing, and potentially also contribute to the preservation of the more useful achievements of the last few centuries. Still, it’s hard work, and it also requires a willingness to step outside the conventional wisdom of our society, which claims to be open to new and innovative ideas but in practice tolerates only endless rehashings of the same old notions.
Inevitably, a good many people who sense the necessity of change won’t act on that awareness because they realize the personal costs involved. Fantasies of imminent doom provide an escape hatch from the resulting cognitive dissonance. If the world is going to crash into ruin soon anyway, the reasoning runs, it’s easy to excuse further wallowing in the benefits the American system currently gives to its more privileged inmates, and any remaining sense that something is wrong can be redirected onto whatever cataclysm du jour the true believer in apocalypse happens to fancy.  Believing that the end is nigh thus allows people to have their planet and eat it too—or, more to the point, to convince themselves that they can keep on chomping away on what’s left of the planet for just a little while longer.
The third factor, which relates to the second one, unfolds from the historical tragedy of the Baby Boom generation, which is massively overrepresented in apocalypse fandom just now.  The Boomers were among the most idealistic generations in US history, but they were also far and away the most privileged, and the conflict between those two influences has defined much of their trajectory through time. Starting when the Sixties youth culture crashed and burned, the Boomers have repeatedly faced forced choices between their ideals and their privileges.  Each time, the majority of Boomers—there have always been noble exceptions—chose to cling to their privileges, and then spent the next decade or so insisting at the top of their lungs that their ideals hadn’t been compromised by that choice.
Thus the early 1970s were enlivened by the loud insistence of former hippies, as they cut their hair and donned office clothing to take up the corporate jobs they’d vowed never to accept, that they were going to change the system from within. (Even at the time, that was generally recognized as a copout, but it was a convenient one and saw plenty of use.) By the 1980s, many of these same former hippies were quietly voting for Ronald Reagan and his allies because the financial benefits of Reagan’s borrow-and-spend policies were just too tempting to pass up, though they insisted all the while that they would put part of the windfall into worthy causes. Rinse and repeat, and today you’ve got people who used to be in the environmental movement pimping for nuclear power and GMOs, because the conserver lifestyles they were praising to the skies forty years ago have become unthinkable for them today.
One consequence of these repeated evasions has been an ongoing drumbeat of books and other media proclaiming as loudly as possible that the Baby Boom generation would change the world just by existing, without having to accept the hard work and sacrifices that changing the world actually entails. From 1970’s The Greening of America right on down to the present, this sort of literature has been lucrative and lavishly praised, but the great change never quite got around to happening and, as the Boomers head step by step toward history’s exit door, there’s no reason to think it ever will.
Perhaps the saddest of all these works came from the once-fiery pen of the late Theodore Roszak, whose 1969 book The Making of a Counter Culture played a significant role in shaping the Boomer generation’s self-image. His last book, The Making of an Elder Culture, expressed a wistful hope that once the Boomers retired, they would finally get around to fulfilling the expectations he’d loaded on them all those years ago. Of course they haven’t, and they won’t, because doing so would put their pensions and comfortable retirements at risk. Mutatis mutandis, that’s why the Age of Aquarius turned out to be a flash in the pan:  “Let’s change the system, but keep the privileges we get from it” reliably works out in practice to “Let’s not change the system.”
The expectation of imminent apocalypse is the despairing counterpoint to the literature just described. Instead of insisting that the world would shortly become Utopia (with no action on the part of Boomers needed to bring this about), it insists that the world will shortly become the opposite of Utopia (with no action on the part of Boomers capable of preventing this). This serves the purpose of legitimizing inaction at a time when action would involve serious personal costs, but there’s more to it than that; it also feeds into the Boomer habit of insisting on the cosmic importance of their own experiences.  Just as normal adolescent unruliness got redefined in Boomer eyes as a revolution that was going to change the world, the ordinary experience of approaching mortality is being redefined as the end of everything—after all, the universe can’t just go on existing after the Boomers are gone, can it?  It’s thus surely no accident that 2030 is about the time the middle of the Baby Boom generation will be approaching the end of its statistically likely lifespan.
The three factors just listed all have a major role in fostering the apocalypse fandom that plays so large a part in today’s popular culture and collective imagination. Still, I’ve come to think that a fourth factor may actually be the most significant of all.
To grasp that fourth factor, I’d like to encourage my readers to engage in a brief thought experiment. Most people these days have noticed that for the last decade or so, each passing year has seen a broad worsening of conditions on a great many fronts. Here in America, certainly, jobs are becoming scarcer, and decent jobs with decent pay scarcer still, while costs for education, health care, and scores of other basic social goods are climbing steadily out of reach of an ever-larger fraction of the population.  State and local governments are becoming less and less able to provide even essential services, while the federal government sinks ever further into partisan gridlock and bureaucratic paralysis, punctuated by outbursts of ineffectual violence flung petulantly outward at an ever more hostile world.  The human and financial toll of natural disasters keeps going up while the capacity to do anything about the consequences keeps going down—and all the while, resource depletion and environmental disruption impose a rising toll on every human activity.
That’s the shape of the recent past. The thought experiment I’d like to recommend to my readers is to imagine that things just keep going the same way, year after year, decade after decade, without any of the breakthroughs or breakdowns in which so many of us like to put our faith.
Imagine a future in which all the trends I’ve just sketched out just keep on getting worse, a tunnel growing slowly darker without any light at the far end—not even the lamp of an oncoming train. More to the point, imagine that this is your future: that you, personally, will have to meet ever-increasing costs with an income that has less purchasing power each year; that you will spend each year you still have left as an employee hoping that it won’t be your job’s turn to go away forever, until that finally happens; that you will have to figure out how to cope as health care and dozens of other basic goods and services stop being available at a price you can afford, or at any price at all; that you will spend the rest of your life in the conditions I’ve just sketched out, and know as you die that the challenges waiting for your grandchildren will be quite a bit worse than the ones you faced.
I’ve found that most people these days, asked to imagine such a future, will flatly refuse to do it, and get furiously angry if pressed on the topic. I want to encourage my readers to push past that reaction, though, and take a few minutes to imagine themselves, in detail, spending the rest of their lives in the conditions I’ve just outlined. Those who do that will realize something about apocalyptic fantasies that most believers in such fantasies never mention: even the gaudiest earth-splattering cataclysm is less frightening than the future I’ve described—and the future I’ve described, or one very like it, is where current trends driven by current choices are taking us at their own implacable pace.
My guess is that that’s the most important factor behind the popularity of apocalyptic thinking these days.  After so many promised breakthroughs have failed to materialize, cataclysmic mass death is the one option many people can still believe in that’s less frightening than the future toward which we’re actually headed, and which our choices and actions are helping to create. I suggest that this, more than anything else, is why 2030 is going to be the next 2012, why promoters of the it’s-all-over-in-2030 fad will find huge and eager audiences for their sales pitches, and why some other date will take 2030’s place in short order once the promised catastrophes fail to appear on schedule and the future nobody wants to think about continues to take shape around them.
Mind you, there are less delusional and less self-defeating ways to face the challenging times ahead—ways that might actually accomplish something positive in a harsh future. We’ll talk about one of those next week.

The Song of the Snallygaster

Wed, 2014-01-01 20:54
I've noticed, over the past few years, a growing lack of enthusiasm in the formerly raucous festivities that once marked the end of one year and the beginning of another. That was certainly in evidence last night in our old red brick mill town here in the eastern end of the Rust Belt. As my wife and I clicked glasses together, the night outside was hushed. It was as though all the people who were grateful to see 2013 hauled away to the glue factory suddenly realized that the new year might well be worse.
  I’m not sure why I didn’t share in the general gloom. When you live in a decaying empire that’s trying to meet the rising costs of its short-term survival by selling its own grandchildren down the river, contemplating the future that results from that choice isn’t exactly a recipe for hilarity, and making a career out of writing about that future might seem like a good way to meet each new year in a profound depression or a drunken stupor, take your pick. Still, that hasn’t been the case for me. Maybe it’s that I came to terms with the reality of our civilization’s impending decline back in the 1970s, when you could still talk about such things in public without being shouted down by true believers in perpetual progress and instant apocalypse, the Tweedledoom and Tweedledee of our collective non-conversation about the future; whatever the cause, I waved hello to the New Year and sipped a glass of bourbon in relatively good cheer.
There was at least one extraneous reason for that cheer. One of my solstice presents this year was a lively little book, Mysteries and Lore of Western Maryland by local author Susan Fair. Most parts of the United States have at least one book like it, collecting the ghost stories, spooky tales, and weird creatures of the area, and this one is a good and highly readable example of the species.  Reading it took me back to some of the least wretched hours of my childhood, when I found refuge from a disintegrating family, and a school life marred by American education’s culture of mediocrity and bullying, by chasing down anything that made the walls of the world press a little less closely against me on all sides. Monster lore played a significant role in that process, and so I was delighted, in reading Fair’s book, to make the acquaintance of one of the local “fearsome critters,” the snallygaster.
Any of my readers who happen to be adept in American monster folklore will no doubt be leaping for their keyboards to tell me that the same word’s spelled “snoligoster” or “snollygoster” elsewhere. Yes, I know; it’s different here. (In western Maryland, for reasons I haven’t yet deciphered, the letter A is more popular than its rival vowels; the Native American name for the local mountain range, written out as “Allegheny” everywhere else, is spelled “Allegany” here and rhymes with “rainy.”) The snallygaster, as I was saying, was a dragonlike creature with huge wings, a long pointed tail, a single eye in the middle of its forehead, and octopuslike tentacles that dangled behind it as it flew. 
This remarkable apparition was apparently all over western Maryland in 1909, and then again in 1932; during both flaps, the newspapers splashed the story all over the front page of issue after issue, and everyone in the region seems to have known someone who knew someone who knew someone who just missed seeing it. Even when it met its end—it drowned, according to newspaper reports, after falling into a 2500-gallon vat of illegal whiskey in the notorious local moonshiner’s haven of Frog Hollow—its presence was evanescent:  its body dissolved in the liquor, and the government agents who were raiding the operation when the snallygaster met its doom proceeded to break open the vat and spill the contents, rather than bottling and selling what would surely have been the most singular beverage ever produced by the region’s less-than-legal distilling industry.
Snallygasters thus share in the most important characteristic of the “fearsome critters” of American monster folklore:  everybody knows about them, but you’ll never actually meet anybody who’s seen one. That characteristic isn’t unique to monsters. It also features in the peak oil scene, and in particular in one of the more common habits of that scene, especially though not only around this time of year. That’s as much justification as you’ll get for the cameo appearance of snallygasters in this week’s post—well, beyond the fact that snallygasters, and local folklore generally, deserve more attention than they usually get these days—because it’s mostly that habit that I want to talk about this week.
The weeks to either side of January 1 each year, as regular readers of peak oil blogs will have noted long since, are festooned with predictions about what’s going to happen in the year to come. That habit’s not limited to the peak oil scene, to be sure, but a curious feature is shared by many peak oil blogs: the prophecies that pop up during this annual orgy of prognostication are quite often the same ones that appeared the previous year, and the year before that, and so on back as far as the archives go. Most of those predictions, furthermore, flopped—that is, the events so confidently predicted did not happen within the time frame the prediction assigned them—but reflections on that awkward reality in these same blogs are about as rare as snallygasters singing duets in your back yard.
It’s high time for this bad habit to get drowned in a tub of moonshine once and for all. Those of us who talk about peak oil and the other troubles closing in on contemporary industrial civilization have precisely one thing to offer that deserves the attention of anybody else—that our ideas, and the predictions we base on them, might help others figure out what’s happening to the world around them at a time when more familiar ways of thinking aren’t providing useful guidance. If our predictions are no more accurate than those of raving maniacs, mainstream economists, and the like, nobody has any reason to listen to us. In point of fact, I’ve come to think that one of the most important reasons why the peak oil movement is all but dead in the water these days is that too many people outside it have read too many rehashes of the same failed predictions too many times, and decided that peak oil writers simply don’t know what they’re talking about.
I’d like to suggest that the lack of attention paid by peak oil authors to the failure rate of their predictions has contributed heartily to that reaction. For this reason, before I review the predictions I made in my first post of 2013 and offer some new ones for the year that’s just begun, I’m going to take a moment to discuss predictions of mine that flopped, and what I’ve learned from those flops.
The first of my failed predictions has evaded my attempts to find it in this blog’s archives, but it appeared sometime in 2007 or 2008, if I recall correctly.  I noted the skyrocketing price of oil, surveyed the claims then being made by other peak oil writers that it would just keep on zooming up forever, and argued instead that it would plateau and then decline over the course of the next few decades. I was, of course, quite wrong, but so were the people whose ideas I was challenging; the price of crude oil spiked up to just shy of $150 a barrel and then crashed at once, plunging to not much more than a fifth of that figure, before resuming a ragged upwards movement to its present level just north of $100 a barrel. The raw volatility of the oil market blindsided me, as it did many others; it was an embarrassing lesson, and one that’s shaped my efforts to estimate oil price movements since that time.
The second appeared in several of my posts and responses to comments in 2009 and 2010. At that time, I was convinced that Barack Obama would be a one-term president.  It was inconceivable to me that the Democrats who spent eight years in a state of spit-slinging fury at George W. Bush’s war crimes, abuses of civil liberties, and huge giveaways to big corporations, would fall all over themselves finding excuses for the identical actions performed by Barack Obama. It was also inconceivable to me that the Republicans, faced with the weakest Democratic incumbent in many decades, would ransack the nation to find a candidate the American people would like even less.  Of course that’s exactly what happened; a great many Democrats demonstrated with painful clarity that the respect for civil liberties and the rule of law they paraded so loudly during the Bush years was simply a cover for the usual partisan hatreds, while the Republican party showed just as clearly how detached it’s become from those Americans—a substantial majority, by the way—who don’t belong to the handful of isolated pressure groups whose vagaries currently shape GOP policy. We haven’t yet had another national election, but I can assure my readers that I won’t be making those mistakes again.
The third appeared in several posts in late 2011 and early 2012, as the hoopla around the fake-Mayan 2012 prophecy was shifting into high gear. I thought, based on historical parallels, that 2012 might turn out to be an apocalyptic frenzy for the record books, with crowds of believers gathering on hilltops to wait for December 21 to dawn and watch the Space Brothers land, or what have you.  That didn’t happen; quite the contrary, in the weeks just prior to December 21, quite a few of the big names in the 2012 scene started backpedaling on their previous insistence that some worldchanging event would surely happen that day. I’ve factored that curious turn of events into my thinking about the next big apocalyptic furore—of which more shortly.
The fourth appeared last February in one of my posts on the current fracking bubble. At that time, for a variety of reasons, I thought that the financial bubble that’s inflated around shale oil in the last few years was within a few months of a messy collapse. I remain convinced that it’s going to pop—bubbles always do, and the fracking business has all the classic signs of a speculative bubble, from the bellowing about a new era of prosperity right around the corner all the way down to the dodgy financial underpinnings—but I was obviously wrong about the timing; the month before, I’d suggested 2014 as a likely date for the inevitable crash, and I should have stuck with that estimate.
So those are the four failed predictions of mine I recalled while glancing back over the nearly eight years this blog has been in existence. There may have been others that I missed—if so, I encourage my readers to bring them up on the comments page and we’ll talk about them. Such mistakes are all but impossible to avoid when discussing something as elusive as the future, and archdruids are no more infallible than anyone else; in fact, the late head of a different Druid order once promulgated an official Dogma of Archdruidical Fallibility to declare in formal terms that he was going to make mistakes.  We don’t have any dogmas at all in the Druid order I head, but the principle still applies.
With that cautionary note in mind, let’s turn to the predictions I made at the beginning of 2013 about the year ahead. Here they are:
“I thus predict that just as 2012 looked like a remake of 2011 a little further down the curve of decline, 2013 will look a good deal like 2012, but with further worsening along the same broad array of trends and yet another round of local crises and regional disasters. The number of billion-dollar weather disasters will tick up further, as will the number of Americans who have no job—though, to be sure, the official unemployment rate and other economic statistics will be gimmicked then as now.  The US dollar, the Euro, and the world’s stock markets will still be in business at year’s end, and there will still be gas for sale in gas stations, groceries for sale in grocery stores, and more people interested in the Super Bowl than in global warming or peak oil, as 2013 gives way to 2014.
“As the year unfolds, I’d encourage my readers to watch the fracking bubble...I don’t expect the bubble to pop this year—my best guess at this point is that that’ll happen in 2014—but it’s already losing air as the ferocious decline rates experienced by fracked oil and gas wells gnaw the bottom out of the fantasy.  Expect the new year to bring more strident claims of the imminent arrival of a shiny new future of energy abundance, coupled with a steady drumbeat of bad financial news suggesting, in essence, that the major players in that end of the oil and gas industry are well and truly fracked.
“I’d also encourage my readers to watch the climate...Most of the infrastructure of industrial society was built during the period of abnormally good weather we call the twentieth century.  A fair amount of it, as New York subway riders have had reason to learn, is poorly designed to handle extreme weather, and if those extremes become normal, the economics of maintaining such complex systems as the New York subways in the teeth of repeated flooding start to look very dubious indeed.  I don’t expect to see significant movements out of vulnerable coastal areas quite yet, but if 2011’s Hurricane Irene and 2012’s Hurricane Sandy turn out to have a bouncing baby sibling who decides to pay a visit to the Big Apple in 2013, 2014 might see the first businesses relocating further inland, perhaps to the old mill towns of the southern Hudson valley and the eastern end of Pennsylvania, perhaps further still.
“That’s speculative. What isn’t speculative is that all the trends that have been driving the industrial world down the arc of the Long Descent are still in play, and so are all the parallel trends that are pushing America’s global empire along its own trajectory toward history’s dustbin.  Those things haven’t changed; even if anything could be done about them, which is far from certain, nothing is being done about them; indeed, outside of a handful of us on the fringes of contemporary culture, nobody is even talking about the possibility that something might need to be done about them.  That being the case, it’s a safe bet that the trends I’ve sketched out will continue unhindered, and give us another year of the ordinary phenomena of slowly accelerating decline and fall.”
The bouncing baby sibling of Hurricanes Irene and Sandy didn’t put in an appearance, and so the first tentative shifts of businesses and population inland from the vulnerable Atlantic coast will have to wait a few more years. Other than that, I think it’s fair to say that once again, I called it.
My prediction for 2014, in turn, is that we’ll see more of the same:  another year, that is, of uneven but continued downward movement along the same arc of decline and fall, while official statistics here in the United States will be doctored even more extravagantly than before to manufacture a paper image of prosperity. The number of Americans trying to survive without a job will continue to increase, the effective standard of living for most of the population will continue to decline, and what used to count as the framework of ordinary life in this country will go on unraveling a thread at a time. Even so, the dollar, the Euro, the stock market, and the Super Bowl will still be functioning as 2015 begins; there will still be gas in the gas pumps and food on grocery store shelves, though fewer people will be able to afford to buy either one.
The fracking bubble has more than lived up to last year’s expectations, filling the mass media with vast amounts of meretricious handwaving about the coming era of abundance:  the same talk, for all practical purposes, that surrounded the equally delusional claims made for the housing bubble, the tech bubble, and so on all the way back to the Dutch tulip bubble of 1637. That rhetoric will prove just as dishonest as its predecessors, and the supposed new era of prosperity will come tumbling back down to earth once the bubble pops, taking a good chunk of the American economy with it. Will that happen in 2014? That’s almost impossible to know in advance. Timing the collapse of a bubble is one of the trickiest jobs in economic life; no less a mind than Isaac Newton’s was caught flatfooted by the collapse of the South Sea Bubble in 1720, and the current bubble is far more opaque. My guess is that the collapse will come toward the end of 2014, but it could have another year or so to run first.
It’s probably a safe bet that weather-related disasters will continue to increase in number and severity. If we get a whopper on the scale of Katrina or Sandy, watch the Federal response; it’s certain to fall short of meeting the needs of the survivors and their communities, but the degree to which it falls short will be a useful measure of just how brittle and weak the national government has become. One of these years—just possibly this year, far more likely later on—that weakness is going to become one of the crucial political facts of our time, and responses to major domestic disasters are among the few good measures we’ll have of how close we are to the inevitable crisis.
Meanwhile, what won’t happen is at least as important as what will. Despite plenty of enthusiastic pronouncements and no shortage of infomercials disguised as meaningful journalism, there will be no grand breakthroughs on the energy front. Liquid fuels—that is to say, petroleum plus anything else that can be thrown into a gas tank—will keep on being produced at something close to 2013’s rates, though the fraction of the total supply that comes from expensive alternative fuels with lower net energy and higher production costs will continue to rise, tightening a noose around the neck of every other kind of economic activity. Renewables will remain as dependent on government subsidies as they’ve been all along, nuclear power will remain dead in the water, fusion will remain a pipe dream, and more exotic items such as algal biodiesel will continue to soak up their quotas of investment dollars before going belly up in the usual way. Once the fracking bubble starts losing air, expect something else to be scooped up hurriedly by the media and waved around to buttress the claim that peak oil won’t happen, doesn’t matter, and so on; any of my readers who happen to guess correctly what that will be, and manage their investments accordingly, may just make a great deal of money.
Sudden world-ending catastrophes will also be in short supply in 2014, though talk about them will be anything but. The current vagaries of the apocalypse lobby probably deserve a post of their own; the short version is that another prediction of mine—that the failure of the fake-Mayan 2012 prophecy would very quickly be followed by the emergence of another date of supposedly imminent doom—has already come true, with knobs on. The new date is 2030; expect to see the same dubious logic, the same frantic cherrypicking of factoids, and the same mass production of different theories sharing only a common date, that played such important and disreputable roles in the 2012 fracas.  Since we’ve got more than a decade and a half to go before the next Nothing Happened Day arrives, there’s plenty of time for the marketing machine to get rolling before the snallygaster sings, and roll it will.
Both the grandiose breakthroughs that never happen and the equally gaudy catastrophes that never happen will thus continue to fill their current role as excuses not to think about, much less do anything about, what’s actually happening around us right now—the long ragged decline and fall of industrial civilization that I’ve called the Long Descent. Given the popularity of both these evasive moves, we can safely assume that one more thing won’t happen in 2014:  any meaningful collective response to the rising spiral of crises that’s shredding our societies and our future. As before, anything useful that’s going to happen will be the work of individuals, families, and community groups, using the resources on hand to cope with local conditions. I’ve talked at great length here, and in several of my books, about some of the things that might go into such a response, and I won’t rehash that now; we’ll be talking about much of it from another perspective in the months ahead.
There’s at least one other question about the immediate future that I plan on leaving up in the air for now, though, and it circles back to some of the points I made toward the start of this post. I’ve just noted the times in the past where my guesses about the future have been wrong, explained why they were wrong, and talked about what I learned from the experience. I then reviewed the predictions I made at the beginning of 2013, checked them against the facts, and used the results as a basis for my 2014 predictions. I’d like to encourage other writers and bloggers exploring peak oil, and the shape of the future more generally, to do the same thing right out here in public, review their successes and their flops, and talk about how their 2013 predictions worked out over the year just past.
On second thought, I think we can go further than that. Bloggers and writers in the peak oil community who make predictions, and expect those predictions to be taken seriously, owe it to their readers to subject their predictions to honest assessment once they can be compared to the facts. If it’s reasonable for us to talk about the failed predictions of cornucopians,  fusion-power advocates, and the like, and of course it is, we need to be willing to ‘fess up to our own mistakes and learn from them, rather than just making them over and over again in the fond belief that nobody will notice. Those who indulge in this latter habit are doing active harm to the cause of peak oil awareness, and it’s high time that the rest of us start pointing out that they’re not offering useful insights into the future—just engaging in a hackneyed form of science fiction useful for entertainment purposes only.
The clatter you just heard? That was the sound of a gauntlet being thrown down. It will be interesting to see whether anyone picks it up.