AODA Blog

Smile For The Aliens

Wed, 2014-07-16 16:44
Last week’s post, with its uncompromising portrayal of what descent into a dark age looks like, fielded the usual quota of voices insisting that it’s different this time. It’s a familiar chorus, and I confess to a certain wry amusement in watching so many changes get rung on what, after all, is ultimately a non sequitur. Grant that it’s different this time: so?  It’s different every time, and it always has been, yet those differences have never stopped history’s remarkably diverse stable of civilizations from plodding down the self-same track toward their common destiny.
It may also have occurred to my readers, and it has certainly occurred to me, that the legions of bloggers and pundits who base their reasonings on the claim that history has nothing to teach us don’t have to face a constant barrage of comments insisting that it’s the same this time. “It’s different this time” isn’t simply one opinion among others, after all; it’s one of the basic articles of faith of the contemporary industrial world, and questioning it reliably elicits screams of outrage even from those who like to tell themselves that they’ve rejected the conventional wisdom of the present day.
Yet that raises another question, one that’s going to bear down with increasing force in the years ahead of us: just how will people cope when some of their most cherished beliefs have to face a cage match with reality, and come out second best?
Such issues are rather on my mind just at the moment. Regular readers may recall that a while back I published a book, The UFO Phenomenon, which managed the not inconsiderable feat of offending both sides of the UFO controversy. It did so by the simple expedient of setting aside the folk mythology that’s been heaped up with equal enthusiasm by true believers in extraterrestrial visitation and true believers in today’s fashionable pseudoskeptical debunkery. After getting past that and a few other sources of confusion, I concluded that the most likely explanation for the phenomenon was that US military and intelligence agencies invented it out of whole cloth after the Second World War, as protective camouflage for an assortment of then-secret aerospace technologies.
That wasn’t the conclusion I expected to reach when I began work on the project; I had several other hypotheses in mind, all of which had to be considerably modified as the research proceeded. It was just too hard not to notice the way that the typical UFO sightings reported in any given decade so closely mimicked whatever the US was testing in secret at any given time—silvery dots or spheres in the late 1940s, when high-altitude balloons were the latest thing in aerial reconnaissance; points or tiny blobs of light high in the air in the 1950s, when the U-2 was still top secret; a phantasmagoria of flying lights and things dropping from the sky in the 1960s, when the SR-71 and the first spy satellites entered service; black triangles in the 1980s, when the first stealth aircraft were being tested, and so on. An assortment of further evidence pointing the same way, not to mention the significant parallels between the UFO phenomenon and those inflatable tanks and nonexistent battalions that tricked the Germans into missing the real preparations for D-Day, were further icing on a saucer-shaped cake.
To call that an unpopular suggestion is to understate the case considerably, though I’m pleased to say it didn’t greatly hurt sales of the book.  In the years since The UFO Phenomenon saw print, though, there’s been a steady stream of declassified documents from US intelligence agencies admitting that, yes, a lot of so-called UFOs were perfectly identifiable if you happened to know what classified projects the US government had in the air just then. It turns out, for example, that roughly half the UFO sightings reported to the Air Force’s Project Blue Book between 1952 and 1969 were CIA spyplanes; the officers in charge of Blue Book used to call the CIA when sightings came in, and issue bogus “explanations” to provide cover for what was, at the time, a top secret intelligence project. I have no reason to think that the publication of The UFO Phenomenon had anything to do with the release of all this data, but it was certainly a welcome confirmation of my analysis.
The most recent bit of confirmation hit the media a few weeks back. Connoisseurs of UFO history know that the Scandinavian countries went through a series of major “flaps”—periods in which many UFO sightings occurred in a short time—in the 1950s and 1960s. The latest round of declassified data confirmed that these were sightings of US spyplanes snooping on the Soviet Union. The disclosures didn’t happen to mention whether CIA assets also spread lurid accounts of flying saucer sightings and alien visitations to help muddy the waters. My hypothesis is that that’s what was going on all the way through the history of the UFO phenomenon: fake stories and, where necessary, faked sightings kept public attention fixated on a manufactured mythology of flying saucers from outer space, so that the signal of what was actually happening never made it through the noise.
Many of my readers will already have guessed how the two sides of the UFO controversy responded to the disclosures just mentioned:  by and large, they haven’t responded to them at all. Believers in the extraterrestrial origin of UFOs are still insisting at the top of their lungs that some day very soon, the US government will be forced to ‘fess up to the reality of alien visitation—yes, I field emails from such people regularly. Believers in the null hypothesis, the claim that all UFO sightings result from hoaxes, illusions, or misidentification of ordinary phenomena, are still rehashing the same old arguments when they haven’t gone off to play at being skeptical about something else. That’s understandable, as both sides have ended up with substantial amounts of egg on their face.
Mind you, the believers in the extraterrestrial hypothesis were right about a great many more things than their rivals, and they deserve credit for that. They were right, for example, that people really were seeing unusual things in the skies; they were right that there was a coverup orchestrated by the US government, and that the Air Force was handing out explanations that it knew to be fake; they were even right in guessing that the Groom Lake airfield in Nevada, the legendary “Area 51,” was somehow central to the mystery—that was the main US spyplane testing and training base straight through the decades when the UFO mystery was at its peak. The one thing they got wrong was the real origin of the UFO phenomenon, but for them, unfortunately, that was the one thing that mattered.
The believers in the null hypothesis don’t have much reason to cheer, even though they turned out to be right about that one point. The disclosures have shown with uncomfortable clarity that a good many of the explanations offered by UFO skeptics were actually nonsense, just as their opponents had been pointing out all along. In 1981, for example, Philip Klass, James Oberg, and Robert Sheaffer claimed that they’d identified all the cases  that Project Blue Book labeled as “unknown.” As it happens, they did nothing of the kind; what they actually did was offer untested ad hoc hypotheses to explain away the unknowns, which is not exactly the same thing. It hardly needs to be said that CIA spyplanes played no part in those explanations, and if the “unknown” cases contained the same proportion of spyplanes as the whole collection, as seems likely, roughly half their explanations are wrong—a point that doesn’t exactly do much to inspire confidence in other claims made on behalf of the debunking crusade.
So it’s not surprising that neither side in the controversy has had the least interest in letting all this new data get in the way of keeping up the old argument. The usual human reaction to cognitive dissonance is to exclude the information that’s causing the dissonance, and that’s precisely what both sides, by and large, have done. As the dissonance builds, to be sure, people on the fringes of both scenes will quietly take their leave, new recruits will become few and far between, and eventually surviving communities of believers and debunkers alike will settle into a common pattern familiar to any of my readers acquainted with Spiritualist churches, Marxist parties, or the flotsam left behind by the receding tide of other once-influential movements in American society: little circles of true believers fixated on the disputes of an earlier day, hermetically sealed against the disdain and disinterest of the wider society.
They have the freedom to do that, because the presence or absence of alien saucers in Earth’s skies simply doesn’t have that much of an impact on everyday life. Like Spiritualists or Marxists, believers in alien contact and their debunking foes by and large can avoid paying more than the most cursory attention to the failure of their respective crusades. The believers can take comfort in the fact that even in the presence of overwhelming evidence, it’s notoriously hard to prove a negative; the debunkers can take comfort in the fact that, however embarrassing their logical lapses and rhetorical excesses, at least they were right about the origins of the phenomenon.
That freedom isn’t always available to those on the losing side of history. It’s not that hard to keep the faith if you aren’t having your nose rubbed in the reality of your defeat on a daily basis, but it’s quite another matter to cope with the ongoing, overwhelming disconfirmation of beliefs on which you’ve staked your pride, your values, and your sense of meaning and purpose in life. What would life be like these days for the vocal UFO debunkers of recent decades, say, if the flying saucers had turned out to be alien spacecraft after all, the mass saucer landing on the White House lawn so often and so vainly predicted had finally gotten around to happening, and Philip Klass and his fellow believers in the null hypothesis had to field polite requests on a daily basis to have their four-dimensional holopictures taken by giggling, gray-skinned tourists from Zeta Reticuli?
For a living example of the same process at work, consider the implosion of the New Age scene that’s well under way just now.  In the years before the 2008 crash, as my readers will doubtless remember, tens of thousands of people plunged into real estate speculation with copies of Rhonda Byrne’s meretricious The Secret or similar works of New Age pseudophilosophy clutched in their sweaty hands, convinced that they knew how to make the universe make them rich. I knew a fair number of them—Ashland, Oregon, where I lived at the time, had a large and lucrative New Age scene—and so I had a ringside seat as their pride went before the real estate market’s fall. That was a huge blow to the New Age movement, and it was followed in short order by the self-inflicted humiliation of the grand nonevent of December 21, 2012.
Those of my readers who don’t happen to follow trends in the publishing industry may be interested to know that sales of New Age books peaked in 2007 and have been plunging since then; so has the take from New Age seminars, conferences, and a galaxy of other products hawked under the same label. There hadn’t been any shortage of disconfirmations in the previous history of the New Age scene, to be sure, but these two seem to have been just that little bit more than most of the movement’s adherents can gloss over. No doubt the New Age movement will spawn its share of little circles of true believers—the New Thought movement, which was basically the New Age’s previous incarnation, did exactly that when it imploded at the end of the 1920s, and many of those little circles ended up contributing to the rise of the New Age decades later—but as a major cultural phenomenon, it’s circling the drain.
One of the central themes of this blog, in turn, is that an embarrassment on much this same scale waits for all those who’ve staked their pride, their values, and their sense of meaning and purpose in life on the belief that it’s different this time, that our society somehow got an exemption from the common fate of civilizations. If industrial society ends up following the familiar arc of decline and fall into yet another dark age, if all the proud talk about man’s glorious destiny among the stars turns out to be empty wind, if we don’t even get the consolation prize of a downfall cataclysmic enough to drag the rest of the planet down with us—what then?
I’ve come to think that’s what lies behind the steady drumbeat of emails and comments I field week after week insisting that it’s different this time, that it has to be different this time, and clutching at the most remarkable assortment of straws in an attempt to get me to agree with them that it’s different this time. That increasingly frantic chorus has many sources, but much of it is, I believe, a response to a simple fact:  most of the promises made by authoritative voices in contemporary industrial society about the future we’re supposed to get have turned out to be dead wrong.
Given the number of people who like to insist that every technological wet dream will eventually be fulfilled, it’s worth taking the time to notice just how poorly earlier rounds of promises have measured up to the inflexible yardstick of reality.  Of all the gaudy and glittering technological breakthroughs that have been promised with so much confidence over the last half dozen decades or so, from cities on the Moon and nuclear power too cheap to meter straight through to 120-year lifespans and cures for cancer and the common cold, how many have actually panned out?  Precious few.  Meanwhile most measures of American public health are slipping further into Third World territory with every year that passes, our national infrastructure is sinking into a morass of malign neglect, and the rising curve of prosperity that was supposed to give every American access to middle class amenities has vanished in a haze of financial fraud, economic sclerosis, and official statistics so blatantly faked that only the media pretends to believe them any more.
For many Americans these days, furthermore, those broken promises have precise personal equivalents. A great many of the people who were told by New Age authors that they could get rich easily and painlessly by visualizing abundance while investing in dubious real estate ventures found out the hard way that believing those promises amounted to being handed a one-way nonstop ticket to poverty. A great many of the people who were told by equally respected voices that they would attain financial security by mortgaging their futures for the benefit of a rapacious and corrupt academic industry and its allies in the banking sphere are finding out the same thing about the reassuring and seemingly authoritative claims that they took at face value.  For that matter, I wonder how many American voters feel they benefited noticeably from the hope and change that they were promised by the sock puppet they helped put into the White House in 2008 and 2012.
The promises that framed the housing bubble, the student loan bubble, and the breathtaking cynicism of Obama’s campaign, after all, drew on the same logic and the same assumptions that guided all that grand and vaporous talk about the inevitability of cities on the Moon and commuting by jetpack. They all assumed that history is a one-way street that leads from worse to better, to more, bigger, louder, gaudier, and insisted that of course things would turn out that way. Things haven’t turned out that way, they aren’t turning out that way, and it’s becoming increasingly clear that things aren’t going to turn out that way any time this side of the twelfth of Never. I’ve noted here several times now that if you want to predict the future, paying attention to the reality of ongoing decline pretty reliably gives you better results than trusting that the decline won’t continue in its current course.
 The difficulty with that realization, of course, is precisely that so many people have staked their pride, their values, and their sense of meaning and purpose in life on one or another version of the logic I’ve just sketched out. Admitting that the world is under no compulsion to change in the direction they think it’s supposed to change, that it’s currently changing in a direction that most people find acutely unwelcome, and that there are good reasons to think the much-ballyhooed gains of the recent past were the temporary products of the reckless overuse of irreplaceable energy resources, requires the surrender of a deeply and passionately held vision of time and human possibility. Worse, it lands those who do so in a situation uncomfortably close to the crestfallen former UFO debunkers I joked about earlier in this post, having to cope on an everyday basis with a world full of flying saucers and tourists from the stars.
Beneath the farcical dimensions of that image lies a sobering reality. Human beings can’t live for long without some source of values and some sense of meaning in their lives.  That’s why people respond to cognitive dissonance affecting their most cherished values by shoving away the unwelcome data so forcefully, even in the teeth of the evidence. Resistance to cognitive dissonance has its limits, though, and when people have their existing sources of meaning and value swept away by a sufficiently powerful flood of contradictions, they will seek new sources of meaning and value wherever they can find them—no matter how absurd, dysfunctional, or demonic those new meanings and values might look to an unsympathetic observer.  The mass suicide of the members of the Heaven’s Gate UFO cult in 1997 offers one measure of just how far astray those quests for new sources of meaning can go; so, on a much larger scale, does the metastatic nightmare of Nazi Germany.
I wrote in an earlier post this month about the implosion of the sense of political legitimacy that’s quietly sawing the props out from underneath the US federal government, and convincing more and more Americans that the people who claim to represent and govern them are a pack of liars and thieves.  So broad and deep a loss of legitimacy is political dynamite, and normally results, within no very long a time frame, in the collapse of the government in question. There are no guarantees, though, that whatever system replaces a delegitimized government will be any better.
That same principle applies with equal force to the collapse of the fundamental beliefs of a civilization. In next week’s post, with this in mind, I plan on talking about potential sources of meaning, purpose and value in a world on its way into a global dark age.

Bright Were The Halls Then

Wed, 2014-07-09 16:51
Arnold Toynbee, whose magisterial writings on history have been a recurring source of inspiration for this blog, has pointed out an intriguing difference between the way civilizations rise and the way they fall. On the way up, he noted, each civilization tends to diverge not merely from its neighbors but from all other civilizations throughout history.  Its political and religious institutions, its arts and architecture, and all the other details of its daily life take on distinctive forms, so that as it nears maturity, even the briefest glance at one of its creations is often enough to identify its source.
 Once the peak is past and the long road down begins, though, that pattern of divergence shifts into reverse, slowly at first, and then with increasing speed. A curious sort of homogenization takes place: distinctive features are lost, and common patterns emerge in their place.  That doesn’t happen all at once, and different cultural forms lose their distinctive outlines at different rates, but the further down the trajectory of decline and fall a civilization proceeds, the more it resembles every other civilization in decline. By the time that trajectory bottoms out, the resemblance is all but total; compare one postcollapse society to another—the societies of post-Roman Europe, let’s say, with those of post-Mycenean Greece—and it can be hard to believe that dark age societies so similar could have emerged out of the wreckage of civilizations so different.
It’s interesting to speculate about why this reversion to the mean should be so regular a theme in the twilight and aftermath of so many civilizations. Still, the recurring patterns of decline and fall have another implication—or, if you will, another application. I’ve noted here and elsewhere that modern industrial society, especially but not only here in North America, is showing all the usual symptoms of a civilization on its way toward history’s compost bin. If we’ve started along the familiar track of decline and fall—and I think a very good case can be made for that hypothesis—it should be possible to map the standard features of the way down onto the details of our current situation, and come up with a fairly accurate sense of the shape of the future ahead of us.
All the caveats raised in last week’s Archdruid Report post deserve repetition here, of course. The part of history that can be guessed in advance is a matter of broad trends and overall patterns, not the sort of specific incidents that make up so much of history as it happens.  Exactly how the pressures bearing down on late industrial America will work out in the day-by-day realities of politics, economics, and society will be determined by the usual interplay of individual choices and pure dumb luck. That said, the broad trends and overall patterns are worth tracking in their own right, and some things that look as though they ought to belong to the realm of the unpredictable—for example, the political and military dynamics of border regions, or the relations among the imperial society’s political class, its increasingly disenfranchised lower classes, and the peoples outside its borders—follow predictable patterns in case after case in history, and show every sign of doing the same thing this time around too.
What I’m suggesting, in fact, is that in a very real sense, it’s possible to map out the history of North America over the next five centuries or so in advance. That’s a sweeping claim, and I’m well aware that the immediate response of at least some of my readers will be to reject the possibility out of hand. I’d like to encourage those who have this reaction to try to keep an open mind. In the posts to come, I plan on illustrating every significant point I make with historical examples from the twilight years of other civilizations, as well as evidence from the current example insofar as that’s available yet.  Thus it should be possible for my readers to follow the argument as it unfolds and see how it hangs together.
Now of course all this presupposes that the lessons of the past actually have some relevance to our future. I’m aware that that’s a controversial proposal these days, but to my mind the controversy says more about the popular idiocies of our time than it does about the facts on the ground. I’ve discussed in previous posts how people in today’s America have taken to using thoughtstoppers such as "but it’s different this time!" to protect themselves from learning anything from history—a habit that no doubt does wonders for their peace of mind today, though it pretty much guarantees them a face-first collision with a brick wall of misery and failure not much further down time’s road. Those who insist on clinging to that habit are not going to find the next year or so of posts here to their taste.
They won’t be the only ones. Among the resources I plan on using to trace out the history of the next five centuries is the current state of the art in the environmental sciences, and that includes the very substantial body of evidence and research on anthropogenic climate change. I’m aware that some people consider that controversial, and of course some very rich corporate interests have invested a lot of money into convincing people that it’s controversial, but I’ve read extensively on all sides of the subject, and the arguments against taking anthropogenic climate change seriously strike me as specious. I don’t propose to debate the matter here, either—there are plenty of forums for that. While I propose to leaven current model-based estimates on climate change and sea level rise with the evidence from paleoclimatology, those who insist that there’s nothing at all the matter with treating the atmosphere as an aerial sewer for greenhouse gases are not going to be happy with the posts ahead.
I also propose to discuss industrial civilization’s decline and fall without trying to sugarcoat the harsher dimensions of that process, and that’s going to ruffle yet another set of feathers. Regular readers will recall a post earlier this year discussing the desperate attempts to insist that it won’t be that bad, really it won’t, that were starting to show up in the flurry of criticism each of these weekly essays reliably fields.  That’s even more common now than it was then; nowadays, in fact, whenever one of my posts uses words such as "decline" or "dark age," I can count on being taken to task by critics who insist earnestly that such language is too negative, that of course we’re facing a shift to a different kind of society but I shouldn’t describe it in such disempowering terms, and so on through the whole vocabulary of the obligatory optimism that’s so fashionable among the privileged these days.
I’m pretty sure, as noted in the blog post just cited, that this marks the beginning of a shift by the peak oil community as a whole out of the second of Elisabeth Kübler-Ross’s famous five stages, the stage of anger, into the third stage of bargaining. That’s welcome, in that it brings us closer to the point at which people have finished dealing with their own psychological issues and can get to work coping with the predicament of our time, but it’s still as much an evasion of that predicament as denial and anger were. The fall of a civilization is not a pleasant prospect—and that’s what we’re talking about, of course: the decline and fall of industrial civilization, the long passage through a dark age, and the first stirrings of the successor societies that will build on our ruins. That’s how the life cycle of a civilization ends, and it’s the way that ours is ending right now.
What that means in practice is that most of the familiar assumptions people in the industrial world like to make about the future will be stood on their heads in the decades and centuries ahead. Most of the rhetoric being splashed about these days in support of this or that or the other Great Turning that will save us from the consequences of our own actions assumes, as a matter of course, that a majority of people in the United States—or, heaven help us, in the whole industrial world—can and will come together around some broadly accepted set of values and some agreed-upon plan of action to rescue industrial civilization from the rising spiral of crises that surrounds it. My readers may have noticed that things seem to be moving in the opposite direction, and history suggests that they’re quite correct.
Among the standard phenomena of decline and fall, in fact, is the shattering of the collective consensus that gives a growing society the capacity to act together to accomplish much of anything at all.  The schism between the political class and the rest of the population—you can certainly call these "the 1%" and "the 99%" if you wish—is simply the most visible of the fissures that spread through every declining civilization, breaking it into a crazy quilt of dissident fragments pursuing competing ideals and agendas. That process has a predictable endpoint, too:  as the increasingly grotesque misbehavior of the political class loses it whatever respect and loyalty it once received from the rest of society, and the masses abandon their trust in the political institutions of their society, charismatic leaders from outside the political class fill the vacuum, violence becomes the normal arbiter of power, and the rule of law becomes a polite fiction when it isn’t simply abandoned altogether.
The economic sphere of a society in decline undergoes a parallel fragmentation for different reasons. In ages of economic expansion, the labor of the working classes yields enough profit to cover the costs of a more or less complex superstructure, whether that superstructure consists of the pharaohs and priesthoods of ancient Egypt or the bureaucrats and investment bankers of late industrial America. As expansion gives way to contraction, the production of goods and services no longer yields the profit it once did, but the members of the political class, whose power and wealth depend on the superstructure, are predictably unwilling to lose their privileged status and have the power to keep themselves fed at everyone else’s expense. The reliable result is a squeeze on productive economic activity that drives a declining civilization into one convulsive financial crisis after another, and ends by shredding its capacity to produce even the most necessary goods and services.
In response, people begin dropping out of the economic mainstream altogether, because scrabbling for subsistence on the economic fringes is less futile than trying to get by in a system increasingly rigged against them. Rising taxes, declining government services, and systematic privatization of public goods by the rich compete to alienate more and more people from the established order, and the debasement of the money system in an attempt to make up for faltering tax revenues drives more and more economic activity into forms of exchange that don’t involve money at all.  As the monetary system fails, in turn, economies of scale become impossible to exploit; the economy fragments and simplifies until bare economic subsistence on local resources, occasionally supplemented by plunder, becomes the sole surviving form of economic activity.
Taken together, these patterns of political fragmentation and economic unraveling send the political class of a failing civilization on a feet-first journey through the exit doors of history.  The only skills its members have, by and large, are those needed to manipulate the complex political and economic levers of their society, and their power depends entirely on the active loyalty of their subordinates, all the way down the chain of command, and the passive obedience of the rest of society.  The collapse of political institutions strips the political class of any claim to legitimacy, the breakdown of the economic system limits its ability to buy the loyalty of those that it can no longer inspire, the breakdown of the levers of control strips its members of the only actual power they’ve got, and that’s when they find themselves having to compete for followers with the charismatic leaders rising just then from the lower echelons of society. The endgame, far more often than not, comes when the political class tries to hire the rising leaders of the disenfranchised as a source of muscle to control the rest of the populace, and finds out the hard way that it’s the people who carry the weapons, not the ones who think they’re giving the orders, who actually exercise power.
The implosion of the political class has implications that go well beyond a simple change in personnel at the upper levels of society. The political and social fragmentation mentioned earlier applies just as forcefully to the less tangible dimensions of human life—its ideas and ideals, its beliefs and values and cultural practices. As a civilization tips over into decline, its educational and cultural institutions, its arts, literature, sciences, philosophies and religions all become identified with its political class; this isn’t an accident, as the political class generally goes out of its way to exploit all these things for the sake of its own faltering authority and influence. To those outside the political class, in turn, the high culture of the civilization becomes alien and hateful, and when the political class goes down, the cultural resources that it harnessed to its service go down with it.
Sometimes, some of those resources get salvaged by subcultures for their own purposes, as Christian monks and nuns salvaged portions of classical Greek and Roman philosophy and science for the greater glory of God. That’s not guaranteed, though, and even when it does happen, the salvage crew picks and chooses for its own reasons—the survival of classical Greek astronomy in the early medieval West, for example, happened simply because the Church needed to know how to calculate the date of Easter. Where no such motive exists, losses can be total: of the immense corpus of Roman music, the only thing that survives is a fragment of one tune that takes about 25 seconds to play, and there are historical examples in which even the simple trick of literacy got lost during the implosion of a civilization, and had to be imported centuries later from somewhere else.
All these transformations impact the human ecology of a falling civilization—that is, the basic relationships with the natural world on which every human society depends for day to day survival. Most civilizations know perfectly well what has to be done to keep topsoil in place, irrigation water flowing, harvests coming in, and all the other details of human interaction with the environment on a stable footing. The problem is always how to meet the required costs as economic growth ends, contraction sets in, and the ability of central governments to enforce their edicts begins to unravel. The habit of feeding the superstructure at the expense of everything else impacts the environment just as forcefully as it does the working classes:  just as wages drop to starvation levels and keep falling, funding for necessary investments in infrastructure, fallow periods needed for crop rotation, and the other inputs that keep an agricultural system going in a sustainable manner all get cut. 
As a result, topsoil washes away, agricultural hinterlands degrade into deserts or swamps, vital infrastructure collapses from malign neglect, and the ability of the land to support human life starts on the cascading descent that characterizes the end stage of decline—and so, in turn, does population, because human numbers in the last analysis are a dependent variable, not an independent one. Populations don’t grow or shrink because people just up and decide one day to have more or fewer babies; they’re constrained by ecological limits. In an expanding civilization, as its wealth and resource base increases, the population expands as well, since people can afford to have more children, and since more of the children born each year have access to the nutrition and basic health care that let them survive to breeding age themselves.  When growth gives way to decline, population typically keeps rising for another generation or so due to sheer demographic momentum, and then begins to fall.
The consequences can be traced in the history of every collapsing civilization.  As the rural economy implodes due to agricultural failure on top of the more general economic decline, a growing fraction of the population concentrates in urban slum districts, and as public health measures collapse, these turn into incubators for infectious disease. Epidemics are thus a common feature in the history of declining civilizations, and of course war and famine are also significant factors, but an even larger toll is taken by the constant upward pressure exerted on death rates by poverty, malnutrition, crowding, and stress. As deaths outnumber births, population goes into a decline that can easily continue for centuries. It’s far from uncommon for the population of an area in the wake of a civilization to equal less than 10% of the figure it reached at the precollapse peak.
Factor these patterns together, follow them out over the usual one to three centuries of spiralling decline, and you have the standard picture of a dark age society: a mostly deserted countryside of small and scattered villages where subsistence farmers, illiterate and impoverished, struggle to coax fertility back into the depleted topsoil. Their governments consist of the personal rule of local warlords, who take a share of each year’s harvest in exchange for protection from raiders and rough justice administered in the shade of any convenient tree. Their literature consists of poems, lovingly memorized and chanted to the sound of a simple stringed instrument, recalling the great deeds of the charismatic leaders of a vanished age, and these same poems also contain everything they know about their history. Their health care consists of herbs, a little rough surgery, and incantations cannily used to exploit the placebo effect. Their science—well, I’ll let you imagine that for yourself.
And the legacy of the past? Here’s some of what an anonymous poet in one dark age had to say about the previous civilization:
Bright were the halls then, many the bath-houses,
High the gables, loud the joyful clamor,
Many the meadhalls full of delights
Until mighty Fate overthrew it all.
Wide was the slaughter, the plague-time came,
Death took away all those brave men.
Broken their ramparts, fallen their halls,
The city decayed; those who built it
Fell to the earth. Thus these courts crumble,
And roof-tiles fall from this arch of stone.
Fans of Anglo-Saxon poetry will recognize that as a passage from "The Ruin." If the processes of history follow their normal pattern, they will be chanting poems like this about the ruins of our cities four or five centuries from now. How we’ll get there, and what is likely to happen en route, will be the subject of most of the posts here for the next year or so.

In a Handful of Dust

Wed, 2014-07-02 17:03
All things considered, it’s a good time to think about how much we can know about the future in advance. A hundred years ago last Saturday, as all my European readers know and a few of my American readers might have heard, a young Bosnian man named Gavrilo Princip lunged out of a crowd in Sarajevo and emptied a pistol into the Archduke Franz Ferdinand and his wife Sophie, who were touring that corner of the ramshackle Austro-Hungarian empire they were expected to inherit in due time. Over the summer months that followed, as a direct result of those gunshots, most of the nations of Europe went to war with one another, and the shockwaves set in motion by that war brought a global order centuries old crashing down.
In one sense, none of this was a surprise. Perceptive observers of the European scene had been aware for decades of the likelihood of a head-on crash between the rising power of Germany and the aging and increasingly fragile British Empire. The decade and a half before war actually broke out had seen an increasingly frantic scramble for military alliances that united longtime rivals Britain and France in a political marriage of convenience with the Russian Empire, in the hope of containing Germany’s growing economic and military might. Every major power poured much of its wealth into armaments, sparking an arms race so rapid that the most powerful warship on the planet in 1906, Britain’s mighty HMS Dreadnought, was hopelessly obsolete when war broke out eight years later.
Inquiring minds could read learned treatises by Halford Mackinder and many other scholars, explaining why conflict between Britain and Germany was inevitable; they could also take in serious fictional treatments of the subject such as George Chesney’s The Battle of Dorking and Saki’s When William Came, or comic versions such as P.G. Wodehouse’s The Swoop!. Though most military thinkers remained stuck in the Napoleonic mode of conflict chronicled in the pages of Karl von Clausewitz’ On War, those observers of the military scene who paid attention to the events of the American Civil War’s closing campaigns might even have been able to sense something of the trench warfare that would dominate the coming war on the western front.
It’s only fair to remember that a great many prophecies in circulation at that same time turned out to be utterly mistaken. Most of them, however, had a theme in common that regular readers of this blog will find quite familiar: the claim that because of some loudly ballyhooed factor or other, it really was different this time. Thus, for example, plenty of pundits insisted in the popular media that economic globalization had made the world’s economies so interdependent that war between the major powers was no longer possible. Equally, there was no shortage of claims that this or that or the other major technological advance had either rendered war impossible, or guaranteed that a war between the great powers would be over in weeks. Then as now, those who knew their history knew that any claim about the future that begins “It’s different this time” is almost certain to be wrong.
All things considered, it was not exactly difficult in the late spring of 1914, for those who were willing to do so, to peer into the future and see the shadow of a major war between Britain and Germany rising up to meet them. There were, in fact, many people who did just that. To go further and guess how it would happen, though, was quite another matter.  Some people came remarkably close; Bismarck, who was one of the keenest political minds of his time, is said to have commented wearily that the next great European war would probably be set off by some idiotic event in the Balkans.  Still, not even Bismarck could have anticipated the cascade of misjudgments and unintended consequences that sent this particular crisis spinning out of control in a way that half a dozen previous crises had not done.
What’s more, the events that followed the outbreak of war in the summer of 1914 quickly flung themselves off the tracks intended for them by the various political leaders and high commands, and carved out a trajectory of their own that nobody anywhere seems to have anticipated. That the Anglo-French alliance would squander its considerable military and economic superiority by refusing to abandon a bad strategy no matter how utterly it failed or how much it cost; that Russia’s immense armies would prove so feeble under pressure; that Germany would combine military genius and political stupidity in so stunningly self-defeating a fashion; that the United States would turn out to be the wild card in the game, coming down decisively on the Allied side just when the war had begun to turn in Germany’s favor—none of that was predicted, or could have been predicted, by anyone.
Nor were the consequences of the war any easier to foresee. On that bright summer day in 1914 when Gavrilo Princip burst from the crowd with a pistol in his hand, who could have anticipated the Soviet Union, the Great Depression, the blitzkrieg, or the Holocaust? Who would have guessed that the victor in the great struggle between Britain and Germany would turn out to be the United States?  The awareness that Britain and Germany were racing toward a head-on collision did not provide any certain knowledge about how the resulting crash would turn out, or what its consequences would be; all that could be known for sure was that an impact was imminent and the comfortable certainties of the prewar world would not survive the shock.
That dichotomy, between broad patterns that are knowable in advance and specific details that aren’t, is very common in history. It’s possible, for example, that an impartial observer who assessed the state of the Roman Empire in 400 or so could have predicted the collapse of Roman power outside the Eastern Mediterranean littoral. As far as I know, no one did so—the ideological basis of Roman society made the empire’s implosion just as unthinkable then as the end of progress is today—but the possibility was arguably there. Even if an observer had been able to anticipate the overall shape of the Roman and post-Roman future, though, that anticipation wouldn’t have reached as far as the specifics of the collapse, and let’s not even talk about whether our observer might have guessed that the last Emperor of Rome in the west would turn out to be the son of Attila the Hun’s secretary, as in fact he was.
Such reflections are on my mind rather more than usual just now, for reasons that will probably come as no surprise to regular readers of this blog. For a variety of reasons, a few of which I’ll summarize in the paragraphs ahead, I think it’s very possible that the United States and the industrial world in general are near the brink of a convulsive era of crisis at least as severe as the one that began in the summer of 1914. It seems very likely to me that in the years immediately ahead, a great many of the comfortable certainties of the last half century or so are going to be thrown overboard once and for all, as waves of drastic political, economic, military, social, and ecological change slam into societies that, despite decades of cogent warnings, have done precisely nothing to prepare for them.
I want to review here some of the reasons why I expect an era of crisis to arrive sooner rather than later. One of the most important of those reasons is the twilight of the late (and soon to be loudly lamented) fracking bubble. I’ve noted in previous posts here that the main product of the current fracking industry is neither oil nor gas, but the same sort of dubiously priced financial paper we all got to know and love in the aftermath of last decade’s real estate bubble. These days, the rickety fabric of American finance depends for its survival on a steady flow of hallucinatory wealth, since the production of mere goods and services no longer produces enough profit to support the Brobdingnagian superstructure of the financial industry and its swarm of attendant businesses. These days, too, an increasingly brittle global political order depends for its survival on the pretense that the United States is still the superpower it was decades ago, and all those strident and silly claims that the US is about to morph into a "Saudi America" flush with oil wealth are simply useful evasions that allow the day of reckoning, with its inevitable reshuffling of political and economic status, to be put off a little longer.
Unfortunately for all those involved, the geological realities on which the fracking bubble depends are not showing any particular willingness to cooperate. The downgrading of the Monterey Shale not long ago was just the latest piece of writing on the wall: one more sign that we’re scraping the bottom of the oil barrel under the delusion that this proves the barrel is still full. The fact that most of the companies in the fracking industry are paying their bills by running up debt, since their expenses are considerably greater than their earnings, is another sign of trouble that ought to be very familiar to those of us who watched the housing bubble go through its cycle of boom and bust.
Bubbles are like empires; if you watch one rise, you can be sure that it’s going to fall. What you don’t know, and can’t know, is when and how. That’s a trap that catches plenty of otherwise savvy investors. They see a bubble get under way, recognize it as a bubble, put money into it under the fond illusion that they can anticipate the bust and pull their money out right before the bottom drops out...and then, like everyone else, they get caught flatfooted by the end of the bubble and lose their shirts. That’s one of the great and usually unlearned lessons of finance: when a bubble gets going, it’s the pseudo-smart money that piles into it—the really smart money heads for the hills.
So it’s anyone’s guess when exactly the fracking bubble is going to pop, and even more uncertain how much damage it’s going to do to what remains of the US economy. A good midrange guess might be that it’ll have roughly the same impact that the popping of the housing bubble had in 2008 and 2009, but it could be well to either side of that estimate. Crucially, though, the damage that it does will be landing on an economy that has never really recovered from the 2008-2009 housing crash, in which actual joblessness (as distinct from heavily manipulated unemployment figures) is at historic levels and a very large number of people are scrambling for survival. At this point, another sharp downturn would make things much worse for a great many millions whose prospects aren’t that good to begin with, and that has implications that cross the border from economics into politics.
Meanwhile, the political scene in the United States is primed for an explosion. One of my regular readers—tip of the archdruid’s hat to Andy Brown—is a research anthropologist who recently spent ten weeks traveling around the United States asking people about their opinions and feelings concerning government. What he found was that, straight across geographical, political, and economic dividing lines, everyone he interviewed described the US government as the corrupt sock puppet of wealthy interests. He noted that he couldn’t recall ever encountering so broad a consensus on any political subject, much less one as explosive as this. 
Recent surveys bear him out. Only 7% of Americans feel any significant confidence in Congress.  Corresponding figures for the presidency and the Supreme Court are 29% and 30% respectively; fewer than a third of Americans, that is, place much trust in the political institutions whose birth we’ll be celebrating in a few days. This marks a tectonic shift of immense importance.  Not that many decades ago, substantial majorities of Americans believed in the essential goodness of the institutions that governed their country. Even those who condemned the individuals running those institutions—and of course that’s always been one of our national sports—routinely phrased those condemnations in terms reflecting a basic faith in the institutions themselves, and in the American experiment as a whole.
Those days are evidently over. The collapse of legitimacy currently under way in the United States is a familiar sight to students of history, who can point to dozens of comparable examples; each of these was followed, after no very long delay, by the collapse of the system of government whose legitimacy in the eyes of its people had gone missing in action. Those of my readers who are curious about such things might find it educational to read a good history of the French or the Russian revolutions, the collapse of the Weimar Republic or the Soviet Union, or any of the other implosions of political authority that have littered the last few centuries with rubble: when a system loses legitimacy in the eyes of the people it claims to lead, the end of that system is on its way.
The mechanics behind the collapse are worth a glance as well. Whether or not political power derives from the consent of the governed, as American political theory insists, it’s unarguably true that political power depends from moment to moment on the consent of the people who do the day-to-day work of governing:  the soldiers, police officers, bureaucrats and clerks whose job is to see to it that orders from the leadership get carried out. Their obedience is the linchpin on which the survival of a regime rests, and it’s usually also the fault line along which regimes shatter, because these low-ranking and poorly paid functionaries aren’t members of the elite. They’re ordinary working joes and janes, subject to the same cultural pressures as their neighbors, and they generally stop believing in the system they serve about the same time as their neighbors do. That doesn’t stop them from serving it, but it does very reliably make them unwilling to lay down their lives in its defense, and if a viable alternative emerges, they’re rarely slow to jump ship.
Here in America, as a result of the processes just surveyed, we’ve got a society facing a well-known pattern of terminal crisis, with a gridlocked political system that’s lost its legitimacy in the eyes of the people it governs, coupled with a baroque and dysfunctional economic system lurching toward another cyclical collapse under the weight of its own hopelessly inefficient management of wealth. This is not a recipe for a comfortable future. The situation has become dire enough that some of the wealthiest beneficiaries of the system—usually the last people to notice what’s happening, until the mob armed with torches and pitchforks shows up at their mansion’s front door—have belatedly noticed that robbing the rest of society blind is not a habit with a long shelf life, and have begun to suggest that if the rich don’t fancy the thought of dangling from lampposts, they might want to consider a change in approach. 
In its own way, this recognition is a promising sign. Similar realizations some eighty years ago put Franklin Roosevelt in the White House and spared the United States the hard choice between civil war and authoritarian rule that so many other countries were facing just then.  Unless a great many more members of our kleptocratic upper class experience the same sort of wake-up call in a hurry, though, the result this time is likely to be far too little and much too late.
Here again, though, a recognition that some kind of crash is coming doesn’t amount to foreknowledge of when it’s going to hit, how it’s going to play out, or what the results will be. If the implosion of the fracking bubble leads to one more round of bailouts for the rich and cutbacks for the poor, we could see the inner cities explode as they did in the long hot summers of the 1960s, setting off the insurgency that was so narrowly avoided in those years, and plunging the nation into a long nightmare of roadside bombs, guerrilla raids, government reprisals, and random drone strikes. If a talented demagogue shows up in the right place and time, we might instead see the rise of a neofascist movement that would feed on the abandoned center of American politics and replace the rusted scraps of America’s democratic institutions with a shiny new dictatorship.
If the federal government’s gridlock stiffens any further toward rigor mortis, for that matter, we could see the states force a constitutional convention that could completely rewrite the terms of our national life, or simply dissolve the Union and allow new regional nations to take shape.  Alternatively, if a great many factors break the right way, and enough people in and out of the corridors of power take the realities of our predicament seriously and unexpectedly grow some gonads—either kind, take your pick—we might just be able to stumble through the crisis years into an era of national retrenchment and reassessment, in which many of the bad habits picked up during America’s century of empire get chucked in history’s compost bin, and some of the ideals that helped inspire this country get a little more attention for a while. That may not be a likely outcome, but I think it’s still barely possible.
All we can do is wait and see what happens, or try to take action in the clear awareness that we can’t know what effects our actions will have. Thinking about that predicament, I find myself remembering lines from the bleak and brilliant poetic testament of the generation that came of age in the aftermath of those gunshots in Sarajevo, T.S. Eliot’s The Waste Land:
What are the roots that clutch, what branches grow
Out of this stony rubbish? Son of man,
You cannot say, or guess, for you know only
A heap of broken images, where the sun beats,
And the dead tree gives no shelter, the cricket no relief,
And the dry stone no sound of water. Only
There is shadow under this red rock
(Come in under the shadow of this red rock),
And I will show you something different from either
Your shadow at morning striding behind you
Or your shadow at evening rising up to meet you:
I will show you fear in a handful of dust.
It’s a crisp metaphor for the challenges of our time, as it was of those in the time about which Eliot wrote. For that matter, the quest to see something other than our own shadows projected forward on the future or backward onto the past has a broader significance for the project of this blog. With next week’s post, I plan on taking that quest a step further. The handful of dust I intend to offer my readers for their contemplation is the broader trajectory of which the impending crisis of the United States is one detail: the descent of industrial civilization over the next few centuries into a deindustrial dark age.

The Broken Thread of Culture

Wed, 2014-06-25 17:38
There are times when the deindustrial future seems to whisper in the night like a wind blowing through the trees, sending the easy certainties of the present spinning like dead leaves. I had one of those moments recently, courtesy of a news story from 1997 that a reader forwarded me, about the spread of secret stories among homeless children in Florida’s Dade County.  These aren’t your ordinary children’s stories: they’re myths in the making, a bricolage of images from popular religion and folklore torn from their original contexts and pressed into the service of a harsh new vision of reality.
God, according to Dade County’s homeless children, is missing in action; demons stormed Heaven a while back and God hasn’t been seen since. The mother of Christ murdered her son and morphed into the terrifying Bloody Mary, a nightmare being who weeps blood from eyeless sockets and seeks out children to kill them.  Opposing her is a mysterious spirit from the ocean who takes the form of a blue-skinned woman, and who can protect children who know her secret name. The angels, though driven out of Heaven, haven’t given up; they carry on their fight against the demons from a hidden camp in the jungle somewhere outside Miami, guarded by friendly alligators who devour hostile intruders. The spirits of children who die in Dade County’s pervasive gang warfare can go to the camp and join the war against the demons, so long as someone who knows the stories puts a leaf on their graves.
This isn’t the sort of worldview you’d expect from people living in a prosperous, scientifically literate industrial society, but then the children in Dade County’s homeless shelters don’t fit that description in any meaningful sense. They live in conditions indistinguishable from the worst end of the Third World; their lives are defined by poverty, hunger, substance abuse, shattered families, constant uncertainty, and lethal violence dispensed at random. If, as Bruce Sterling suggested, the future is already here, just not evenly distributed yet, they’re the involuntary early adopters of a future very few people want to think about just now, but many of us will experience in the decades ahead, and most of humanity will face in the centuries that follow: a future we may as well call by the time-honored label "dark age."
That label actually dates from before the period most often assigned it these days. Marcus Terentius Varro, who was considered the most erudite Roman scholar of his time, divided up the history known to him into three ages—an age of history, for which there were written records; before that, an age of fable, from which oral traditions survived; and before that, a dark age, about which no one knew anything at all. It’s a simple division but a surprisingly useful one; even in those dark ages where literacy survived as a living tradition, records tend to be extremely sparse and unhelpful, and when records pick up again they tend to be thickly frosted with fable and legend for a good long while thereafter. In a dark age, the thread of collective memory and cultural continuity snaps, the ends are lost, and a new thread must be spun from whatever raw materials happen to be on hand.
There are many other ways to talk about dark ages, and we’ll get to those in later posts, but I want to focus on this aspect for the moment. Before the Greco-Roman world Varro knew, an earlier age of complex, literate civilizations had flourished and then fallen, and the dark age that followed was so severe that in many regions—Greece was one of them—even the trick of written language was lost, and had to be imported from elsewhere centuries afterwards. The dark age following Varro’s time wasn’t quite that extreme, but it was close enough; literacy became a rare attainment, and vast amounts of scientific, technical, and cultural knowledge were lost. To my mind, that discontinuity demands more attention than it’s usually been given.  What is it that snaps the thread that connects past to present, and allows the accumulated knowledge of an entire civilization to fall into oblivion?
A recurring historical process lies behind that failure of transmission, and it’s one that can be seen at work in those homeless children of Dade County, whispering strange stories to one another in the night.
Arnold Toynbee, whose monumental work A Study of History has been a major inspiration to this blog’s project, proposed that civilizations on the way to history’s compost heap always fail in the same general way. The most important factor that makes a rising civilization work, he suggested, is mimesis—the universal human habit by which people imitate the behavior and attitudes of those they admire. As long as the political class of a civilization can inspire admiration and affection from those below it, the civilization thrives, because the shared sense of values and purpose generated by mimesis keeps the pressures of competing class interests from tearing it apart.
Civilizations fail, in turn, because their political classes lose the ability to inspire mimesis, and this happens in turn because members of the elite become so fixated on maintaining their own power and privilege that they stop doing an adequate job of addressing the problems facing their society.  As those problems spin further and further out of control, the political class loses the ability to inspire and settles instead for the ability to dominate. Outside the political class and its hangers-on, in turn, more and more of the population becomes what Toynbee calls an internal proletariat, an increasingly sullen underclass that still provides the political class with its cannon fodder and labor force but no longer sees anything to admire or emulate in those who order it around.
It can be an unsettling experience to read American newspapers or wide-circulation magazines from before 1960 or so with eyes sharpened by Toynbee’s analysis.  Most newspapers included a feature known as the society pages, which chronicled the social and business activities of the well-to-do, and those were read, with a sort of fascinated envy, very far down the social pyramid. Established figures of the political and business world were treated with a degree of effusive respect you won’t find in today’s media, and even those who hoped to shoulder aside this politician or that businessman rarely dreamed of anything more radical than filling the same positions themselves. Nowadays? Watching politicians, businesspeople, and celebrities get dragged down by some wretched scandal or other is this nation’s most popular spectator sport.
That’s what happens when mimesis breaks down. The failure to inspire has disastrous consequences for the political class—when the only things left that motivate people to seek political office are cravings for power or money, you’ve pretty much guaranteed that the only leaders you’ll get are the sort of incompetent hacks who dominate today’s political scene—but I want to concentrate for a moment on the effects on the other end of the spectrum. The failure of the political class to inspire mimesis in the rest of society doesn’t mean that mimesis goes away. The habit of imitation is as universal among humans as it is among other social primates. The question becomes this:  what will inspire mimesis among the internal proletariat? What will they use as the templates for their choices and their lives?
That’s a crucial question, because it’s not just social cohesion that depends on mimesis.  The survival of the collective knowledge of a society—the thread connecting past with present I mentioned earlier—also depends on the innate habit of imitation. In most human societies, children learn most of what they need to know about the world by imitating parents, older siblings, and the like, and in the process the skills and knowledge base of the society is passed on to each new generation. Complex societies like ours do the same thing in a less straightforward way, but the principle is still the same. Back in the day, what motivated so many young people to fiddle with chemistry sets? More often than not, mimesis—the desire to be just like a real scientist, making real discoveries—and that was reasonable in the days when a significant fraction of those young people could expect to grow up to be real scientists.
That still happens, but it’s less and less common these days, and for those who belong to the rapidly expanding underclass of American society—the homeless children in Dade County I mentioned at the beginning of this essay, for example—the sort of mimesis that might lead to a career in science isn’t even an option. A great many of those children won’t live to reach adulthood, and they know it; those who do manage to dodge the stray bullets and the impact of collapsing public health, by and large, will spend their days in the crumbling, crowded warehouse facilities that substitute for schools in this country’s poorer neighborhoods, where maybe half of each graduating high school class comes out functionally illiterate; their chances of getting a decent job of any kind weren’t good even before the global economy started unraveling, and let’s not even talk about those chances now.
When imitating the examples offered by the privileged becomes a dead end, in other words, people find other examples to imitate. That’s one of the core factors, I’m convinced, behind the collapse of the reputation of the sciences in contemporary American society, which is so often bemoaned by scientists and science educators.  Neil DeGrasse Tyson, say, may rhapsodize about the glories of science, but what exactly do those glories have to offer children huddling in an abandoned house in some down-at-heels Miami suburb, whose main concerns are finding ways to get enough to eat and stay out of the way of the latest turf war between the local drug gangs?
Now of course there’s been a standard kneejerk answer to such questions for the last century or so. That answer was that science and technology would eventually create such abundance that everyone in the world would be able to enjoy a middle-class lifestyle and its attendant opportunities.  That same claim can still be heard nowadays, though it’s grown shrill of late after repeated disconfirmation. In point of fact, for the lower 80% of Americans by income, the zenith of prosperity was reached in the third quarter of the 20th century, and it’s all been downhill from there. This isn’t an accident; what the rhetoric of progress through science misses is that the advance of science may have been a necessary condition for the boomtimes of the industrial age, but it was never a sufficient condition in itself.
The other half of the equation was the resource base on which industrial civilization depended. Three centuries ago, as industrialism got under way, it could draw on vast amounts of cheap, concentrated energy in the form of fossil fuels, which had been stored up in the Earth’s crust over the previous half billion years or so. It could draw on equally huge stocks of raw materials of various kinds, and it could also make use of a biosphere whose capacity to absorb pollutants and other environmental insults hadn’t yet been overloaded to the breaking point by human activity. None of those conditions still obtain, and the popular insistence that the economic abundance of the recent past must inevitably be maintained in the absence of the material conditions that made it possible—well, let’s just say that makes a tolerably good example of faith-based thinking.
Thus Tyson is on one side of the schism Toynbee traced out, and the homeless children of Dade County and their peers and soon-to-be-peers elsewhere in America and the world are on the other. He may denounce superstition and praise reason and science until the cows come home, but again, what possible relevance does that have for those children? His promises are for the privileged, not for them; whatever benefits further advances in technology might still have to offer will go to the dwindling circle of those who can still afford such things, not to the poor and desperate.  Of course that simply points out another way of talking about Toynbee’s schism:  Tyson thinks he lives in a progressing society, while the homeless children of Dade County know that they live in a collapsing one.
As the numbers shift toward the far side of that dividing line, and more and more Americans find themselves struggling to cope with a new and unwelcome existence in which talk about progress and prosperity amounts to a bad joke, the failure of mimesis—as in the fallen civilizations of the past—will become a massive social force. If the usual patterns play themselves out, there will be a phase when the  leaders of successful drug gangs, the barbarian warbands of our decline and fall, will attract the same rock-star charisma that clung to Attila, Alaric, Genseric and their peers. The first traces of that process are already visible; just as young Romans in the fourth century adopted the clothes and manners of Visigoths, it’s not unusual to see the children of white families in the suburban upper middle class copying the clothing and culture of inner city gang members.
Eventually, to judge by past examples, this particular mimesis is likely to extend a great deal further than it has so far. It’s when the internal proletariat turns on the failed dominant minority and makes common cause with what Toynbee calls the external proletariat—the people who live just beyond the borders of the falling civilization, who have been shut out from its benefits but burdened with many of its costs, and who will eventually tear the corpse of the civilization to bloody shreds—that civilizations make the harsh transition from decline to fall. That transition hasn’t arrived yet for our civilization, and exactly when it will arrive is by no means a simple question, but the first whispers of its approach are already audible for those who know what to listen for and are willing to hear.
The age of charismatic warlords, though, is an epoch of transition rather than an enduring reality.  The most colorful figures of that age, remade by the workings of the popular imagination, become the focus of folk memories and epic poetry in the ages that follow; Theodoric the Ostrogoth becomes Dietrich von Bern and the war leader Artorius becomes the courtly King Arthur, taking their place alongside Gilgamesh, Arjuna, Achilles, Yoshitsune, and their many equivalents. In their new form as heroes of romance, they have a significant role to play as objects of mimesis, but it tends to be restricted to specific classes, and finds a place within broader patterns of mimesis that draw from other sources.
And those other sources?  What evidence we have—for the early stages of their emergence are rarely well documented—suggests that they begin as strange stories whispered in the night, stories that deliberately devalue the most basic images and assumptions of a dying civilization to find meaning in a world those images and assumptions no longer explain.
Two millennia ago, for example, the classical Greco-Roman world imagined itself seated comfortably at the summit of history.  Religious people in that culture gloried in gods that had reduced primal chaos to permanent order and exercised a calm rulership over the cosmos; those who rejected traditional religion in favor of rationalism—and there was no shortage of those, any more than there is today; it’s a common stage in the life of every civilization—rewrote the same story in secular terms, invoking various philosophical principles of order to fill the role of the gods of Olympus; political thinkers defined history in the same terms, with the Roman Empire standing in for Jupiter Optimus Maximus. It was a very comforting way of thinking about the world, if you happened to be a member of the gradually narrowing circle of those who benefited from the existing order of society.
To those who formed the nucleus of the Roman Empire’s internal proletariat, though, to slaves and the urban poor, that way of thinking communicated no meaning and offered no hope. The scraps of evidence that survived the fall of the Roman world suggest that a great many different stories got whispered in the darkness, but those stories increasingly came to center around a single narrative—a story in which the God who created everything came down to walk the earth as a man, was condemned by a Roman court as a common criminal, and was nailed to a cross and left hanging there to die.
That’s not the sort of worldview you’d expect from people living in a prosperous, philosophically literate classical society, but then the internal proletariat of the Roman world increasingly didn’t fit that description. They were the involuntary early adopters of the post-Roman future, and they needed stories that would give meaning to lives defined by poverty, brutal injustice, uncertainty, and violence. That’s what they found in Christianity, which denied the most basic assumptions of Greco-Roman culture in order to give value to the lived experience of those for whom the Roman world offered least.
This is what the internal proletariat of every collapsing civilization finds in whatever stories become central to the faith of the dark age to come.  It’s what Egyptians in the last years of the Old Kingdom found by abandoning the official Horus-cult in favor of the worship of Osiris, who walked the earth as a man and suffered a brutal death; it’s what many Indians in the twilight of the Guptas and many Chinese in the aftermath of the Han dynasty found by rejecting their traditional faiths in favor of reverence for the Buddha, who walked away from a royal lifestyle to live by his begging bowl and search for a way to leave the miseries of existence behind forever.  Those and the many more examples like them inspired mimesis among those for whom the official beliefs of their civilizations had become a closed book, and became the core around which new societies emerged.
The stories being whispered from one homeless Dade County child to another probably aren’t the stories that will serve that same function as our civilization follows the familiar trajectory of decline and fall. That’s my guess, at least, though of course I could be wrong. What those whispers in the night seem to be telling me is that the trajectory in question is unfolding in the usual way—that those who benefit least from modern industrial civilization are already finding meaning and hope in narratives that deliberately reject our culture’s traditional faiths and overturn the most fundamental presuppositions of our age. As more and more people find themselves in similar straits, in turn, what are whispers in the night just now will take on greater and greater volume, until they drown out the stories that most people take on faith today.

The Stories of our Grandchildren

Wed, 2014-06-18 17:05
Over the last six weeks, in what spare time I could find, I’ve glanced back over the last eight years of weekly Archdruid Report posts, trying to get some sense of where this blog has been and where it might head in the months and years to come. In language the Grateful Dead made famous—well, among those of us in a certain generation, at least—it’s been a long strange trip, crossing terrain not often included in tours of the future of our faltering industrial civilization.
Among those neglected landscapes of the mind, though, the territory that seems most crucial to me involves the role that stories play in shaping our ideas and expectations about the future, and thus our future itself. It’s a surprisingly difficult issue for many people these days to grapple with. Each time I raise it, I can count on hearing from readers who don’t get what I’m saying, usually because they’ve lost track of the distinction between whatever story they’ve gotten stuck in their minds and the far more diffuse and shapeless experiences that the story claims to explain. We tell ourselves stories to explain the world; that much is normal among human beings, and inevitable. The problem creeps in when we lose track of the difference between the narrative map and the experiential territory, and treat (for example) progress as a simple reality, rather than the complex and nuanced set of interpretations we lay over the facts of history to turn them into incidents in a familiar narrative.
During the time just past, I’ve had several reminders of the power of stories to shape the world of human experience, and the way those stories can get out of step with the facts on the ground. I’d like to take a moment to talk about a couple of those just now.
The first reminder made quite a splash in the news media a couple of weeks ago, when the Energy Information Administration (EIA)—the US bureaucracy that publishes statistics about American energy resources and production—was forced to admit in public that, well, actually, there was only about 4% as much economically extractable oil in the Monterey Shale in California as they’d claimed a few years earlier. Given that this same Monterey Shale was supposed to provide us with around two-thirds of the oil that was allegedly going to turn the United States into a major oil exporter again by 2020, this was not precisely a minor issue. How many other oil shale deposits are facing similar downgrades? That’s a good question, and one which the EIA seems noticeably unwilling to address.
Bertram Gross pointed out a good many years ago that economic indicators were becoming “economic vindicators,” meant to justify government policy instead of providing accurate glimpses into what’s actually happening in the economic sphere. That certainly seems to have been one of the things behind the EIA’s stratospherically overenthusiastic estimates.  Equally, the US government seems to have responded to the current boom in shale with exactly the same sort of mindless cheerleading it displayed during the housing bubble that popped in 2008 and the tech stock bubble that popped in 2001. I trust it hasn’t escaped the attention of my readers that the key product of the shale oil boom hasn’t been oil or natural gas, but bundled shale leases and similar scraps of overpriced paper, sold to gullible investors with the same gaudy promises of fast wealth and starry-eyed disdain for mere economic reality that fed those earlier bubbles, and drove the market crashes that followed.
Still, there’s more going on here than the common or garden variety political puffery and securities fraud that makes up so much of business as usual in America’s years of decline. The question that needs asking is this:  why are investors who watched those two earlier booms go bust, who either lost money in them or saw many others do so, lining up so eagerly to put their nest eggs into shale-oil companies that are losing money quarter after quarter, and can only stay in business by loading on more and more debt?  Why is the same weary drivel about a new economic era of perpetual prosperity being lapped up so uncritically for a third time in fifteen years, when anyone in possession of three functioning neurons ought to be able to recognize it as a rehash of the same failed hype paraded about in previous bubbles, all the way back to the tulip bubble in the 17th-century Netherlands?
That’s not a rhetorical question; it has an answer, and the answer follows from one of the most popular stories of our culture, the story that says that getting rich is normal. From Horatio Alger right on down to the present, our entertainment media have been overloaded with tales about people who rose up out of poverty and became prosperous. What’s more, during the boom times that made up so much of the 20th century, a modest fraction of those tales were true, or at least not obviously false. Especially but not only in the United States, you could find people who were born poor and died rich. An expanding economy brings that option within reach for some, though—despite the propaganda—never for all.
The story was always at least a little dishonest, as the golden road up from poverty was never normal for more than a certain fraction of the population, and the wealth of the few always depended, as it always does depend in the real world, on the impoverishment of the many. During their 20th century heyday, the world’s industrial societies could pretend that wasn’t the case by the simple expedient of offshoring their poverty to the Third World, and supporting their standards of living at home on the backs of sharecroppers and sweatshop workers overseas. Still, in those same industrial nations, it was possible to ignore that for a while, and to daydream about a future in which every last human being on earth would get to enjoy the benefits of upward mobility in a rapidly expanding economy.
That dream is over and done with. To begin with, the long arc of economic expansion is over; subtract the fake wealth generated by the mass production of unpayable IOUs—the one real growth industry in our economy these days—and we live in an economy in decline, in which real wealth trickles away and  the fraction of the population permanently shut out of the workforce rises year after year.  Downward mobility, not upward mobility, has become a central fact of our time.  The reality has changed, but the story hasn’t, and so investors convinced that their money ought to make them money are easy prey for some grifter in a Brooks Brothers suit who insists that tech stocks, or real estate, or oil shales will inevitably bring them the rising incomes and painless prosperity that the real world no longer provides.
The same sort of mismatch between a popular story and an unwelcome reality defines the second reminder I want to discuss, which popped up during and after the Age of Limits conference late last month in the woods of south central Pennsylvania. That was a very lively and enjoyable event; when Dr. Dennis Meadows, this year’s special guest, noted how pleasant it was to speak to an audience that didn’t have to be convinced of the reality of limits to growth, he spoke for all the presenters and a great many of the attendees as well. For a few days, those of us who attended had the chance to talk about the most important reality of our age—the decline and impending fall of modern industrial civilization—without having to contend minute by minute with the thirty-one flavors of denial so many people use to evade that reality and the responsibilities it brings with it.
That said, there were a few jarring moments, and one of them happened in the interval between my talk on dark ages and Dr. Mark Cochrane’s excellent presentation on the realities of climate change. In the Q&A session after my talk, in response to a question from the audience, I noted how the prestige of science among the general public had taken a beating due to the way that scientific opinions handed down to the public as proven fact so often get retracted after a decade or so, a habit that has caused  many people outside the scientific community to treat all scientific pronouncements with skepticism. I cited several examples of this, and one of them was the way that popular works on climate science in the 1970s and 1980s routinely claimed that the world was on the brink of a new ice age.
Mention the existence of those claims nowadays and you’ll inevitably get denounced as a climate denialist. As my regular readers know, I’m nothing of the kind; I’ve written extensively about the impacts of anthropogenic climate change on the decades and centuries ahead, and my recently published science fiction novel Star’s Reach takes place in a 25th-century America ravaged by the impacts of climate change, in which oranges are grown in what’s now Illinois and Memphis has become a seaport. It’s become popular, for that matter, to insist that those claims of a new ice age never happened; I’d be happy, if anyone’s curious, to cite books published in the 1970s and 1980s for the general public, written by eminent scientists and respected science writers, that described the imminent ice age as a scientifically proven fact, since I have several on my bookshelf.
What I found interesting is that Dr. Cochrane, who is a more than usually careful scholar, jumped to the conclusion that my reference to these popular works of a bygone decade meant that I must be a climate denialist. I corrected him, and he accepted the correction gracefully.  Yet permaculturist and peak oil author Albert Bates then proceeded to miss my point in exactly the same way in his blog post on the event. Bates was present at the discussion, and presumably heard the whole exchange. He’s neither a stupid man nor a malicious one; why, then, so embarrassing and so public a misstatement?
This isn’t a rhetorical question, either; it has an answer, and the answer follows from another of the most popular stories of our culture, the story that says that having the right answer is all you need to get people to listen to you. You’ll find narratives with that theme straight through the popular culture of the last two centuries and more, and it also pervades the rhetoric of science and of scientific history: once the protagonist figures out what’s really going on, whether it’s a murder mystery or the hunt for the molecular structure of DNA, everything falls promptly into place.
Now of course in the real world, things aren’t generally so easy. That was precisely the point I was trying to make in the discussion at the Age of Limits conference:  however convincing the evidence for anthropogenic climate change may be to scientists, it’s failed to convince a great many people outside the scientific enterprise, and one of the things that’s driven that failure is the accelerating decline in the prestige of science in modern industrial society as a whole. Among the roots of that decline, in turn, is the dogmatic tone so often taken when scientists and science writers set out to communicate current scientific opinions to the general public—a tone that differs sharply, it bears remembering, from the far more tentative habits of communication practiced within the scientific community itself.
When climate scientists today insist that they’ve determined conclusively that we’ve entered an age of rising temperatures, I see no reason to doubt them—but they need to recall that many people still remember when writers and speakers with equally impressive scientific credentials insisted with equal vigor that it was just as certain that we’d entered an age of cooling temperatures.  Scientists in the relevant fields know what’s behind the change, but people outside the scientific community don’t; all they see is a flip-flop, and since such flip-flops of scientific opinion have been fairly common in recent decades, members of the general public are by no means as quick as they once were to take scientists at their word. For that matter, when spokespeople for the scientific community insist to the general public nowadays that the flip-flop never took place—that, for example, no reputable scientist or science writer ever claimed to the general public that a new ice age was imminent—those spokespeople simply leave themselves and the scientific community wide open to accusations of bad faith.
We don’t talk about the political dimensions of scientific authority in the modern industrial world. That’s what lies behind the convenient and inaccurate narrative I mentioned earlier, the one that claims that all you have to do to convince people is speak the truth. Question that story, and you have to deal with the mixed motives and tangled cultural politics inseparable from science as a human activity, and above all, you have to discuss the much-vexed relationship between the scientific community and a general public that has become increasingly suspicious of the rhetoric of expertise in contemporary life.
That relationship has dimensions that I don’t think anyone in the scientific community these days has quite grasped. I’ve been told privately by several active online proponents of creationism, for example, that they don’t actually care that much about how the world’s current stock of life forms got there; it’s just that the spluttering Donald Duck frenzy that can reliably be elicited from your common or garden variety rationalist atheist by questioning Darwin’s theory is too entertaining to skip.
Such reflections lead in directions most Americans aren’t willing to go, because they can’t be discussed without raising deeply troubling issues about the conflict between the cult of expertise and what’s left of the traditions of American democracy, and about the social construction of what’s considered real in this as in every other human culture. It’s much easier, and much more comfortable, to insist that the people on the other side of the divide just mentioned are simply stupid and evil, and—as in the example I cited earlier—to force any attempt to talk about the faltering prestige of science in today’s America into a more familiar discourse about who’s right and who’s wrong.
Equally, it’s much easier, and much more comfortable, to insist that the ongoing decline in standards of living here in America is either the fault of the poor or the fault of the rich.  Either evasion makes it possible to ignore all the evidence that suggests that what most Americans think of as a normal standard of living is actually an absurd degree of extravagance, made possible only briefly by the reckless squandering of the most lavish energy resource our species will ever know.
One of the crucial facts of our age is thus that the stories we tell ourselves, the narratives we use to make sense of the events of our lives, have passed their pull date and no longer make sense of the world we experience. The stories our grandchildren use to make sense of their world will be different from ours, because they will be living in the world that the misguided choices of the last four decades or so will have made—a world that is beginning to take shape around us already, even though most people nowadays are doing their level best not to notice that awkward fact.
Meanwhile, those new stories, the stories of our grandchildren, may already be stirring in the crawlspaces of our collective imagination. In future posts, I’ll be talking about some of the more troubling of those, but this week I’m pleased to have the chance to discuss something a little more cheerful along those lines:  the outcome of this year’s “Space Bats” deindustrial science fiction contest.
Regular readers of this blog will remember that back in the fall of 2011, in the course of discussing the role that the science fiction of previous decades played in shaping our expectations of the future, I put out a call for SF short stories set in a world on the far side of peak oil and climate change. I was delighted by the response: over the five months or so that followed, 63 stories were submitted, and I duly assembled an anthology: After Oil: SF Stories of a Post-Petroleum Future. This January, I announced a second contest of the same sort, with a three-month window in which stories would be accepted.
The response was even more impressive this time around. Over those three months I received 92 story submissions, some from Archdruid Report regulars but many others from people I didn’t know from Robert Heinlein’s off ox, and a remarkably large fraction of them were not only publishable but of very high quality. I despaired of winnowing down the input to one anthology’s worth; fortunately, the publisher came to the rescue by proposing a slight change in plans.
I’m therefore delighted to announce that there will be not one but two new anthologies—one of stories set in the twilight years of our own civilization, one of stories set in the new societies that will rise after the industrial world is a fading memory. The first one, After Oil 2: The Years of Crisis, will include the following stories:
Grant Canterbury’s "Dreaming"
Walt Freitag’s "A Mile a Minute"
Matthew Griffith’s "Promised Land"
Diana Haugh’s "The Big Quiet"
Martin Hensher’s "Crown Prerogative"
J.M. Hughes’ "Byte Heist"
Calvin Jennings’ "A Dead Art Form"
Joseph Nemeth’s "A Break with the Past"
N.N. Scott’s "When It Comes a Gully-Washer"
David Trammel’s "A Fish Tale"
Tony Whelk’s "Al-Kimiya"
Rachel White’s "Story Material"
The second new anthology, After Oil 3: The Years of Rebirth, will include the following stories:
Bill Blondeau’s "The Borax Road Affair"
Phil Harris’ "North of the Wall"
Wylie Harris’ "Dispatches"
Diana Haugh’s "Silver Survivor"
Jason Heppenstall’s "Saga and the Bog People"
J.M. Hughes’ "Dahamri"
Gaianne Jenkins’ "Midwinter Eclipse"
Troy Jones’ "For Our Mushrooms"
Catherine McGuire’s "Singing the World"
David Senti’s "Nuala Thrives"
Al Sevcik’s "Community"
Eric Singletary’s "City of Spirits"
Once again, I’d like to thank everyone who contributed a story to the contest; even with a spare anthology to fill, it wasn’t easy to choose among the entries. I’m looking into whether it might be possible to launch a quarterly magazine for deindustrial SF:  there’s clearly an ample supply of good writers who want to tell such stories, and (to judge from sales of the original anthology, and of my deindustrial SF novel Star’s Reach) plenty of people who want to read them as well.
That strikes me as a very good sign. We may not yet be in a position to guess at the stories our grandchildren will tell each other to make sense of the world, but the fact that so many people are already eager to write and read stories about a world on the far side of progress gives me hope that the failed narratives of the past are losing their grip on the collective imagination of our age—and that we may be starting to tell at least a few of the new stories that will make sense of the world after oil.

The Time of the Seedbearers

Wed, 2014-04-30 19:39
Myths, according to the philosopher Sallust, are things that never happened but always are. With a few modifications, the same rule applies to the enduring narratives of every culture, the stories that find a new audience in every generation as long as their parent cultures last.  Stories of that stature don’t need to chronicle events that actually took place to have something profoundly relevant to say, and the heroic quest I used last week to frame a satire on the embarrassingly unheroic behavior of many of industrial civilization’s more privileged inmates is no exception to that rule.
That’s true of hero tales generally, of course. The thegns and ceorls who sat spellbound in an Anglo-Saxon meadhall while a scop chanted the deeds of Beowulf to the sound of a six-stringed lyre didn’t have to face the prospect of wrestling with cannibalistic ogres or battling fire-breathing dragons, and were doubtless well aware of that fact.  If they believed that terrible creatures of a kind no longer found once existed in the legendary past, why, so do we—the difference in our case is merely that we call our monsters “dinosaurs,” and insist that our paleontologist-storytellers be prepared to show us the bones.
The audience in the meadhall never wondered whether Beowulf was a historical figure in the same sense as their own great-grandparents. Since history and legend hadn’t yet separated out in the thinking of the time, Beowulf and those great-grandparents occupied exactly the same status, that of people in the past about whom stories were told. Further than that it was unnecessary to go, since what mattered to them about Beowulf was not whether he lived but how he lived.  The tale’s original audience, it’s worth recalling, got up the next morning to face the challenges of life in dark age Britain, in which defending their community against savage violence was a commonplace event; having the example of Beowulf’s courage and loyalty in mind must have made that harsh reality a little easier to face.
The same point can be made about the hero tale I borrowed and rewrote in last week’s post, Tolkien’s The Lord of the Rings. Frodo Baggins was no Beowulf, which was of course exactly the point, since Tolkien was writing for a different audience in a different age.  The experience of being wrenched out of a peaceful community and sent on a long march toward horror and death was one that Tolkien faced as a young man in the First World War, and watched his sons face in the Second. That’s what gave Tolkien’s tale its appeal: his hobbits were ordinary people facing extraordinary challenges, like so many people in the bitter years of the early twentieth century.
The contrast between Beowulf and The Lord of the Rings is precisely that between the beginning and the zenith of a civilization. Beowulf, like his audience, was born into an age of chaos and violence, and there was never any question of what he was supposed to do about it; the only detail that had to be settled was how many of the horrors of his time he would overcome before one of them finally killed him. Frodo Baggins, like his audience, was born into a world that was mostly at peace, but found itself faced with a resurgence of a nightmare that everyone in his community thought had been laid to rest for good. In Frodo’s case, the question of what he was going to do about the crisis of his age was what mattered most—and of course that’s why I was able to stand Tolkien’s narrative on its head last week, by tracing out what would have happened if Frodo’s answer had been different.
Give it a few more centuries, and it’s a safe bet that the stories that matter will be back on Beowulf’s side of the equation, as the process of decline and fall now under way leads into an era of dissolution and rebirth that we might as well call by the time-honored label “dark age.”  For the time being, though, most of us are still on Frodo’s side of things, trying to come to terms with the appalling realization that the world we know is coming apart and it’s up to us to do something about it.
That said, there’s a crucial difference between the situation faced by Frodo Baggins and his friends in Middle-earth, and the situation faced by those of us who have awakened to the crisis of our time here and now. Tolkien was a profoundly conservative thinker and writer, in the full sense of that word.  The plot engine of his works of adult fiction, The Silmarillion just as much as The Lord of the Rings, was always the struggle to hold onto the last scraps of a glorious past, and his powers of evil wanted to make Middle-earth modern, efficient and up-to-date by annihilating the past and replacing it with a cutting-edge industrial landscape of slagheaps and smokestacks. It’s thus no accident that Saruman’s speech to Gandalf in book two, chapter two of The Fellowship of the Ring is a parody of the modern rhetoric of progress, or that The Return of the King ends with a Luddite revolt against Sharkey’s attempted industrialization of the Shire; Tolkien was a keen and acerbic observer of twentieth-century England, and wove much of his own political thought into his stories.
The victory won by Tolkien’s protagonists in The Lord of the Rings, accordingly, amounted to restoring Middle-earth as far as possible to the condition it was in before the War of the Ring, with the clock turned back a bit further here and there—for example, the reestablishment of the monarchy in Gondor—and a keen sense of loss surrounding those changes that couldn’t be undone. That was a reasonable goal in Tolkien’s imagined setting, and it’s understandable that so many people want to achieve the same thing here and now:  to preserve some semblance of industrial civilization in the teeth of the rising spiral of crises that are already beginning to tear it apart.
I can sympathize with their desire. It’s become fashionable in many circles to ignore the achievements of the industrial age and focus purely on its failures, or to fixate on the places where it fell short of the frankly Utopian hopes that clustered around its rise. If the Enlightenment turned out to be far more of a mixed blessing than its more enthusiastic prophets liked to imagine, and if so many achievements of science and technology turned into sources of immense misery once they were whored out in the service of greed and political power, the same can be said of most human things: “If it has passed from the high and the beautiful to darkness and ruin,” Tolkien commented of a not dissimilar trajectory, “that was of old the fate of Arda marred.” Still, the window of opportunity through which modern industrial civilization might have been able to escape its unwelcome destiny has long since slammed shut.
That’s one of the things I meant to suggest in last week’s post by sketching out a Middle-earth already ravaged by the Dark Lord, in which most of the heroes of Tolkien’s trilogy were dead and most of the things they fought to save had already been lost. Even with those changes, though, Tolkien’s narrative no longer fits the crisis of our age as well as it did a few decades back. Our Ring of Power was the fantastic glut of energy we got from fossil fuels; we could have renounced it, as Tolkien’s characters renounced the One Ring, before we’d burnt enough to destabilize the climate and locked ourselves into a set of economic arrangements with no future...but that’s not what happened, of course.
We didn’t make that collective choice when it still could have made a difference:  when peak oil was still decades in the future, anthropogenic climate change hadn’t yet begun to destabilize the planet’s ice sheets and weather patterns, and the variables that define the crisis of our age—depletion rates, CO2 concentrations, global population, and the rest of them—were a good deal less overwhelming than they’ve now become.  As The Limits to Growth pointed out more than four decades ago, any effort to extract industrial civilization from the trap it made for itself had to get under way long before the jaws of that trap began to bite, because the rising economic burden inflicted by the ongoing depletion of nonrenewable resources and the impacts of pollution and ecosystem degradation were eating away at the surplus wealth needed to meet the costs of the transition to sustainability.
That prediction has now become our reality. Grandiose visions of vast renewable-energy buildouts and geoengineering projects on a global scale, of the kind being hawked so ebulliently these days by the  prophets of eternal business as usual, fit awkwardly with the reality that a great many industrial nations can no longer afford to maintain basic infrastructures or to keep large and growing fractions of their populations from sliding into desperate poverty. The choice that I discussed in last week’s post, reduced to its hard economic bones, was whether we were going to put what remained of our stock of fossil fuels and other nonrenewable resources into maintaining our current standard of living for a while longer, or whether we were going to put it into building a livable world for our grandchildren.
The great majority of us chose the first option, and insisting at the top of our lungs that of course we could have both did nothing to keep the second from slipping away into the realm of might-have-beens. The political will to make the changes and accept the sacrifices that would be required to do anything else went missing in action in the 1980s and hasn’t been seen since. That’s the trap that was hidden in the crisis of our age: while the costs of transition were still small enough that we could have met them without major sacrifice, the consequences of inaction were still far enough in the future that most people could pretend they weren’t there; by the time the consequences were hard to ignore, the costs of transition had become too great for most people to accept—and not too long after that, they had become too great to be met at all.
As a commentary on our current situation, in other words, the story of the heroic quest has passed its pull date. As I noted years ago, insisting that the world must always follow a single narrative is a fertile source of misunderstanding and misery. Consider the popular insistence that the world can grow its way out of problems caused by growth—as though you could treat the consequences of chronic alcoholism by drinking even more heavily! What gives that frankly idiotic claim the appeal it has is that it draws on one of the standard stories of our age, the Horatio Alger story of the person who overcame long odds to make a success of himself. That does happen sometimes, which is why it’s a popular story; the lie creeps in when the claim gets made that this is always what happens. 
When people insist, as so many of them do, that of course we’ll overcome the limits to growth and every other obstacle to our allegedly preordained destiny out there among the stars, all that means is that they have a single story wedged into their imagination so tightly that mere reality can’t shake it loose. The same thing’s true of all the other credos I’ve discussed in recent posts, from “they’ll think of something” through “it’s all somebody else’s fault” right on up to “we’re all going to be extinct soon anyway so it doesn’t matter any more.” Choose any thoughtstopper you like from your randomly generated Peak Oil Denial Bingo card, and behind it lies a single story, repeating itself monotonously over and over in the heads of those who can’t imagine the world unfolding in any other way.
The insistence that it’s not too late, that there must still be time to keep industrial civilization from crashing into ruin if only we all come together to make one great effort, and that there’s any reason to think that we can and will all come together, is another example. The narrative behind that claim has a profound appeal to people nowadays, which is why stories that feature it—again, Tolkien’s trilogy comes to mind—are as popular as they are. It’s deeply consoling to be told that there’s still one last chance to escape the harsh future that’s already taking shape around us. It seems almost cruel to point out that whether a belief appeals to our emotions has no bearing on whether or not it’s true.
The suggestion that I’ve been making since this blog first began eight years ago is that we’re long past the point at which modern industrial civilization might still have been rescued from the consequences of its own mistakes. If that’s the case, it’s no longer useful to put the very limited resources we have left into trying to stop the inevitable, and it’s even less useful to wallow in wishful thinking about how splendid it would be if the few of us who recognize the predicament we’re in were to be joined by enough other people to make a difference. If anything of value is to get through the harsh decades and centuries ahead of us, if anything worth saving is to be rescued from the wreck of our civilization, there’s plenty of work to do, and daydreaming about mass movements that aren’t happening and grand projects we can no longer afford simply wastes what little time we still have left.
That’s why I’ve tried to suggest in previous posts here that it’s time to set aside some of our more familiar stories and try reframing the crisis of our age in less shopworn ways. There are plenty of viable options—plenty, that is, of narratives that talk about what happens when the last hope of rescue has gone whistling down the wind and it’s time to figure out what can be saved in the midst of disaster—but the one that keeps coming back to my mind is one I learned and, ironically, dismissed as uninteresting quite a few decades ago, in the early years of my esoteric studies: the old legend of the fall of Atlantis.
It’s probably necessary to note here that whether Atlantis existed as a historical reality is not the point. While it’s interesting to speculate about whether human societies more advanced than current theory suggests might have flourished in the late Ice Age and then drowned beneath rising seas, those speculations are as irrelevant here as trying to fit Grendel and his mother into the family tree of the Hominidae, say, or discussing how plate tectonics could have produced the improbable mountain ranges of Middle-earth. Whatever else it might or might not have been, Atlantis is a story, one that has a potent presence in our collective imagination. Like Beowulf or The Lord of the Rings, the Atlantis story is about the confrontation with evil, but where Beowulf comes at the beginning of a civilization and Frodo Baggins marks its zenith, the Atlantis story illuminates its end.
Mind you, the version of the story of Atlantis I learned, in common with most of the versions in circulation in occult schools in those days, had three details that you won’t find in Plato’s account, or in most of the rehashes that have been churned out by the rejected-knowledge industry over the last century or so. First, according to that version, Atlantis didn’t sink all at once; rather, there were three inundations separated by long intervals. Second, the sinking of Atlantis wasn’t a natural disaster; it was the direct result of the clueless actions of the Atlanteans, who brought destruction on themselves by their misuse of advanced technology.
The third detail, though, is the one that matters here. According to the mimeographed lessons I studied back in the day, as it became clear that Atlantean technology had the potential to bring about terrifying blowback, the Atlanteans divided into two factions: the Children of the Law of One, who took the warnings seriously and tried to get the rest of Atlantean society to do so, and the Servants of the Dark Face, who dismissed the whole issue—I don’t know for a fact that these latter went around saying “I’m sure the priests of the Sun Temple will think of something,” “orichalcum will always be with us,” “the ice age wasn’t ended by an ice shortage,” and the like, but it seems likely. Those of my readers who haven’t spent the last forty years hiding at the bottom of the sea will know instantly which of these factions spoke for the majority and which was marginalized and derided as a bunch of doomers.
According to the story, when the First Inundation hit and a big chunk of Atlantis ended up permanently beneath the sea, the shock managed to convince a lot of Atlanteans that the Children of the Law of One had a point, and for a while there was an organized effort to stop doing the things that were causing the blowback. As the immediate memories of the Inundation faded, though, people convinced themselves that the flooding had just been one of those things, and went back to their old habits. When the Second Inundation followed and all of Atlantis sank but the two big islands of Ruta and Daitya, though, the same pattern didn’t repeat itself; the Children of the Law of One were marginalized even further, and the Servants of the Dark Face became even more of a majority, because nobody wanted to admit the role their own actions had had in causing the catastrophe. Again, those of my readers who have been paying attention for the last forty years know this story inside and out.
It’s what happened next, though, that matters most. In the years between the Second Inundation and the Third and last one, so the story goes, Atlantis was for all practical purposes a madhouse with the inmates in charge. Everybody knew what was going to happen and nobody wanted to deal with the implications of that knowledge, and the strain expressed itself in orgiastic excess, bizarre belief systems, and a rising spiral of political conflict ending in civil war—anything you care to name, as long as it didn’t address the fact that Atlantis was destroying itself and that nearly all the Atlanteans were enthusiastic participants in the activities driving the destruction. That was when the Children of the Law of One looked at one another and, so to speak, cashed out their accounts at the First National Bank of Atlantis, invested the proceeds in shipping, and sailed off to distant lands to become the seedbearers of the new age of the world.
That’s the story that speaks to me just now—enough so that I’ve more than once considered writing a fantasy novel about the fall of Atlantis as a way of talking about the crisis of our age. Of course that story doesn’t speak to everyone, and the belief systems that insist either that everything is fine or that nothing can be done anyway have no shortage of enthusiasts. If these belief systems turn out to be as delusional as they look, though, what then? The future that very few people are willing to consider or prepare for is the one that history shows us is the common destiny of every other failed civilization:  the long, bitter, ragged road of decline and fall into a dark age, from which future civilizations will eventually be born. If that’s the future ahead of us, as I believe it is, the necessary preparations need to be made now, if the best achievements of our age are to be carried into the future when the time of the seedbearers arrives.
*************************

Even archdruids need to take a break from time to time, and it’s been quite a while since I took time off from these weekly essays. The Archdruid Report will therefore be on hiatus for the next month and a half. I’ll look forward to meeting my readers at The Age of Limits conference in southern Pennsylvania, the Economics, Energy and Environment conference in London, or one of the less peak oil-centric speaking gigs I’ll be having over the same period. In the meantime, I wish my readers good weather for gardening, pleasant days of weatherstripping and caulking, and plenty of spare time to learn the knowledge and skills that will be needed in the future ahead of us; we’ll talk again on June 18th.

Refusing the Call: A Tale Rewritten

Wed, 2014-04-23 18:43
I have been wondering for some time now how to talk about the weirdly autumnal note that sounds so often and so clearly in America these days. Through the babble and clatter, the seven or eight television screens yelling from the walls of every restaurant you pass and all the rest of it, there comes a tone and a mood that reminds me of wind among bare branches and dry leaves crackling underfoot:  as though even the people who insist most loudly that it’s all onward and upward from here don’t believe it any more, and those for whom the old optimism stopped being more than a soothing shibboleth a long time ago are hunching their shoulders, shutting their eyes tight, and hoping that things can still hold together for just a little while longer.
It’s not just that American politicians and pundits are insisting at the top of their lungs that the United States can threaten Russia with natural gas surpluses that don’t exist, though that’s admittedly a very bad sign all by itself. It’s that this orgy of self-congratulatory nonsense appears in the news right next to reports that oil and gas companies are slashing their investments in the fracking technology and shale leases that were supposed to produce those imaginary surpluses, having lost a great deal of money pursuing the shale oil mirage, while Russia and Iran  pursue a trade deal that will make US sanctions against Iran all but irrelevant, and China is quietly making arrangements to conduct its trade with Europe in yuan rather than dollars. Strong nations in control of their own destinies, it’s fair to note, don’t respond to challenges on this scale by plunging their heads quite so enthusiastically into the sands of self-deception.
To shift temporal metaphors a bit, the long day of national delusion that dawned back in 1980, when Ronald Reagan famously and fatuously proclaimed “it’s morning in America,” is drawing on rapidly toward dusk, and most Americans are hopelessly unprepared for the coming of night.  They’re unprepared in practical terms, that is, for an era in which the five per cent of us who live in the United States will no longer dispose of a quarter of the world’s energy supply and a third of its raw materials and industrial products, and in which what currently counts as a normal American lifestyle will soon be no more than a fading memory for the vast majority.  They’re just as unprepared, though,  for the psychological and emotional costs of that shattering transformation—not least because the change isn’t being imposed on them at random by an indifferent universe, but comes as the inevitable consequence of their own collective choices in decades not that long past.
The hard fact that most people in this country are trying not to remember is this:  in the years right after Reagan’s election, a vast number of Americans enthusiastically turned their backs on the promising steps toward sustainability that had been taken in the previous decade, abandoned the ideals they’d been praising to the skies up to that time, and cashed in their grandchildren’s future so that they didn’t have to give up the extravagance and waste that defined their familiar and comfortable lifestyles. As a direct result, the nonrenewable resources that might have supported the transition to a sustainable future went instead to fuel one last orgy of wretched excess.  Now, though, the party is over, the bill is due, and the consequences of that disastrous decision have become a massive though almost wholly unmentionable factor in our nation’s culture and collective psychology.
A great many of the more disturbing features of contemporary American life, I’m convinced, can’t be understood unless America’s thirty-year vacation from reality is taken into account. A sixth of the US population is currently on antidepressant medications, and since maybe half of Americans can’t afford to get medication at all, the total number of Americans who are clinically depressed is likely a good deal higher than prescription figures suggest. The sort of bizarre delusions that used to count as evidence of serious mental illness—baroque conspiracy theories thickly frosted with shrill claims of persecution, fantasies of imminent mass death as punishment for humanity’s sins, and so on—have become part of the common currency of American folk belief. For that matter, what does our pop culture’s frankly necrophiliac obsession with vampires amount to but an attempt, thinly veiled in the most transparent of symbolism, to insist that it really is okay to victimize future generations for centuries down the line in order to prolong one’s own existence?
Myths and legends such as these can be remarkably subtle barometers of the collective psyche. The transformation that turned the vampire from just another spooky Eastern European folktale into a massive pop culture presence in industrial North America has quite a bit to say about the unspoken ideas and emotions moving through the crawlspaces of our collective life. In the same way, it’s anything but an accident that the myth of the heroic quest has become so pervasive a presence in the modern industrial world that Joseph Campbell could simply label it “the monomyth,” the basic form of myth as such. In any sense other than a wholly parochial one, of course, he was quite wrong—the wild diversity of the world’s mythic stories can’t be forced into any one narrative pattern—but if we look only at popular culture in the modern industrial world, he’s almost right.
The story of the callow nobody who answers the call to adventure, goes off into the unknown, accomplishes some grand task, and returns transformed, to transform his surroundings in turn, is firmly welded into place in the imagination of our age. You’ll find it at the center of J.R.R. Tolkien’s great  works of fantasy, in the most forgettable products of the modern entertainment industry, and everything in between and all around. Yet there’s a curious blind spot in all this: we hear plenty about those who answer the call to adventure, and nothing at all about those who refuse it. Those latter don’t offer much of a plot engine for an adventure story, granted, but such a tale could make for a gripping psychological study—and one that has some uncomfortably familiar features.
With that in mind, with an apology in the direction of Tolkien’s ghost, and with another to those of my readers who aren’t lifelong Tolkien buffs with a head full of Middle-earth trivia—yes, I used to sign school yearbooks in fluent Elvish—I’d like to suggest a brief visit to an alternate Middle-earth:  one in which Frodo Baggins, facing the final crisis of the Third Age and the need to leave behind everything he knew and loved in order to take the Ring to Mount Doom, crumpled instead, with a cry of “I can’t, Gandalf, I just can’t.” Perhaps you’ll join me in a quiet corner of The Green Dragon, the best inn in Bywater, take a mug of ale from the buxom hobbit barmaid, and talk about old Frodo, who lived until recently just up the road and across the bridge in Hobbiton.
You’ve heard about the magic ring he had, the one that he inherited from his uncle Bilbo, the one that Gandalf the wizard wanted him to go off and destroy? That was thirty years ago, and most folk in the Shire have heard rumors about it by now. Yes, it’s quite true; Frodo was supposed to leave the Shire and go off on an adventure, as Bilbo did before him, and couldn’t bring himself to do it. He had plenty of reasons to stay home, to be sure.  He was tolerably well off and quite comfortable, all his friends and connections were here, and the journey would have been difficult and dangerous. Nor was there any certainty of success—quite the contrary, it’s entirely possible that he might have perished somewhere in the wild lands, or been caught by the Dark Lord’s servants, or what have you.
So he refused, and when Gandalf tried to talk to him about it, he threw the old wizard out of Bag End and slammed the round green door in his face. Have you ever seen someone in a fight who knows that he’s in the wrong, and knows that everyone else knows it, and that knowledge just makes him even more angry and stubborn?  That was Frodo just then. Friends of mine watched the whole thing, or as much of it as could be seen from the garden outside, and it was not a pleasant spectacle. 
It’s what happened thereafter, though, that bears recalling. I’m quite sure that if Frodo had shown the least sign of leaving the Shire and going on the quest, Sauron would have sent Black Riders after him in a fine hurry, and there’s no telling what else might have come boiling up out of Mordor.  It’s by no means impossible that the Dark Lord might have panicked, and launched a hasty, ill-advised assault on Gondor right away. For all I know, that may have been what Gandalf had in mind, tricking the Dark Lord into overreacting before he’d gathered his full strength, and before Gondor and Rohan had been thoroughly weakened from within.
Still, once Sauron’s spies brought him word that Frodo had refused to embark on the quest, the Dark Lord knew that he had a good deal less to fear, and that he could afford to take his time. Ever since then, there have been plenty of servants of Mordor in and around the Shire, and a Black Rider or two keeping watch nearby, but nothing obvious or direct, nothing that might rouse whatever courage Frodo might have had left or  convince him that he had to flee for his life. Sauron was willing to be patient—patient and cruel. I’m quite sure he knew perfectly well what the rest of Frodo’s life would be like.
So Gandalf went away, and Frodo stayed in Bag End, and for years thereafter it seemed as though the whole business had been no more than a mistake. The news that came up the Greenway from the southern lands was no worse than before; Gondor still stood firm, and though there was said to be some kind of trouble in Rohan, well, that was only to be expected now and then.  Frodo even took to joking about how gullible he’d been to believe all those alarmist claims that Gandalf had made. Sauron was still safely cooped up in Mordor, and all seemed right with Middle-earth.
Of course part of that was simply that Frodo had gotten even wealthier and more comfortable than he’d been before. He patched up his relationship with the Sackville-Bagginses, and he invested a good deal of his money in Sandyman’s mill in Hobbiton, which paid off handsomely. He no longer spent time with many of his younger friends by then, partly because they had their own opinions about what he should have done, and partly because he had business connections with some of the wealthiest hobbits in the Shire, and wanted to build on those. He no longer took long walks around the Shire, as he’d done before, and he gave up visiting elves and dwarves when he stopped speaking to Gandalf.
But of course the rumors and news from the southern lands slowly but surely turned to the worse, as the Dark Lord gathered his power and tightened his grip on the western lands a little at a time. I recall when Rohan fell to Saruman’s goblin armies.  That was a shock for a great many folk, here in the Shire and elsewhere.  Soon thereafter, though, Frodo was claiming that after all, Saruman wasn’t Sauron, and Rohan wasn’t that important, and for all anyone knew, the wizard and the Dark Lord might well end up at each other’s throats and spare the rest of us.
Still, it was around that time that Frodo stopped joking about Gandalf’s warnings, and got angry if anyone mentioned them in his hearing. It was around that same time, too, that he started insisting loudly and often that someone would surely stop Sauron. One day it was the elves:  after all, they had three rings of power, and could surely overwhelm the forces of Mordor if they chose to. Another day, the dwarves would do it, or Saruman, or the men of Gondor, or the Valar in the uttermost West. There were so many alternatives!  His friends very quickly learned to nod and agree with him, for he would lose his temper and start shouting at them if they disagreed or even asked questions.
When Lorien was destroyed, that was another shock. It was after that, as I recall, that Frodo started hinting darkly that the elves didn’t seem to be doing anything with their three rings of power to stop Sauron, and maybe they weren’t as opposed to him as they claimed. He came up with any number of theories about this or that elvish conspiracy. The first troubles were starting to affect the Shire by then, of course, and his investments were beginning to lose money; it was probably inevitable that he would start claiming that the conspiracy was aimed in part against hobbits, against the Shire, or against him in particular—especially the latter. They wanted his ring, of course. That played a larger and larger role in his talk as the years passed.
I don’t recall hearing of any particular change in his thinking when word came that Minas Tirith had been taken by the Dark Lord’s armies, but it wasn’t much later that a great many elves came hurrying along the East Road through the Shire, and a few months after that, word came that Rivendell had fallen. That was not merely a shock, but a blow; Frodo had grown up hearing his uncle’s stories about Rivendell and the elves and half-elves who lived there. There was a time after that news came that some of us briefly wondered if old Frodo might actually find it in himself to do the thing he’d refused to do all those years before.
But of course he did nothing of the kind, not even when the troubles here in the Shire began to bite more and more deeply, when goblins started raiding the borders of the North Farthing and the Buckland had to be abandoned to the Old Forest. No, he started insisting to anyone who would listen that Middle-earth was doomed, that there was no hope left in elves or dying Númenor, that Sauron’s final victory would surely come before—oh, I forget what the date was; it was some year or other not too far from now. He spent hours reading through books of lore, making long lists of reasons why the Dark Lord’s triumph was surely at hand. Why did he do that? Why, for the same reason that drove him to each of his other excuses in turn: to prove to himself that his decision to refuse the quest hadn’t been the terrible mistake he knew perfectly well it had been.
And then, of course, the Ring betrayed him, as it betrayed Gollum and Isildur before him. He came home late at night, after drinking himself half under the table at the Ivy Bush, and discovered that the Ring was nowhere to be found. After searching Bag End in a frantic state, he ran out the door and down the road toward Bywater shouting “My precious! My precious!” He was weeping and running blindly in the night, and when he got to the bridge he stumbled; over he went into the water, and that was the end of him. They found his body in a weir downstream the next morning.
The worst of it is that right up to the end, right up to the hour the Ring left him, he still could have embarked on the quest.  It would have been a different journey, and quite possibly a harder one.  With Rivendell gone, he would have had to go west rather than east, across the Far Downs to Cirdan at the Grey Havens, where you’ll find most of the high-elves who still remain in Middle-earth. From there, with such companions as might have joined him, he would have had to go north and then eastward through Arnor, past the ruins of Annuminas and Lake Evendim, to the dales of the Misty Mountains, and then across by one of the northern passes: a hard and risky journey, but by no means impossible, for with no more need to hinder travel between Rivendell and Lorien, the Dark Lord’s watch on the mountains has grown slack.
Beyond the mountains, the wood-elves still dwell in the northern reaches of Mirkwood, along with refugees from Lorien and the last of the Beornings.  He could have gotten shelter and help there, and boats to travel down the River Running into the heart of Wilderland.  From there his way would have led by foot to the poorly guarded northern borders of Mordor—when has Sauron ever had to face a threat from that quarter?  So you see that it could have been done. It could still be done, if someone were willing to do it. Even though so much of what could have been saved thirty years ago has been lost, even though Minas Tirith, Edoras, Lorien and Rivendell have fallen and the line of the kings of Gondor is no more, it would still be worth doing; there would still be many things that could be saved.
Nor would such a journey have to be made alone. Though Aragorn son of Arathorn was slain in the last defense of Rivendell, there are still Rangers to be found in Cirdan’s realm and the old lands of Arnor; there are elf-warriors who hope to avenge the blood shed at Rivendell, and dwarves from the Blue Mountains who have their own ancient grudges against the Dark Lord. The last free Rohirrim retreated to Minhiriath after Éomer fell at Helm’s Deep, and still war against King Grima, while Gondor west of the river Gilrain clings to a tenuous independence and would rise up against Sauron at need. Would those and the elves of Lindon be enough? No one can say; there are no certainties in this business, except for the one Frodo chose—the certainty that doing nothing will guarantee Sauron’s victory.
And there might even still be a wizard to join such a quest. In fact, there would certainly be one—the very last of them, as far as I know. Gandalf perished when Lorien fell, I am sorry to say, and as for Saruman, the last anyone saw of him, he was screaming in terror as two Ringwraiths dragged him through the door of the Dark Tower; his double-dealing was never likely to bring him to a good end. The chief of the Ringwraiths rules in Isengard now. Still, there was a third in these western lands: fool and bird-tamer, Saruman called him, having never quite managed to notice that knowledge of the ways of nature and the friendship of birds and beasts might have considerable value in the last need of Middle-earth. Radagast is his name; yes, that would be me.
Why am I telling you all this?  Well, you are old Frodo’s youngest cousin, are you not? Very nearly the only one of his relatives with enough of the wild Tookish blood in you to matter, or so I am told. It was just a month ago that you and two of your friends were walking in the woods, and you spoke with quite a bit of anger about how the older generation of hobbits had decided to huddle in their holes until the darkness falls—those were your very words, I believe. How did I know that? Why, a little bird told me—a wren, to be precise, a very clever and helpful little fellow, who runs errands for me from time to time when I visit this part of Middle-earth. If you meant what you said then, there is still hope.
And the Ring? No, it was not lost, or not for long. It slipped from its chain and fell from old Frodo’s pocket as he stumbled home that last night, and a field mouse spotted it. I had briefed all the animals and birds around Hobbiton, of course, and so she knew what to do; she dragged the Ring into thick grass, and when dawn came, caught the attention of a jay, who took it and hid it high up in a tree. I had to trade quite a collection of sparkling things for it! But here it is, in this envelope, waiting for someone to take up the quest that Frodo refused. The choice is yours, my dear hobbit. What will you do?

Star's Reach: A Novel of the Deindustrial Future

Sat, 2014-04-19 19:18
I'm delighted to report that my deindustrial novel Star's Reach, which appeared in episodes as a blog between 2009 and late last year, is now available in print and ebook formats from Founders House Publishing. I've suggested here more than once that narratives are among the most important tools we have for understanding and shaping the future; from that perspective, Star's Reach is a contribution to imagining a future that isn't locked inside the Hobson's choice between business as usual and overnight catastrophe that's frozen into place in the collective imagination of our time. If a story of adventure and alien contact in 25th-century neomedieval America appeals to you, you might want to give it a read!

The End of Employment

Wed, 2014-04-16 16:51
Nothing is easier, as the Long Descent begins to pick up speed around us, than giving in to despair—and nothing is more pointless. Those of us who are alive today are faced with the hugely demanding task of coping with the consequences of industrial civilization’s decline and fall, and saving as many as possible of the best achievements of the last few centuries so that they can cushion the descent and enrich the human societies of the far future.  That won’t be easy; so?  The same challenge has been faced many times before, and quite often it’s been faced with relative success.
The circumstances of the present case are in some ways more difficult than past equivalents, to be sure, but the tools and the knowledge base available to cope with them are almost incomparably greater. All in all, factoring in the greater challenges and the greater resources, it’s probably fair to suggest that the challenge of our time is about on a par with other eras of decline and fall.  The only question that still remains to be settled is how many of the people who are awake to the imminence of crisis will rise to the challenge, and how many will fail to do so.
The suicide of peak oil writer Mike Ruppert two days ago puts a bit of additional emphasis on that last point. I never met Ruppert, though we corresponded back in the days when his “From The Wilderness” website was one of the few places on the internet that paid any attention at all to peak oil, and I don’t claim to know what personal demons drove him to put a bullet through his brain. Over the last eight years, though, as the project of this blog has brought me into contact with more and more people who are grappling with the predicament of our time, I’ve met a great many people whose plans for dealing with a postpeak world amount to much the same thing.  Some of them are quite forthright about it, which at least has the virtue of honesty.  Rather more of them conceal the starkness of that choice behind a variety of convenient evasions, the insistence that we’re all going to die soon anyway being far and away the most popular of these just now.
I admit to a certain macabre curiosity about how that will play out in the years ahead. I’ve suspected for a while now, for example, that the baby boomers will manage one final mediagenic fad on the way out, and the generation that marked its childhood with coonskin caps and hula hoops and its puberty with love beads and Beatlemania will finish with a fad for suicide parties, in which attendees reminisce to the sound of the tunes they loved in high school, then wash down pills with vodka and help each other tie plastic bags over their heads. Still, I wonder how many people will have second thoughts once every other option has gone whistling down the wind, and fling themselves into an assortment of futile attempts to have their cake when they’ve already eaten it right down to the bare plate. We may see some truly bizarre religious movements, and some truly destructive political ones, before those who go around today insisting that they don’t want to live in a deindustrial world finally get their wish.
There are, of course, plenty of other options. The best choice for most of us, as I’ve noted here in previous posts, follows a strategy I’ve described wryly as “collapse first and avoid the rush:”  getting ahead of the curve of decline, in other words, and downshifting to a much less extravagant lifestyle while there’s still time to pick up the skills and tools needed to do it competently. Despite the strident insistence from defenders of the status quo that anything less than business as usual amounts to heading straight back to the caves, it’s entirely possible to have a decent and tolerably comfortable life on a tiny fraction of the energy and resource base that middle class Americans think they can’t possibly do without. Mind you, you have to know how to do it, and that’s not the sort of knowledge you can pick up from a manual, which is why it’s crucial to start now and get through the learning curve while you still have the income and the resources to cushion the impact of the inevitable mistakes.
This is more or less what I’ve been saying for eight years now. The difficulty at this stage in the process, though, is that a growing number of Americans are running out of time. I don’t think it’s escaped the notice of many people in this country that despite all the cheerleading from government officials, despite all the reassurances from dignified and clueless economists, despite all those reams of doctored statistics gobbled down whole by the watchdogs-turned-lapdogs of the media and spewed forth undigested onto the evening news, the US economy is not getting better.  Outside a few privileged sectors, times are hard and getting harder; more and more Americans are slipping into the bleak category of the long-term unemployed, and a great many of those who can still find employment work at part-time positions for sweatshop wages with no benefits at all.
Despite all the same cheerleading, reassurances, and doctored statistics, furthermore, the US economy is not going to get better: not for more than brief intervals by any measure, and not at all if “better”  means returning to some equivalent of America’s late 20th century boomtime. Those days are over, and they will not return. That harsh reality is having an immediate impact on some of my readers already, and that impact will only spread as time goes on. For those who have already been caught by the economic downdrafts, it’s arguably too late to collapse first and avoid the rush; willy-nilly, they’re already collapsing as fast as they can, and the rush is picking up speed around them as we speak.
For those who aren’t yet in that situation, the need to make changes while there’s still time to do so is paramount, and a significant number of my readers seem to be aware of this. One measure of that is the number of requests for personal advice I field, which has gone up steeply in recent months. Those requests cover a pretty fair selection of the whole gamut of human situations in a failing civilization, but one question has been coming up more and more often of late: the question of what jobs might be likely to provide steady employment as the industrial economy comes apart.
That’s a point I’ve been mulling over of late, since its implications intersect the whole tangled web in which our economy and society are snared just now. In particular, it assumes that the current way of bringing work together with workers, and turning the potentials of human mind and muscle toward the production of goods and services, is likely to remain in place for the time being, and it’s becoming increasingly clear to me that this won’t be the case.
It’s important to be clear on exactly what’s being discussed here. Human beings have always had to produce goods and services to stay alive and keep their families and communities going; that’s not going to change. In nonindustrial societies, though, most work is performed by individuals who consume the product of their own labor, and most of the rest is sold or bartered directly by the people who produce it to the people who consume it. What sets the industrial world apart is that a third party, the employer, inserts himself into this process, hiring people to produce goods and services and then selling those goods and services to buyers.  That’s employment, in the modern sense of the word; most people think of getting hired by an employer, for a fixed salary or wage, to produce goods and services that the employer then sells to someone else, as the normal and natural state of affairs—but it’s a state of affairs that is already beginning to break down around us, because the surpluses that make that kind of employment economically viable are going away.
Let’s begin with the big picture. In any human society, whether it’s a tribe of hunter-gatherers, an industrial nation-state, or anything else, people apply energy to raw materials to produce goods and services; this is what we mean by the word “economy.” The goods and services that any economy can produce are strictly limited by the energy sources and raw materials that it can access.
A principle that ecologists call Liebig’s law of the minimum is relevant here: the amount of anything  that a given species or ecosystem can produce in a given place and time is limited by whichever resource is in shortest supply. Most people get that when thinking about the nonhuman world; it makes sense that plants can’t use extra sunlight to make up for a shortage of water, and that you can’t treat soil deficient in phosphates by adding extra nitrates. It’s when you apply this same logic to human societies that the mental gears jam up, because we’ve been told so often that one resource can always be substituted for another that most people believe it without a second thought.
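Liebig’s law can be stated as a one-line formula: when every input is needed in fixed proportion, output is capped by whichever input runs out first, not by the average or the total. A minimal sketch, with purely illustrative resource names and numbers:

```python
# Liebig's law of the minimum: yield is set by the scarcest input
# relative to requirements, not by the sum or average of all inputs.
# The resources and quantities below are illustrative only.

def max_output(available, required_per_unit):
    """Units producible when every input is required in fixed proportion."""
    return min(available[r] / required_per_unit[r] for r in required_per_unit)

# A plant needs all three inputs; surplus sunlight can't offset scarce water.
available = {"sunlight": 1000, "water": 50, "phosphate": 400}
required = {"sunlight": 10, "water": 1, "phosphate": 2}

print(max_output(available, required))  # 50.0 -- water is the limiting factor
```

Doubling the sunlight in this sketch leaves the result unchanged, which is exactly the point about substitution: only more of the limiting resource raises the ceiling.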
What’s going on here, though, is considerably more subtle than current jargon reflects. Examine most of the cases of resource substitution that find their way into economics textbooks, and you’ll find that what’s happened is that a process of resource extraction that uses less energy on a scarcer material has been replaced by another process that takes more energy but uses more abundant materials. The shift from high-quality iron ores to low-grade taconite that reshaped the iron industry in the 20th century, for example, was possible because ever-increasing amounts of highly concentrated energy could be put into the smelting process without making the resulting iron too expensive for the market.
The point made by this and comparable examples is applicable across the board to what I’ve termed technic societies, that subset of human societies—ours is the first, though probably not the last—in which a large fraction of total energy per capita comes from nonbiological sources and is put to work by way of  machines rather than human or animal muscles.  Far more often than not, in such societies, concentrated energy is the limiting resource. Given an abundant enough supply of concentrated energy at a low enough price, it would be possible to supply a technic society with raw materials by extracting dissolved minerals from seawater or chewing up ordinary rock to get a part per million or so of this or that useful element. Lacking that—and there are good reasons to think that human societies will always be lacking that—access to concentrated energy is where Liebig’s law bites down hard.
Another way to make this same point is to think of how much of any given product a single worker can make in a day using a set of good hand tools, and comparing that to the quantity of the same thing that the same worker could make using the successive generations of factory equipment, from the steam-driven and belt-fed power tools of the late 19th century straight through to the computerized milling machines and assembly-line robots of today. The difference can be expressed most clearly as a matter of the amount of energy being applied directly and indirectly to the manufacturing process—not merely the energy driving the tools through the manufacturing process, but the energy that goes into  manufacturing and maintaining the tools, supporting the infrastructure needed for manufacture and maintenance, and so on through the whole system involved in the manufacturing process.
Maverick economist E.F. Schumacher, whose work has been discussed in this blog many times already, pointed out that the cost per worker of equipping a workplace is one of the many crucial factors that  mainstream economic thought invariably neglects. That cost is usually expressed in financial terms, but underlying the abstract tokens we call money is a real cost in energy, expressed in terms of the goods and services that have to be consumed in the process of equipping and maintaining the workplace. If you have energy to spare, that’s not a problem; if you don’t, on the other hand, you’re actually better off using a less complex technology—what Schumacher called “intermediate technology” and the movement in which I studied green wizardry thirty years ago called “appropriate technology.”
The cost per worker of equipping a workplace, in turn, also has a political dimension—a point that Schumacher did not neglect, though nearly all other economists pretend that it doesn’t exist. The more costly it is to equip a workplace, the more certain it is that workers won’t be able to set themselves up in business, and the more control the very rich will then have over economic production and the supply of jobs. As Joseph Tainter pointed out in The Collapse of Complex Societies, social complexity correlates precisely with social hierarchy; one of the functions of complexity, in the workplace as elsewhere, is thus to maintain existing social pecking orders.
Schumacher’s arguments, though, focused on the Third World nations of his own time, which had very little manufacturing capacity at all—most of them, remember, had been colonies of European empires, assigned the role of producing raw materials and buying finished products from the imperial center as part of the wealth pump that drove them into grinding poverty while keeping their imperial overlords rich. He focused on advising client nations on how to build their own economies and extract themselves from the political grip of their former overlords, who were usually all too eager to import high-tech factories which their upper classes inevitably controlled. The situation is considerably more challenging when  your economy is geared to immense surpluses of concentrated energy, and the supply of energy begins to run short—and of course that’s the situation we’re in today.
Even if it were just a matter of replacing factory equipment, that would be a huge challenge, because all those expensive machines—not to mention the infrastructure that manufactures them, maintains them, supplies them, and integrates their products into the wider economy—count as sunk costs, subject to what social psychologists call the “Concorde fallacy,” the conviction that it’s less wasteful to keep on throwing money into a failing project than to cut your losses and do something else. The real problem is that it’s not just factory equipment; the entire economy has been structured from the ground up to use colossal amounts of highly concentrated energy, and everything that’s been invested in that economy since the beginning of the modern era thus counts as a sunk cost to one degree or another.
What makes this even more challenging is that very few people in the modern industrial world actually produce goods and services for consumers, much less for themselves, by applying energy to raw materials. The vast majority of today’s employees, and in particular all those who have the wealth and influence that come with high social status, don’t do this.  Executives, brokers, bankers, consultants, analysts, salespeople—well, I could go on for pages: the whole range of what used to be called white-collar jobs exists to support the production of goods and services by the working joes and janes managing all the energy-intensive machinery down there on the shop floor. So does the entire vast maze of the financial industry, and so do the legions of government bureaucrats—local, state, and federal—who manage, regulate, or oversee one or another aspect of economic activity.
All these people are understandably just as interested in keeping their jobs as the working joes and janes down there on the shop floor, and yet the energy surpluses that made it economically viable to perch such an immensely complex infrastructure on top of the production of goods and services for consumers are going away. The result is a frantic struggle on everyone’s part to make sure that the other guy loses his job first. It’s a struggle that all of them will ultimately lose—as the energy surplus needed to support it dwindles away, so will the entire system that’s perched on that high but precarious support—and so, as long as that system remains in place, getting hired by an employer, paid a regular wage or salary, and given work and a workplace to produce goods and services that the employer then sells to someone else, is going to become increasingly rare and increasingly unrewarding. 
That transformation is already well under way. Nobody I know personally who works for an employer in the sense I’ve just outlined is prospering in today’s American economy.  Most of the people I know who are employees in the usual sense of the word are having their benefits slashed, their working conditions worsened, their hours cut, and their pay reduced by one maneuver or another, and the threat of being laid off is constantly hovering over their heads.  The few exceptions are treading water and hoping to escape the same fate. None of this is accidental, and none of it is merely the result of greed on the part of the very rich, though admittedly the culture of executive kleptocracy at the upper end of the American social pyramid is making things a good deal worse than they might otherwise be.
The people I know who are prospering right now are those who produce goods and services for their own use, and provide goods and services directly to other people, without having an employer to provide them with work, a workplace, and a regular wage or salary. Some of these people have to stay under the radar screen of the current legal and regulatory system, since the people who work in that system are trying to preserve their own jobs by making life difficult for those who try to do without their services. Others can do things more openly. All of them have sidestepped as many as possible of the infrastructure services that are supposed to be part of an employee’s working life—for example, they aren’t getting trained at universities, since the US academic industry these days is just another predatory business sector trying to keep itself afloat by running others into the ground, and they aren’t going to banks for working capital for much the same reason. They’re using their own labor, their own wits, and their own personal connections with potential customers, to find a niche in which they can earn the money (or barter for the goods) they need or want.
I’d like to suggest that this is the wave of the future—not least because this is how economic life normally operates in nonindustrial societies, where the vast majority of people in the workforce are directly engaged in the production of goods and services for themselves and their own customers.  The surplus that supports all those people in management, finance, and so on is a luxury that nonindustrial societies don’t have. In the most pragmatic of economic senses, collapsing now and avoiding the rush involves getting out of a dying model of economics before it drags you down, and finding your footing in the emerging informal economy while there’s still time to get past the worst of the learning curve.
Playing by the rules of a dying economy, that is, is not a strategy with a high success rate or a long shelf life. Those of my readers who are still employed in the usual sense of the term may choose to hold onto that increasingly rare status, but it’s not wise for them to assume that such arrangements will last indefinitely; using the available money and other resources to get training, tools, and skills for some other way of getting by would probably be a wise strategy. Those of my readers who have already fallen through the widening cracks of the employment economy will have a harder row to hoe in many cases; for them, the crucial requirement is getting access to food, shelter, and other necessities while figuring out what to do next and getting through any learning curve that might be required.
All these are challenges; still, like the broader challenge of coping with the decline and fall of a civilization, they are challenges that countless other people have met in other places and times. Those who are willing to set aside currently popular fantasies of entitlement and the fashionable pleasures of despair will likely be in a position to do the same thing this time around, too.

The Four Industrial Revolutions

Wed, 2014-04-09 16:34
Last week’s post on the vacuous catchphrases that so often substitute for thought in today’s America referenced only a few examples of the species under discussion.  It might someday be educational, or at least entertaining, to write a sequel to H.L. Mencken’s The American Credo, bringing his choice collection of thoughtstoppers up to date with the latest fashionable examples; still, that enticing prospect will have to wait for some later opportunity. In the meantime, those who liked my suggestion of Peak Oil Denial Bingo will doubtless want to know that cards can now be downloaded for free.
What I’d like to do this week is talk about another popular credo, one that plays a very large role in blinding people nowadays to the shape of the future looming up ahead of us all just now. In an interesting display of synchronicity, it came up in a conversation I had while last week’s essay was still being written. A friend and I were talking about the myth of progress, the facile and popular conviction that all human history follows an ever-ascending arc from the caves to the stars; my friend noted how disappointed he’d been with a book about the future that backed away from tomorrow’s challenges into the shelter of a comforting thoughtstopper:  “Technology will always be with us.”
Let’s take a moment to follow the advice I gave in last week’s post and think about what, if anything, that actually means. Taken in the most literal sense, it’s true but trivial. Toolmaking is one of our species’ core evolutionary strategies, and so it’s a safe bet that human beings will have some variety of technology or other as long as our species survives. That requirement could just as easily be satisfied, though, by a flint hand axe as by a laptop computer—and a flint hand axe is presumably not what people who use that particular thoughtstopper have in mind.
Perhaps we might rephrase the credo, then, as “modern technology will always be with us.” That’s also true in a trivial sense, and false in another, equally trivial sense. In the first sense, every generation has its own modern technology; the latest up-to-date flint hand axes were, if you’ll pardon the pun, cutting-edge technology in the time of the Neanderthals.  In the second sense, much of every generation’s modern technology goes away promptly with that generation; whichever way the future goes, much of what counts as modern technology today will soon be no more modern and cutting-edge than eight-track tape players or Victorian magic-lantern projectors. That’s as true if we get a future of continued progress as it is if we get a future of regression and decline.
Perhaps our author means something like “some technology at least as complex as what we have now, and fulfilling most of the same functions, will always be with us.” This is less trivial but it’s quite simply false, as historical parallels show clearly enough. Much of the technology of the Roman era, from wheel-thrown pottery to central heating, was lost in most of the western Empire and had to be brought in from elsewhere centuries later.  In the dark ages that followed the fall of Mycenean Greece, even so simple a trick as the art of writing was lost, while the history of Chinese technology before the modern era is a cycle in which many discoveries made during the heyday of each great dynasty were lost in the dark age that followed its decline and fall, and had to be rediscovered when stability and prosperity returned. For people living in each of these dark ages, technology comparable to what had been in use before the dark age started was emphatically not always with them.
For that matter, who is the “us” that we’re discussing here? Many people right now have no access to the technologies that middle-class Americans take for granted. For all the good that modern technology does them, today’s rural subsistence farmers, laborers in sweatshop factories, and the like might as well be living in some earlier era. I suspect our author is not thinking about such people, though, and the credo thus might be phrased as “some technology at least as complex as what middle-class people in the industrial world have now, providing the same services they have come to expect, will always be available to people of that same class.” Depending on how you define social classes, that’s either true but trivial—if “being middle class” equals “having access to the technology today’s middle classes have,” no middle class people will ever be deprived of such a technology because, by definition, there will be no middle class people once the technology stops being available—or nontrivial but clearly false—plenty of people who think of themselves as middle class Americans right now are losing access to a great deal of technology as economic contraction deprives them of their jobs and incomes and launches them on new careers of downward mobility and radical impoverishment.
Well before the analysis got this far, of course, anyone who’s likely to mutter the credo “Technology will always be with us” will have jumped up and yelled, “Oh for heaven’s sake, you know perfectly well what I mean when I use that word! You know, technology!”—or words to that effect. Now of course I do know exactly what the word means in that context: it’s a vague abstraction with no real conceptual meaning at all, but an ample supply of raw emotional force.  Like other thoughtstoppers of the same kind, it serves as a verbal bludgeon to prevent people from talking or even thinking about the brittle, fractious, ambivalent realities that shape our lives these days. Still, let’s go a little further with the process of analysis, because it leads somewhere that’s far from trivial.
Keep asking a believer in the credo we’re discussing the sort of annoying questions I’ve suggested above, and sooner or later you’re likely to get a redefinition that goes something like this: “The coming of the industrial revolution was a major watershed in human history, and no future society of any importance will ever again be deprived of the possibilities opened up by that revolution.” Whether or not that turns out to be true is a question nobody today can answer, but it’s a claim worth considering, because history shows that enduring shifts of this kind do happen from time to time. The agricultural revolution of c. 9000 BCE and the urban revolution of c. 3500 BCE were both decisive changes in human history.  Even though there were plenty of nonagricultural societies after the first, and plenty of nonurban societies after the second, the possibilities opened up by each revolution were always options thereafter, when and where ecological and social circumstances permitted.
Some 5500 years passed between the agricultural revolution and the urban revolution, and since it’s been right around 5500 years since the urban revolution began, a case could probably be made that we were due for another. Still, let’s take a closer look at the putative third revolution. What exactly was the industrial revolution? What changed, and what future awaits those changes?
That’s a far more subtle question than it might seem at first glance, because the cascade of changes that fit under the very broad label “the industrial revolution” weren’t all of a piece. I’d like to suggest, in fact, that there was not one industrial revolution, but four of them—or, more precisely, three and a half. Lewis Mumford’s important 1934 study Technics and Civilization identified three of those revolutions, though the labels he used for them—the eotechnic, paleotechnic, and neotechnic phases—shoved them into a linear scheme of progress that distorts many of their key features. Instead, I propose to borrow the same habit people use when they talk about the Copernican and Darwinian revolutions, and name the revolutions after individuals who played crucial roles in making them happen.
First of all, then—corresponding to Mumford’s eotechnic phase—is the Baconian revolution, which got under way around 1600. It takes its name from Francis Bacon, who was the first significant European thinker to propose that what he called natural philosophy and we call science ought to be reoriented away from the abstract contemplation of the cosmos, and toward making practical improvements in the technologies of the time. Such improvements were already under way, carried out by a new class of “mechanicks” who had begun to learn by experience that building a faster ship, a sturdier plow, a better spinning wheel, or the like could be a quick route to prosperity, and encouraged by governments eager to cash in new inventions for the more valued coinage of national wealth and military victory.
The Baconian revolution, like those that followed it, brought with it a specific suite of technologies. Square-rigged ships capable of  long deepwater voyages revolutionized international trade and naval warfare; canals and canal boats had a similar impact on domestic transport systems. New information and communication media—newspapers, magazines, and public libraries—were crucial elements of the Baconian technological suite, which also encompassed major improvements in agriculture and in metal and glass manufacture, and significant developments in the use of wind and water power, as well as the first factories using division of labor to allow mass production.
The second revolution—corresponding to Mumford’s paleotechnic phase—was the Wattean revolution, which got started around 1780. This takes its name, of course, from James Watt, whose redesign of the steam engine turned it from a convenience for the mining industry to the throbbing heart of a wholly new technological regime, replacing renewable energy sources with concentrated fossil fuel energy and putting that latter to work in every economically viable setting. The steamship was the new vehicle of international trade, the railroad the corresponding domestic transport system; electricity came in with steam, and so did the telegraph, the major new communications technology of the era, while mass production of steel via the Bessemer process had a massive impact across the economic sphere.
The third revolution—corresponding to Mumford’s neotechnic phase—was the Ottonian revolution, which took off around 1890. I’ve named this revolution after Nikolaus Otto, who invented the four-cycle internal combustion engine in 1876 and kickstarted the process that turned petroleum from a source of lamp fuel to the resource that brought the industrial age to its zenith. In the Ottonian era, international trade shifted to diesel-powered ships, supplemented later on by air travel; the domestic transport system was the automobile; the rise of vacuum-state electronics made radio (including television, which is simply an application of radio technology) the major new communications technology; and the industrial use of organic chemistry, turning petroleum and other fossil fuels into feedstocks for plastics, gave the Ottonian era its most distinctive materials.
The fourth, partial revolution, which hadn’t yet begun when Mumford wrote his book, was the Fermian revolution, which can be dated quite precisely to 1942 and is named after Enrico Fermi, the designer and builder of the first successful nuclear reactor.  The keynote of the Fermian era was the application of subatomic physics, not only in nuclear power but also in solid-state electronic devices such as the transistor and the photovoltaic cell. In the middle years of the 20th century, a great many people took it for granted that the Fermian revolution would follow the same trajectory as its Wattean and Ottonian predecessors: nuclear power would replace diesel power in freighters, electricity would elbow aside gasoline as the power source for domestic transport, and nucleonics would become as important as electronics as a core element in new technologies yet unimagined.
Unfortunately for those expectations, nuclear power turned out to be a technical triumph but an economic flop.  Claims that nuclear power would make electricity too cheap to meter ran face first into the hard fact that no nation anywhere has been able to have a nuclear power industry without huge and ongoing government subsidies, while nuclear-powered ships were relegated to the navies of very rich nations, which didn’t have to turn a profit and so could afford to ignore the higher construction and operating costs. Nucleonics turned out to have certain applications, but nothing like as many or as lucrative as the giddy forecasts of 1950 suggested.  Solid state electronics, on the other hand, turned out to be economically viable, at least in a world with ample fossil fuel supplies, and made the computer and the era’s distinctive communications medium, the internet, economically viable propositions.
The Wattean, Ottonian, and Fermian revolutions thus had a core theme in common. Each of them relied on a previously untapped energy resource—coal, petroleum, and uranium, respectively—and set out to build a suite of technologies to exploit that resource and the forms of energy it made available. The scientific and engineering know-how that was required to manage each power source then became the key toolkit for the technological suite that unfolded from it; from the coal furnace, the Bessemer process for making steel was a logical extension, just as the knowledge of hydrocarbon chemistry needed for petroleum refining became the basis for plastics and the chemical industry, and the same revolution in physics that made nuclear fission reactors possible also launched solid state electronics—it’s not often remembered, for example, that Albert Einstein got his Nobel prize for understanding the process that makes PV cells work, not for the theory of relativity.
Regular readers of this blog will probably already have grasped the core implication of this common theme. The core technologies of the Wattean, Ottonian, and Fermian eras all depend on access to large amounts of specific nonrenewable resources.  Fermian technology, for example, demands fissile material for its reactors and rare earth elements for its electronics, among many other things; Ottonian technology demands petroleum and natural gas, and some other resources; Wattean technology demands coal and iron ore. It’s sometimes possible to substitute one set of materials for another—say, to process coal into liquid fuel—but there’s always a major economic cost involved, even if there’s an ample and inexpensive supply of the other resource that isn’t needed for some other purpose.
In today’s world, by contrast, the resources needed for all three technological suites are being used at breakneck rates and thus are either already facing depletion or will do so in the near future. When coal has already been mined so heavily that sulfurous, low-energy brown coal—the kind that miners in the 19th century used to discard as waste—has become the standard fuel for coal-fired power plants, for example, it’s a bit late to talk about a coal-to-liquids program to replace any serious fraction of the world’s petroleum consumption: the attempt to do so would send coal prices soaring to economy-wrecking heights.  Richard Heinberg has pointed out in his useful book Peak Everything, for that matter, that a great deal of the coal still remaining in the ground will take more energy to extract than it will produce when burnt, making it an energy sink rather than an energy source.
Thus we can expect very large elements of Wattean, Ottonian, and Fermian technologies to stop being economically viable in the years ahead, as depletion drives up resource costs and the knock-on effects of the resulting economic contraction force down demand. That doesn’t mean that every aspect of those technological suites will go away, to be sure.  It’s not at all unusual, in the wake of a fallen civilization, to find “orphan technologies” that once functioned as parts of a coherent technological suite, still doing their jobs long after the rest of the suite has fallen out of use.  Just as Roman aqueducts kept bringing water to cities in the post-Roman dark ages whose inhabitants had neither the resources nor the knowledge to build anything of the kind, it’s quite likely that (say) hydroelectric facilities in certain locations will stay in use for centuries to come, powering whatever electrical equipment can be maintained or built from local resources, even if the people who tend the dams and use the electricity have long since lost the capacity to build turbines, generators, or dams at all.
Yet there’s another issue involved, because the first of the four industrial revolutions I’ve discussed in this essay—the Baconian revolution—was not dependent on nonrenewable resources.  The suite of technologies that unfolded from Francis Bacon’s original project used the same energy sources that everyone in the world’s urban-agricultural societies had been using for more than three thousand years: human and animal muscle, wind, water, and heat from burning biomass. Unlike the revolutions that followed it, to put the same issue in a different but equally relevant way, the Baconian revolution worked within the limits of the energy budget the Earth receives each year from the Sun, instead of drawing down stored sunlight from the Earth’s store of fossil carbon or its much more limited store of fissile isotopes.  The Baconian era simply used that annual solar budget in a more systematic way than previous societies managed, by directing the considerable intellectual skills of the natural philosophers of the day toward practical ends.
Because of their dependence on nonrenewable resources, the three later revolutions were guaranteed all along to be transitory phases. The Baconian revolution need not be, and I think that there’s a noticeable chance that it will not be. By that I mean, to begin with, that the core intellectual leap that made the Baconian revolution possible—the  scientific method—is sufficiently widespread at this point that with a little help, it may well get through the decline and fall of our civilization and become part of the standard toolkit of future civilizations, in much the same way that classical logic survived the wreck of Rome to be taken up by successor civilizations across the breadth of the Old World.
Still, that’s not all I mean to imply here. The specific technological suite that developed in the wake of the Baconian revolution will still be viable in a post-fossil fuel world, wherever ecological and social circumstances permit it to exist at all. Deepwater maritime shipping, canal-borne transport across nations and subcontinents, mass production of goods using the division of labor as an organizing principle, extensive use of wind and water power, and widespread literacy and information exchange involving print media, libraries, postal services, and the like, are all options available to societies in the deindustrial world. So are certain other technologies that evolved in the post-Baconian era, but fit neatly within the Baconian model: solar thermal technologies, for example, and those forms of electronics that can be economically manufactured and powered with the limited supplies of concentrated energy a sustainable society will have on hand.
I’ve suggested in previous posts here, and in my book The Ecotechnic Future, that our current industrial society may turn out to be merely the first, most wasteful, and least durable of what might  best be called “technic societies”—that is, human societies that get a large fraction of their total energy supply from sources other than human and animal muscle, and support complex technological suites on that basis. The technologies of the Baconian era, I propose, offer a glimpse of what an emerging ecotechnic society might look like in practice—and a sense of the foundations on which the more complex ecotechnic societies of the future will build.
When the book mentioned at the beginning of this essay claimed that “technology will always be with us,” it’s a safe bet that the author wasn’t thinking of tall ships, canal boats, solar greenhouses, and a low-power global radio net, much less the further advances along the same lines that might well be possible in a post-fossil fuel world. Still, it’s crucial to get outside the delusion that the future must either be a flashier version of the present or a smoldering wasteland full of bleached bones, and start to confront the wider and frankly more interesting possibilities that await our descendants.
***************Along these same lines, I’d like to remind readers that this blog’s second post-peak oil science fiction contest has less than a month left to run. Those of you who are still working on stories need to get them finished, posted online, and linked to a comment on this blog before May 1 to be eligible for inclusion in the second After Oil anthology. Get ‘em in!

Mentats Wanted, Will Train

Wed, 2014-04-02 17:17
The theme of last week’s post here on The Archdruid Report—the strategy of preserving or reviving technologies for the deindustrial future now, before the accelerating curve of decline makes that task more difficult than it already is—can be applied very broadly indeed. Just now, courtesy of the final blowoff of the age of cheap energy, we have relatively easy access to plenty of information about what worked in the past; some other resources are already becoming harder to get, but there’s still time and opportunity to accomplish a great deal.

I’ll be talking about some of the possibilities as we proceed, and with any luck, other people will get to work on projects of their own that I haven’t even thought of. This week, though, I want to take Gustav Erikson’s logic in a direction that probably would have made the old sea dog scratch his head in puzzlement, and talk about how a certain set of mostly forgotten techniques could be put back into use right now to meet a serious unmet need in contemporary American society.
The unmet need I have in mind is unusually visible just now, courtesy of the recent crisis in the Ukraine. I don’t propose to get into the whys and wherefores of that crisis just now, except to note that since the collapse of the Austro-Hungarian Empire, the small nations of eastern Europe have been grist between the spinning millstones of Russia and whichever great power dominates western Europe. It’s not a comfortable place to be; Timothy Snyder’s terse description of 20th century eastern Europe as “bloodlands” could be applied with equal force to any set of small nations squeezed between empires, and it would take quite a bit of unjustified faith in human goodness to think that the horrors of the last century have been safely consigned to the past.
The issue I want to discuss, rather, has to do with the feckless American response to that crisis. Though I’m not greatly interested in joining the chorus of American antigovernment activists fawning around Vladimir Putin’s feet these days, it’s fair to say that he won this one. Russia’s actions caught the United States and EU off balance, secured the Russian navy’s access to the Black Sea and the Mediterranean, and boosted Putin’s already substantial popularity at home. By contrast, Obama came across as amateurish and, worse, weak.  When Obama announced that the US retaliation would consist of feeble sanctions against a few Russian banks and second-string politicians, the world rolled its eyes, and the Russian Duma passed a resolution scornfully requesting Obama to apply those same sanctions to every one of its members.
As the crisis built, there was a great deal of talk in the media about Europe’s dependence on Russian natural gas, and the substantial influence over European politics that Russia has as a result of that unpalatable fact. It’s a major issue, and unlikely to go away any time soon; around a third of the natural gas that keeps Europeans from shivering in the dark each winter comes from Russian gas fields, and the Russian government has made no bones about the fact that it could just as well sell that gas to somebody to Russia’s south or east instead. It was in this context that American politicians and pundits started insisting at the top of their lungs that the United States had a secret weapon against the Sov—er, Russian threat: exports of abundant natural gas from America, which would replace Russian gas in Europe’s stoves, furnaces, and power plants.
As Richard Heinberg pointed out trenchantly a few days back in a typically spot-on essay, there’s only one small problem with this cozy picture: the United States has no spare natural gas to export.  It’s a net importer of natural gas, as it typically burns over a hundred billion more cubic feet of gas each month than it produces domestically.  What’s more, even according to the traditionally rose-colored forecasts issued by the EIA, it’ll be 2020 at the earliest before the United States has any natural gas to spare for Europe’s needs. Those forecasts, by the way, blithely assume that the spike in gas production driven by the recent fracking bubble will just keep on levitating upwards for the foreseeable future; if this reminds you of the rhetoric surrounding tech stocks in the runup to 2000, housing prices in the runup to 2008, or equivalent phenomena in the history of any other speculative swindle you care to name, let’s just say you’re not alone.
According to those forecasts that start from the annoying fact that the laws of physics and geology do actually apply to us, on the other hand, the fracking boom will be well into bust territory by 2020, and those promised torrents of natural gas that will allegedly free Europe from Russian influence will therefore never materialize at all. At the moment, furthermore, boasting about America’s alleged surplus of natural gas for export is particularly out of place, because US natural gas inventories currently in storage are less than half their five-year average level for this time of year, having dropped precipitously since December. Since all this is public information, we can be quite confident that the Russians are aware of it, and this may well explain some of the air of amused contempt with which Putin and his allies have responded to American attempts to rattle a saber that isn’t there.
Any of the politicians and pundits who participated in that futile exercise could have found out the problems with their claim in maybe two minutes of internet time.  Any of the reporters and editors who printed those claims at face value could have done the same thing. I suppose it’s possible that the whole thing was a breathtakingly cynical exercise of Goebbels’ “Big Lie” principle, intended to keep Americans from noticing that Obama’s people armed themselves with popguns for a shootout at the OK Corral. I find this hard to believe, though, because the same kind of thinking—or, more precisely, nonthinking—is so common in America these days.
It’s indicative that my post here two weeks ago brought in a bumper crop of the same kind of illogic. My post took on the popular habit of using the mantra “it’s different this time” to insist that the past has nothing to teach us about the present and the future. Every event, I pointed out, has some features that set it apart from others, and other features that it shares in common with others; pay attention to the common features and you can observe the repeating patterns, which can then be adjusted to take differences into account.  Fixate on the differences and deny the common features, though, and you have no way to test your beliefs—which is great if you want to defend your beliefs against reasonable criticism, but not so useful if you want to make accurate predictions about where we’re headed.
Did the critics of this post—and there were quite a few of them—challenge this argument, or even address it? Not in any of the peak oil websites I visited. What happened instead was that commenters brandished whatever claims about the future are dearest to their hearts and then said, in so many words, “It’s different this time”—as though that somehow answered me. It was quite an impressive example of sheer incantation, the sort of thing we saw not that long ago when Sarah Palin fans were trying to conjure crude oil into America’s depleted oilfields by chanting “Drill, baby, drill” over and over again. I honestly felt as though I’d somehow dozed off at the computer and slipped into a dream in which I was addressing an audience of sheep, who responded by bleating “But it’s different this ti-i-i-i-ime” in perfect unison.
A different mantra sung to the same bleat, so to speak, seems to have been behind the politicians and pundits, and all that nonexistent natural gas they thought was just waiting to be exported to Europe. The thoughtstopping phrase here is “America has abundant reserves of natural gas.” It will doubtless occur to many of my readers that this statement is true, at least for certain values of that nicely vague term “abundant,” just as it’s true that every historical event differs in at least some way from everything that’s happened in the past, and that an accelerated program of drilling can (and in fact did) increase US petroleum production by a certain amount, at least for a while. The fact that each of these statements is trivially true does not make any of them relevant.
That is to say, a remarkably large number of Americans, including the leaders of our country and the movers and shakers of our public opinion, are so inept at the elementary skills of thinking that they can’t tell the difference between mouthing a platitude and having a clue.
I suppose this shouldn’t surprise me as much as it does. For decades now, American public life has been dominated by thoughtstoppers of this kind—short, emotionally charged declarative sentences, some of them trivial, some of them incoherent, none of them relevant and all of them offered up as sound bites by politicians, pundits, and ordinary Americans alike, as though they meant something and proved something. The redoubtable H.L. Mencken, writing at a time when such things were not quite as universal in the American mass mind as they have become since then, called them “credos.”  It was an inspired borrowing from the Latin credo, “I believe,” but its relevance extends far beyond the religious sphere.
Just as plenty of believing Americans in Mencken’s time liked to affirm their fervent faith in the doctrines of whatever church they attended without having the vaguest idea of what those doctrines actually meant, a far vaster number of Americans these days—religious, irreligious, antireligious, or concerned with nothing more supernatural than the apparent capacity of Lady Gaga’s endowments to defy the laws of gravity—gladly affirm any number of catchphrases about which they seem never to have entertained a single original thought. Those of my readers who have tried to talk about the future with their family and friends will be particularly familiar with the way this works; I’ve thought more than once of providing my readers with Bingo cards marked with the credos most commonly used to silence discussions of our future—“they’ll think of something,” “technology can solve any problem,” “the world’s going to end soon anyway,” “it’s different this time,” and so on—with some kind of prize for whoever fills theirs up first.
The prevalence of credos, though, is only the most visible end of a culture of acquired stupidity that I’ve discussed here in previous posts, and Erik Lindberg has recently anatomized in a crisp and thoughtful blog post. That habit of cultivated idiocy is a major contributor to the crisis of our age, but a crisis is always an opportunity, and with that in mind, I’d like to propose that it’s time for some of us, at least, to borrow a business model from the future, and start getting prepared for future job openings as mentats.
In Frank Herbert’s iconic SF novel Dune, as many of my readers will be aware, a revolt against computer technology centuries before the story opened led to a galaxywide ban on thinking machines—“Thou shalt not make a machine in the image of a human mind”—and a corresponding focus on developing human capacities instead of replacing them with hardware. The mentats were among the results: human beings trained from childhood to absorb, integrate, and synthesize information. Think of them as the opposite end of human potential from the sort of credo-muttering couch potatoes who seem to make up so much of the American population these days:  ask a mentat if it really is different this time, and after he’s spent thirty seconds or so reviewing the entire published literature on the subject, he’ll give you a crisp first-approximation analysis explaining what’s different, what’s similar, which elements of each category are relevant to the situation, and what your best course of action would be in response.
Now of course the training programs needed to get mentats to this level of function haven’t been invented yet, but the point still stands: people who know how to think, even at a less blinding pace than Herbert’s fictional characters manage, are going to be far better equipped to deal with a troubled future than those who haven’t.  The industrial world has been conducting what amounts to a decades-long experiment to see whether computers can make human beings more intelligent, and the answer at this point is a pretty firm no. In particular, computers tend to empower decision makers without making them noticeably smarter, and the result by and large is that today’s leaders are able to make bad decisions more easily and efficiently than ever before. That is to say, machines can crunch data, but it takes a mind to turn data into information, and a well-trained and well-informed mind to refine information into wisdom.
What makes a revival of the skills of thinking particularly tempting just now is that the bar is set so low. If you know how to follow an argument from its premises to its conclusion, recognize a dozen or so of the most common logical fallacies, and check the credentials of a purported fact, you’ve just left most Americans—including the leaders of our country and the movers and shakers of our public opinion—way back behind you in the dust. To that basic grounding in how to think, add a good general knowledge of history and culture and a few branches of useful knowledge in which you’ve put some systematic study, and you’re so far ahead of the pack that you might as well hang out your shingle as a mentat right away.
Now of course it may be a while before there’s a job market for mentats—in the post-Roman world, it took several centuries for those people who preserved the considerable intellectual toolkit of the classical world to find a profitable economic niche, and that required them to deck themselves out in tall hats with moons and stars on them. In the interval before the market for wizards opens up again, though, there are solid advantages to be gained by the sort of job training I’ve outlined, unfolding from the fact that having mental skills that go beyond muttering credos makes it possible to form expectations about the future that are considerably more accurate than the ones guiding most Americans today.
This has immediate practical value in all sorts of common, everyday situations these days. When all the people you know are rushing to sink every dollar they have in the speculative swindle du jour, for example, you’ll quickly recognize the obvious signs of a bubble in the offing, walk away, and keep your shirt while everyone else is losing theirs. When someone tries to tell you that you needn’t worry about energy costs or shortages because the latest piece of energy vaporware will surely solve all our problems, you’ll be prepared to ignore him and go ahead with insulating your attic, and when someone else insists that the Earth is sure to be vaporized any day now by whatever apocalypse happens to be fashionable that week, you’ll be equally prepared to ignore him and go ahead with digging the new garden bed. 
When the leaders of your country claim that an imaginary natural gas surplus slated to arrive six years from now will surely make Putin blink today, for that matter, you’ll draw the logical conclusion, and get ready for the economic and political impacts of another body blow to what’s left of America’s faltering global power and reputation. It may also occur to you—indeed, it may have done so already—that the handwaving about countering Russia is merely an excuse for building the infrastructure needed to export American natural gas to higher-paying global markets, which will send domestic gas prices soaring to stratospheric levels in the years ahead; this recognition might well inspire you to put a few extra inches of insulation up there in the attic, and get a backup heat source that doesn’t depend either on gas or on gas-fired grid electricity, so those soaring prices don’t have the chance to clobber you.
If these far from inconsiderable benefits tempt you, dear reader, I’d like to offer you an exercise as the very first step in your mentat training.  The exercise is this: the next time you catch someone (or, better yet, yourself) uttering a familiar thoughtstopper about the future—“It’s different this time,” “They’ll think of something,” “There are no limits to what human beings can achieve,” “The United States has an abundant supply of natural gas,” or any of the other entries in the long and weary list of contemporary American credos—stop right there and think about it. Is the statement true? Is it relevant? Does it address the point under discussion?  Does the evidence that supports it, if any does, outweigh the evidence against it? Does it mean what the speaker thinks it means? Does it mean anything at all?
There’s much more involved than this in learning how to think, of course, and down the road I propose to write a series of posts on the subject, using as raw material for exercises more of the popular idiocies behind which America tries to hide from the future. I would encourage all the readers of this blog to give this exercise a try, though. In an age of accelerating decline, the habit of letting arbitrary catchphrases replace actual thinking is a luxury that nobody can really afford, and those who cling to such things too tightly can expect to be blindsided by a future that has no interest in playing along with even the most fashionable credos.
*******************In not unrelated news, I’m pleased to report that the School of Economic Science will be hosting a five week course in London on Economics, Energy and Environment, beginning April 29 of this year, based in part on ideas from my book The Wealth of Nature. The course will finish up with a conference on June 1 at which, ahem, I’ll be one of the speakers. Details are at www.eeecourse.org.