AODA Blog

The Dream of the Machine

Wed, 2015-07-01 16:02
As I type these words, it looks as though the wheels are coming off the global economy. Greece and Puerto Rico have both suspended payments on their debts, and China’s stock market, which spent the last year in a classic speculative bubble, is now in the middle of a classic speculative bust. Those of my readers who’ve read John Kenneth Galbraith’s lively history The Great Crash 1929 already know all about the Chinese situation, including the outcome—and since vast amounts of money from all over the world went into Chinese stocks, and most of that money is in the process of turning into twinkle dust, the impact of the crash will inevitably proliferate through the global economy.
So, in all probability, will the Greek and Puerto Rican defaults. In today’s bizarre financial world, the kind of bad debts that used to send investors backing away in a hurry attract speculators in droves, and so it turns out that some big New York hedge funds are in trouble as a result of the Greek default, and some of the same firms that got into trouble with mortgage-backed securities in the recent housing bubble are in the same kind of trouble over Puerto Rico’s unpayable debts. How far will the contagion spread? It’s anybody’s guess.
Oh, and on another front, nearly half a million acres of Alaska burned up in a single day last week—yes, the fires are still going—while ice sheets in Greenland are collapsing so frequently and forcefully that the resulting earthquakes are rattling seismographs thousands of miles away. These and other signals of a biosphere in crisis make good reminders of the fact that the current economic mess isn’t happening in a vacuum. As Ugo Bardi pointed out in a thoughtful blog post, finance is the flotsam on the surface of the ocean of real exchanges of real goods and services, and the current drumbeat of financial crises is symptomatic of the real crisis—the arrival of the limits to growth that so many people have been discussing, and so many more have been trying to ignore, for the last half century or so.
A great many people in the doomward end of the blogosphere are talking about what’s going on in the global economy and what’s likely to blow up next. Around the time the next round of financial explosions starts shaking the world’s windows, a great many of those same people will likely be talking about what to do about it all.  I don’t plan on joining them in that discussion. As blog posts here have pointed out more than once, time has to be considered when getting ready for a crisis. The industrial world would have had to start backpedaling away from the abyss decades ago in order to forestall the crisis we’re now in, and the same principle applies to individuals.  The slogan “collapse now and avoid the rush!” loses most of its point, after all, when the rush is already under way.
Any of my readers who are still pinning their hopes on survival ecovillages and rural doomsteads they haven’t gotten around to buying or building yet, in other words, are very likely out of luck. They, like the rest of us, will be meeting this crisis where they are, with what they have right now. This is ironic, in that ideas that might have been worth adopting three or four years ago are just starting to get traction now. I’m thinking here particularly of a recent article on how to use permaculture to prepare for a difficult future, which describes the difficult future in terms that will be highly familiar to readers of this blog. More broadly, there’s a remarkable amount of common ground between that article and the themes of my book Green Wizardry. The awkward fact remains that with the global banking industry showing every sign of freezing up the way it did in 2008, putting credit for land purchases out of reach of most people for years to come, the article’s advice may have come rather too late.
That doesn’t mean, of course, that my readers ought to crawl under their beds and wait for death. What we’re facing, after all, isn’t the end of the world—though it may feel like that for those who are too deeply invested, in any sense of that last word you care to use, in the existing order of industrial society. As Visigothic mommas used to remind their impatient sons, Rome wasn’t sacked in a day. The crisis ahead of us marks the end of what I’ve called abundance industrialism and the transition to scarcity industrialism, as well as the end of America’s global hegemony and the emergence of a new international order whose main beneficiary hasn’t been settled yet. Those paired transformations will most likely unfold across several decades of economic chaos, political turmoil, environmental disasters, and widespread warfare. Plenty of people got through the equivalent cataclysms of the first half of the twentieth century with their skins intact, even if the crisis caught them unawares, and no doubt plenty of people will get through the mess that’s approaching us in much the same condition.
Thus I don’t have any additional practical advice, beyond what I’ve already covered in my books and blog posts, to offer my readers just now. Those who’ve already collapsed and gotten ahead of the rush can break out the popcorn and watch what promises to be a truly colorful show.  Those who didn’t—well, you might as well get some popcorn going and try to enjoy the show anyway. If you come out the other side of it all, schoolchildren who aren’t even born yet may eventually come around to ask you awed questions about what happened when the markets crashed in ‘15.
In the meantime, while the popcorn is popping and the sidewalks of Wall Street await their traditional tithe of plummeting stockbrokers, I’d like to return to the theme of last week’s post and talk about the way that the myth of the machine—if you prefer, the widespread mental habit of thinking about the world in mechanistic terms—pervades and cripples the modern mind.
Of all the responses that last week’s post fielded, those I found most amusing, and also most revealing, were those that insisted that of course the universe is a machine, so is everything and everybody in it, and that’s that. That’s amusing because most of the authors of these comments made it very clear that they embraced the sort of scientific-materialist atheism that rejects any suggestion that the universe has a creator or a purpose. A machine, though, is by definition a purposive artifact—that is, it’s made by someone to do something. If the universe is a machine, then, it has a creator and a purpose, and if it doesn’t have a creator and a purpose, logically speaking, it can’t be a machine.
That sort of unintentional comedy inevitably pops up whenever people don’t think through the implications of their favorite metaphors. Still, chase that habit further along its giddy path and you’ll find a deeper absurdity at work. When people say “the universe is a machine,” unless they mean that statement as a poetic simile, they’re engaging in a very dubious sort of logic. As Alfred Korzybski pointed out a good many years ago, pretty much any time you say “this is that,” unless you implicitly or explicitly qualify what you mean in very careful terms, you’ve just babbled nonsense.
The difficulty lies in that seemingly innocuous word “is.” What Korzybski called the “is of identity”—the use of the word “is” to represent  =, the sign of equality—makes sense only in a very narrow range of uses.  You can use the “is of identity” with good results in categorical definitions; when I commented above that a machine is a purposive artifact, that’s what I was doing. Here is a concept, “machine;” here are two other concepts, “purposive” and “artifact;” the concept “machine” logically includes the concepts “purposive” and “artifact,” so anything that can be described by the words “a machine” can also be described as “purposive” and “an artifact.” That’s how categorical definitions work.
Let’s consider a second example, though: “a machine is a purple dinosaur.” That utterance uses the same structure as the one we’ve just considered.  I hope I don’t have to prove to my readers, though, that the concept “machine” doesn’t include the concepts “purple” and “dinosaur” in any but the most whimsical of senses.  There are plenty of things that can be described by the label “machine,” in other words, that can’t be described by the labels “purple” or “dinosaur.” The fact that some machines—say, electronic Barney dolls—can in fact be described as purple dinosaurs doesn’t make the definition any less silly; it simply means that the statement “no machine is a purple dinosaur” can’t be justified either.
With that in mind, let’s take a closer look at the statement “the universe is a machine.” As pointed out earlier, the concept “machine” implies the concepts “purposive” and “artifact,” so if the universe is a machine, somebody made it to carry out some purpose. Those of my readers who happen to belong to Christianity, Islam, or another religion that envisions the universe as the creation of one or more deities—not all religions make this claim, by the way—will find this conclusion wholly unproblematic. My atheist readers will disagree, of course, and their reaction is the one I want to discuss here. (Notice how “is” functions in the sentence just uttered: “the reaction of the atheists” equals “the reaction I want to discuss.” This is one of the few other uses of “is” that doesn’t tend to generate nonsense.)
In my experience, at least, atheists faced with the argument about the meaning of the word “machine” I’ve presented here pretty reliably respond with something like “It’s not a machine in that sense.” That response takes us straight to the heart of the logical problems with the “is of identity.” In what sense is the universe a machine? Pursue the argument far enough, and unless the atheist storms off in a huff—which admittedly tends to happen more often than not—what you’ll get amounts to “the universe and a machine share certain characteristics in common.” Go further still—and at this point the atheist will almost certainly storm off in a huff—and you’ll discover that the characteristics that the universe is supposed to share with a machine are all things we can’t actually prove one way or another about the universe, such as whether it has a creator or a purpose.
The statement “the universe is a machine,” in other words, doesn’t do what it appears to do. It appears to state a categorical identity; it actually states an unsupported generalization in absolute terms. It takes a mental model abstracted from one corner of human experience and applies it to something unrelated.  In this case, for polemic reasons, it does so in a predictably one-sided way: deductions approved by the person making the statement (“the universe is a machine, therefore it lacks life and consciousness”) are acceptable, while deductions the person making the statement doesn’t like (“the universe is a machine, therefore it was made by someone for some purpose”) get the dismissive response noted above.
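For readers who like to see such logic spelled out step by step, here is a minimal sketch in Python (mine, purely illustrative, with toy feature sets assumed for the purpose) of the difference between a categorical claim that has been checked and one that has simply been assumed:

```python
# A toy model of categorical definitions, assumed purely for illustration.
# A thing belongs to a category only if it has every feature the category requires.

MACHINE = {"purposive", "artifact"}            # what the concept "machine" requires
PURPLE_DINOSAUR = {"purple", "dinosaur"}

def is_a(thing_features, category_features):
    """Return True only if the thing exhibits all of the category's defining features."""
    return category_features <= thing_features  # subset test

barney_doll = {"purposive", "artifact", "purple", "dinosaur"}
print(is_a(barney_doll, MACHINE))           # True: it really is a machine
print(is_a(barney_doll, PURPLE_DINOSAUR))   # True: some machines are purple dinosaurs

universe = {"masses_in_motion"}             # only the features we can actually establish
print(is_a(universe, MACHINE))              # False: "purposive" and "artifact" are not
                                            # among the established features, so the claim
                                            # remains an assumption, not a conclusion
```

The test settles nothing, of course, unless the defining features are actually known to be present; write the missing features into the premises and you can deduce whatever you like from them, which is exactly the one-sided game described above.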
This sort of doublethink appears all through the landscape of contemporary nonconversation and nondebate, to be sure, but the problems with the “is of identity” don’t stop with its polemic abuse. Any time you say “this is that,” and mean something other than “this has some features in common with that,” you’ve just fallen into one of the core boobytraps hardwired into the structure of human thought.
Human beings think in categories. That’s what made ancient Greek logic, which takes categories as its basic element, so massive a revolution in the history of human thinking: by watching the way that one category includes or excludes another, which is what the Greek logicians did, you can squelch a very large fraction of human stupidities before they get a foothold. What Alfred Korzybski pointed out, in effect, is that there’s a metalogic that the ancient Greeks didn’t get to, and logical theorists since their time haven’t really tackled either: the extremely murky relationship between the categories we think with and the things we experience, which don’t come with category labels spraypainted on them.
Here is a green plant with a woody stem. Is it a tree or a shrub? That depends on exactly where you draw the line between those two categories, and as any botanist can tell you, that’s neither an easy nor an obvious thing. As long as you remember that categories exist within the human mind as convenient handles for us to think with, you can navigate around the difficulties, but when you slip into thinking that the categories are more real than the things they describe, you’re in deep, deep trouble.
It’s not at all surprising that human thought should have such problems built into it. If, as I do, you accept the Darwinian thesis that human beings evolved out of prehuman primates by the normal workings of the laws of evolution, it follows logically that our nervous systems and cognitive structures didn’t evolve for the purpose of understanding the truth about the cosmos; they evolved to assist us in getting food, attracting mates, fending off predators, and a range of similar, intellectually undemanding tasks. If, as many of my theist readers do, you believe that human beings were created by a deity, the yawning chasm between creator and created, between an infinite and a finite intelligence, stands in the way of any claim that human beings can know the unvarnished truth about the cosmos. Neither viewpoint supports the claim that a category created by the human mind is anything but a convenience that helps our very modest mental powers grapple with an ultimately incomprehensible cosmos.
Any time human beings try to make sense of the universe or any part of it, in turn, they have to choose from among the available categories in an attempt to make the object of inquiry fit the capacities of their minds. That’s what the founders of the scientific revolution did in the seventeenth century, by taking the category of “machine” and applying it to the universe to see how well it would fit. That was a perfectly rational choice from within their cultural and intellectual standpoint. The founders of the scientific revolution were Christians to a man, and some of them (for example, Isaac Newton) were devout even by the standards of the time; the idea that the universe had been made by someone for some purpose, after all, wasn’t problematic in the least to people who took it as given that the universe was made by God for the purpose of human salvation. It was also a useful choice in practical terms, because it allowed certain features of the universe—specifically, the behavior of masses in motion—to be accounted for and modeled with a clarity that previous categories hadn’t managed to achieve.
The fact that one narrowly defined aspect of the universe seems to behave like a machine, though, does not prove that the universe is a machine, any more than the fact that one machine happens to look like a purple dinosaur proves that all machines are purple dinosaurs. The success of mechanistic models in explaining the behavior of masses in motion proved that mechanical metaphors are good at fitting some of the observed phenomena of physics into a shape that’s simple enough for human cognition to grasp, and that’s all it proved. To go from that modest fact to the claim that the universe and everything in it are machines involves an intellectual leap of pretty spectacular scale. Part of the reason that leap was taken in the seventeenth century was the religious frame of scientific inquiry at that time, as already mentioned, but there was another factor, too.
It’s a curious fact that mechanistic models of the universe appeared in western European cultures, and became wildly popular there, well before the machines did. In the early seventeenth century, machines played a very modest role in the life of most Europeans; most tasks were done using hand tools powered by human and animal muscle, the way they had been done since the dawn of the agricultural revolution eight millennia or so before. The most complex devices available at the time were pendulum clocks, printing presses, handlooms, and the like—you know, the sort of thing that people these days use instead of machines when they want to get away from technology.
For reasons that historians of ideas are still trying to puzzle out, though, western European thinkers during these same years were obsessed with machines, and with mechanical explanations for the universe. Those latter ranged from the plausible to the frankly preposterous—René Descartes, for example, proposed a theory of gravity in which little corkscrew-shaped particles went zooming up from the earth to screw themselves into pieces of matter and yank them down. Until Isaac Newton, furthermore, theories of nature based on mechanical models didn’t actually explain that much, and until the cascade of inventive adaptations of steam power that ended with James Watt’s epochal steam engine nearly a century after Newton, the idea that machines could elbow aside craftspeople using hand tools and animals pulling carts was an unproven hypothesis. Yet a great many people in western Europe believed in the power of the machine as devoutly as their ancestors had believed in the power of the bones of the local saints.
A habit of thought very widespread in today’s culture assumes that technological change happens first and the world of ideas changes in response to it. The facts simply won’t support that claim, though. As the history of mechanistic ideas in science shows clearly, the ideas come first and the technologies follow—and there’s good reason why this should be so. Technologies don’t invent themselves, after all. Somebody has to put in the work to invent them, and then other people have to invest the resources to take them out of the laboratory and give them a role in everyday life. The decisions that drive invention and investment, in turn, are powerfully shaped by cultural forces, and these in turn are by no means as rational as the people influenced by them generally like to think.
People in western Europe and a few of its colonies dreamed of machines, and then created them. They dreamed of a universe reduced to the status of a machine, a universe made totally transparent to the human mind and totally subservient to the human will, and then set out to create it. That latter attempt hasn’t worked out so well, for a variety of reasons, and the rising tide of disasters sketched out in the first part of this week’s post unfolds in large part from the failure of that misbegotten dream. In the next few posts, I want to talk about why that failure was inevitable, and where we might go from here.

The Delusion of Control

Wed, 2015-06-24 15:45
I'm sure most of my readers have heard at least a little of the hullaballoo surrounding the release of Pope Francis’ encyclical on the environment, Laudato Si. It’s been entertaining to watch, not least because so many politicians in the United States who like to use Vatican pronouncements as window dressing for their own agendas have been left scrambling for cover now that the wind from Rome is blowing out of a noticeably different quarter.
Take Rick Santorum, a loudly Catholic Republican who used to be in the US Senate and now spends his time entertaining a variety of faux-conservative venues with his signature flavor of hate speech. Santorum loves to denounce fellow Catholics who disagree with Vatican edicts as “cafeteria Catholics,” and announced a while back that John F. Kennedy’s famous defense of the separation of church and state made him sick to his stomach. In the wake of Laudato Si, care to guess who’s elbowing his way to the head of the cafeteria line? Yes, that would be Santorum, who’s been insisting since the encyclical came out that the Pope is wrong and American Catholics shouldn’t be obliged to listen to him.
What makes all the yelling about Laudato Si a source of wry amusement to me is that it’s not actually a radical document at all. It’s a statement of plain common sense. It should have been obvious all along that treating the air as a gaseous sewer was a really dumb idea, and in particular, that dumping billions upon billions of tons of infrared-absorbing gases into the atmosphere would change its capacity for heat retention in unwelcome ways. It should have been just as obvious that all the other ways we maltreat the only habitable planet we’ve got were guaranteed to end just as badly. That this wasn’t obvious—that huge numbers of people find it impossible to realize that you can only wet your bed so many times before you have to sleep in a damp spot—deserves much more attention than it’s received so far.
It’s really a curious blindness, when you think about it. Since our distant ancestors climbed unsteadily down from the trees of late Pliocene Africa, the capacity to anticipate threats and do something about them has been central to the success of our species. A rustle in the grass might indicate the approach of a leopard, a series of unusually dry seasons might turn the local water hole into undrinkable mud: those of our ancestors who paid attention to such things, and took constructive action in response to them, were more likely to survive and leave offspring than those who shrugged and went on with business as usual. That’s why traditional societies around the world are hedged about with a dizzying assortment of taboos and customs meant to guard against every conceivable source of danger.
Somehow, though, we got from that to our present situation, where substantial majorities across the world’s industrial nations seem unable to notice that something bad can actually happen to them, where thoughtstoppers of the “I’m sure they’ll think of something” variety take the place of thinking about the future, and where, when something bad does happen to someone, the immediate response is to find some way to blame the victim for what happened, so that everyone else can continue to believe that the same thing can’t happen to them. A world where Laudato Si is controversial, not to mention necessary, is a world that’s become dangerously detached from the most basic requirements of collective survival.
For quite some time now, I’ve been wondering just what lies behind the bizarre paralogic with which most people these days turn blank and uncomprehending eyes on their onrushing fate. The process of writing last week’s blog post on the astonishing stupidity of US foreign policy, though, seems to have helped me push through to clarity on the subject. I may be wrong, but I think I’ve figured it out.
Let’s begin with the issue at the center of last week’s post, the really remarkable cluelessness with which US policy toward Russia and China has convinced both nations they have nothing to gain from cooperating with a US-led global order, and are better off allying with each other and opposing the US instead. US politicians and diplomats made that happen, and the way they did it was set out in detail in a recent and thoughtful article by Paul R. Pillar in the online edition of The National Interest.
Pillar’s article pointed out that the United States has evolved a uniquely counterproductive notion of how negotiation works. Elsewhere on the planet, people understand that when you negotiate, you’re seeking a compromise where you get whatever you most need out of the situation, while the other side gets enough of its own agenda met to be willing to cooperate. To the US, by contrast, negotiation means that the other side complies with US demands, and that’s the end of it. The idea that other countries might have their own interests, and might expect to receive some substantive benefit in exchange for cooperation with the US, has apparently never entered the heads of official Washington—and the absence of that idea has resulted in the cascading failures of US foreign policy in recent years.
It’s only fair to point out that the United States isn’t the only practitioner of this kind of self-defeating behavior. A first-rate example has been unfolding in Europe in recent months—yes, that would be the ongoing non-negotiations between the Greek government and the so-called troika, the coalition of unelected bureaucrats who are trying to force Greece to keep pursuing a failed economic policy at all costs. The attitude of the troika is simple: the only outcome they’re willing to accept is capitulation on the part of the Greek government, and they’re not willing to give anything in return. Every time the Greek government has tried to point out to the troika that negotiation usually involves some degree of give and take, the bureaucrats simply give them a blank look and reiterate their previous demands.
That attitude has had drastic political consequences. It’s already convinced Greeks to elect a radical leftist government in place of the compliant centrists who ruled the country in the recent past. If the leftists fold, the neofascist Golden Dawn party is waiting in the wings. The problem with the troika’s stance is simple: the policies they’re insisting that Greece must accept have never—not once in the history of market economies—produced anything but mass impoverishment and national bankruptcy. The Greeks, among many other people, know this; they know that Greece will not return to prosperity until it defaults on its foreign debts the way Russia did in 1998, and scores of other countries have done as well.
If the troika won’t settle for a negotiated debt-relief program, and the current Greek government won’t default, the Greeks will elect someone else who will, no matter who that someone else happens to be; it’s that, after all, or continue along a course that’s already caused the Greek economy to lose a quarter of its precrisis GDP, and shows no sign of stopping anywhere this side of failed-state status. That this could quite easily hand Greece over to a fascist despot is just one of the potential problems with the troika’s strategy. It’s astonishing that so few people in Europe seem to be able to remember what happened the last time an international political establishment committed itself to the preservation of a failed economic orthodoxy no matter what; those of my readers who don’t know what I’m talking about may want to pick up any good book on the rise of fascism in Europe between the wars.
Let’s step back from specifics, though, and notice the thinking that underlies the dysfunctional behavior in Washington and Brussels alike. In both cases, the people who think they’re in charge have lost track of the fact that Russia, China, and Greece have needs, concerns, and interests of their own, and aren’t simply dolls that the US or EU can pose at will. These other nations can, perhaps, be bullied by threats over the short term, but that’s a strategy with a short shelf life.  Successful diplomacy depends on giving the other guy reasons to want to cooperate with you, while demanding cooperation at gunpoint guarantees that the other guy is going to look for ways to shoot back.
The same sort of thinking in a different context underlies the brutal stupidity of American drone attacks in the Middle East. Some wag in the media pointed out a while back that the US went to war against an enemy 5,000 strong, we’ve killed 10,000 of them, and now there are only 20,000 left. That’s a good summary of the situation; the US drone campaign has been a total failure by every objective measure, having worked out consistently to the benefit of the Muslim extremist groups against which it’s aimed, and yet nobody in official Washington seems capable of noticing this fact.
It’s hard to miss the conclusion, in fact, that the Obama administration thinks that in pursuing its drone-strike program, it’s playing some kind of video game, which the United States can win if it can rack up enough points. Notice the way that every report that a drone has taken out some al-Qaeda leader gets hailed in the media: hey, we nailed a commander, doesn’t that boost our score by five hundred? In the real world, meanwhile, the indiscriminate slaughter of civilians by US drone strikes has become a core factor convincing Muslims around the world that the United States is just as evil as the jihadis claim, and thus sending young men by the thousands to join the jihadi ranks. Has anyone in the Obama administration caught on to this straightforward arithmetic of failure? Surely you jest.
For that matter, I wonder how many of my readers recall the much-ballyhooed “surge” in Afghanistan several years back.  The “surge” was discussed at great length in the US media before it was enacted on Afghan soil; talking heads of every persuasion babbled learnedly about how many troops would be sent, how long they’d stay, and so on. It apparently never occurred to anybody in the Pentagon or the White House that the Taliban could visit websites and read newspapers, and get a pretty good idea of what the US forces in Afghanistan were about to do. That’s exactly what happened, too; the Taliban simply hunkered down for the duration, and popped back up the moment the extra troops went home.
Both these examples of US military failure are driven by the same problem discussed earlier in the context of diplomacy: an inability to recognize that the other side will reliably respond to US actions in ways that further its own agenda, rather than playing along with the US. More broadly, it’s the same failure of thought that leads so many people to assume that the biosphere is somehow obligated to give us all the resources we want and take all the abuse we choose to dump on it, without ever responding in ways that might inconvenience us.
We can sum up all these forms of acquired stupidity in a single sentence: most people these days seem to have lost the ability to grasp that the other side can learn.
The entire concept of learning has been so poisoned by certain bad habits of contemporary thought that it’s probably necessary to pause here and spell out what I mean. Learning, in particular, isn’t the same thing as rote imitation. If you memorize a set of phrases in a foreign language, for example, that doesn’t mean you’ve learned that language. To learn the language means to grasp the underlying structure, so that you can come up with your own phrases and say whatever you want, not just what you’ve been taught to say.
In the same way, if you memorize a set of disconnected factoids about history, you haven’t learned history. This is something of a loaded topic right now in the US, because recent “reforms” in the American  public school system have replaced learning with rote memorization of disconnected factoids that are then regurgitated for multiple choice tests. This way of handling education penalizes those children who figure out how to learn, since they might well come up with answers that differ from the ones the test expects. That’s one of many ways that US education these days actively discourages learning—but that’s a subject for another post.
To learn is to grasp the underlying structure of a given subject of knowledge, so that the learner can come up with original responses to it. That’s what Russia and China did; they grasped the underlying structure of US diplomacy, figured out that they had nothing to gain by cooperating with that structure, and came up with a creative response, which was to ally against the United States. That’s what Greece is doing, too.  Bit by bit, the Greeks seem to be figuring out the underlying structure of troika policy, which amounts to the systematic looting of southern Europe for the benefit of Germany and a few of its allies, and are trying to come up with a response that doesn’t simply amount to unilateral submission.
That’s also what the jihadis and the Taliban are doing in the face of US military activity. If life hands you lemons, as the saying goes, make lemonade; if the US hands you drone strikes that routinely slaughter noncombatants, you can make very successful propaganda out of it—and if the US hands you a surge, you roll your eyes, hole up in your mountain fastnesses, and wait for the Americans to get bored or distracted, knowing that this won’t take long. That’s how learning works, but that’s something that US planners seem congenitally unable to take into account.
The same analysis, interestingly enough, makes just as much sense when applied to nonhuman nature. As Ervin Laszlo pointed out a long time ago in Introduction to Systems Philosophy, any sufficiently complex system behaves in ways that approximate intelligence.  Consider the way that bacteria respond to antibiotics. Individually, bacteria are as dumb as politicians, but their behavior on the species level shows an eerie similarity to learning; faced with antibiotics, a species of bacteria “tries out” different biochemical approaches until it finds one that sidesteps the antibiotic. In the same way, insects and weeds “try out” different responses to pesticides and herbicides until they find whatever allows them to munch on crops or flourish in the fields no matter how much poison the farmer sprays on them.
We can even apply the same logic to the environmental crisis as a whole. Complex systems tend to seek equilibrium, and will respond to anything that pushes them away from equilibrium by pushing back the other way. Any field biologist can show you plenty of examples: if conditions allow more rabbits to be born in a season, for instance, the population of hawks and foxes rises accordingly, reducing the rabbit surplus to a level the ecosystem can support. As humanity has put increasing pressure on the biosphere, the biosphere has begun to push back with increasing force, in an increasing number of ways; is it too much to think of this as a kind of learning, in which the biosphere “tries out” different ways to balance out the abusive behavior of humanity, until it finds one or more that work?
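For those who find it easier to see feedback of this kind in concrete terms, here is a minimal sketch (mine, not the field biologists’, using the classic Lotka-Volterra predator-prey equations with parameter values chosen purely for illustration) of the rabbits-and-foxes dynamic just described: a surplus of rabbits feeds a rise in foxes, which then trims the surplus back toward a level the system can support.

```python
# A minimal Lotka-Volterra predator-prey sketch, integrated with simple Euler steps.
# All parameter values are illustrative assumptions, not field data.

def simulate(rabbits=40.0, foxes=9.0, steps=2000, dt=0.01):
    a, b, c, d = 1.1, 0.4, 0.1, 0.4   # rabbit births, predation, conversion, fox deaths
    history = []
    for _ in range(steps):
        d_rabbits = (a * rabbits - b * rabbits * foxes) * dt
        d_foxes = (c * b * rabbits * foxes - d * foxes) * dt
        rabbits += d_rabbits
        foxes += d_foxes
        history.append((rabbits, foxes))
    return history

if __name__ == "__main__":
    # Print a coarse sample: the rabbit surplus collapses as foxes multiply,
    # then both populations oscillate around the system's balance point.
    for step, (r, f) in enumerate(simulate()):
        if step % 400 == 0:
            print(f"t={step * 0.01:5.1f}  rabbits={r:6.1f}  foxes={f:5.1f}")
```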
Now of course it’s long been a commonplace of modern thought that natural systems can’t possibly learn. The notion that nature is static, timeless, and unresponsive, a passive stage on which human beings alone play active roles, is welded into modern thought, unshaken even by the realities of biological evolution or the rising tide of evidence that natural systems are in fact quite able to adapt their way around human meddling. There’s a long and complex history to the notion of passive nature, but that’s a subject for another day; what interests me just now is that since 1990 or so, the governing classes of the United States, and some other Western nations as well, have applied the same frankly delusional logic to everything in the world other than themselves.
“We’re an empire now, and when we act, we create our own reality,” neoconservative guru Karl Rove is credited as saying to reporter Ron Suskind. “We’re history’s actors, and you, all of you, will be left to just study what we do.” That seems to be the thinking that governs the US government these days, on both sides of the supposed partisan divide. Obama says we’re in a recovery, and if the economy fails to act accordingly, why, rooms full of industrious flacks churn out elaborately fudged statistics to erase that unwelcome reality. That history’s self-proclaimed actors might turn out to be just one more set of flotsam awash on history’s vast tides has never entered their darkest dream.
Let’s step back from specifics again, though. What’s the source of this bizarre paralogic—the delusion that leads politicians to think that they create reality, and that everyone and everything else can only fill the roles they’ve been assigned by history’s actors?  I think I know. I think it comes from a simple but remarkably powerful fact, which is that the people in question, along with most people in the privileged classes of the industrial world, spend most of their time, from childhood on, dealing with machines.
We can define a machine as a subset of the universe that’s been deprived of the capacity to learn. The whole point of building a machine is that it does what you want, when you want it, and nothing else. Flip the switch on, and it turns on and goes through whatever rigidly defined set of behaviors it’s been designed to do; flip the switch off, and it stops. It may be fitted with controls, so you can manipulate its behavior in various tightly limited ways; nowadays, especially when computer technology is involved, the set of behaviors assigned to it may be complex enough that an outside observer may be fooled into thinking that there’s learning going on. There’s no inner life behind the facade, though.  It can’t learn, and to the extent that it pretends to learn, what happens is the product of the sort of rote memorization described above as the antithesis of learning.
A machine that learned would be capable of making its own decisions and coming up with a creative response to your actions—and that’s the opposite of what machines are meant to do, because that response might well involve frustrating your intentions so the machine can get what it wants instead. That’s why the trope of machines going to war against human beings has so large a presence in popular culture: it’s exactly because we expect machines not to act like people, not to pursue their own needs and interests, that the thought of machines acting the way we do gets so reliable a frisson of horror.
The habit of thought that treats the rest of the cosmos as a collection of machines, existing only to fulfill whatever purpose they might be assigned by their operators, is another matter entirely. Its origins can be traced back to the dawning of the scientific revolution in the seventeenth century, when a handful of thinkers first began to suggest that the universe might not be a vast organism—as everybody in the western world had presupposed for millennia before then—but might instead be a vast machine. It’s indicative that one immediate and popular response to this idea was to insist that other living things were simply “meat machines” who didn’t actually suffer pain under the vivisector’s knife, but had been designed by God to imitate sounds of pain in order to inspire feelings of pity in human beings.
The delusion of control—the conviction, apparently immune to correction by mere facts, that the world is a machine incapable of doing anything but the things we want it to do—pervades contemporary life in the world’s industrial societies. People in those societies spend so much of their time dealing with machines, and so little interacting with other people and other living things without a machine interface getting in the way, that it’s no wonder this delusion is so widespread. As long as it retains its grip, though, we can expect the industrial world, and especially its privileged classes, to stumble onward from one preventable disaster to another. That’s the inner secret of the delusion of control, after all: those who insist on seeing the world in mechanical terms end up behaving mechanically themselves. Those who deny all other things the ability to learn lose the ability to learn from their own mistakes, and lurch robotically onward along a trajectory that leads straight to the scrapheap of the future.

An Affirming Flame

Wed, 2015-06-17 16:57
According to an assortment of recent news stories, this Thursday, June 18, is the make-or-break date by which a compromise has to be reached between Greece and the EU if a Greek default, with the ensuing risk of a potential Greek exit from the Eurozone, is to be avoided. If that’s more than just media hype, there’s a tremendous historical irony in the fact.  June 18 is, after all, the 200th anniversary of the Battle of Waterloo, where a previous attempt at European political and economic integration came to grief.
Now of course there are plenty of differences between the two events. In 1815 the preferred instrument of integration was raw military force; in 2015, for a variety of reasons, a variety of less overt forms of political and economic pressure have taken the place of Napoleon’s Grande Armée. The events of 1815 were also much further along the curve of defeat than those of 2015.  Waterloo was the end of the road for France’s dream of pan-European empire, while the current struggles over the Greek debt are taking place at a noticeably earlier milepost along the same road. The faceless EU bureaucrats who are filling Napoleon’s role this time around thus won’t be on their way to Elba for some time yet.
“What discords will drive Europe into that artificial unity—only dry or drying sticks can be tied into a bundle—which is the decadence of every civilization?” William Butler Yeats wrote that in 1936. It was a poignant question but also a highly relevant one, since the discords in question were moving rapidly toward explosion as he penned the last pages of A Vision, where those words appear.  Like most of those who see history in cyclical terms, Yeats recognized that the patterns that recur from age to age  are trends and motifs rather than exact narratives.  The part played by a conqueror in one era can end up in the hands of a heroic failure in the next, for circumstances can define a historical role but not the irreducibly human strengths and foibles of the person who happens to fill it.
Thus it’s not too hard to look at the rising spiral of stresses in the European Union just now and foresee the eventual descent of the continent into a mix of domestic insurgency and authoritarian nationalism, with the oncoming tide of mass migration from Africa and the Middle East adding further pressure to an already explosive mix. Exactly how that will play out over the next century, though, is a very tough question to answer. A century from now, due to raw demography, many countries in Europe will be majority-Muslim nations that look to Mecca for the roots of their faith and culture—but which ones, and how brutal or otherwise will the transition be? That’s impossible to know in advance.
There are plenty of similar examples just now; for the student of historical cycles, 2015 practically defines the phrase “target-rich environment.” Still, I want to focus on something a little different here. Partly, this is because the example I have in mind makes a good opportunity to point out the way that what philosophers call the contingent nature of events—in less highflown language, the sheer cussedness of things—keeps history’s dice constantly rolling. Partly, though, it’s because this particular example is likely to have a substantial impact on the future of everyone reading this blog.
Last year saw a great deal of talk in the media about possible parallels between the current international situation and that of the world precisely a century ago, in the weeks leading up to the outbreak of the First World War.  Mind you, since I contributed to that discussion, I’m hardly in a position to reject the parallels out of hand. Still, the more I’ve observed the current situation, the more I’ve come to think that a different date makes a considerably better match to present conditions. To be precise, instead of a replay of 1914, I think we’re about to see an equivalent of 1939—but not quite the 1939 we know.
Two entirely contingent factors, added to all the other pressures driving toward that conflict, made the Second World War what it was. The first, of course, was the personality of Adolf Hitler. It was probably a safe bet that somebody in Weimar Germany would figure out how to build a bridge between the politically active but fragmented nationalist Right and the massive but politically inert German middle classes, restore Germany to great-power status, and gear up for a second attempt to elbow aside the British Empire. That the man who happened to do these things was an eccentric anti-Semitic ideologue who combined shrewd political instincts, utter military incompetence, and a frankly psychotic faith in his own supposed infallibility, though, was in no way required by the logic of history.
Had Corporal Hitler taken an extra lungful of gas on the Western Front, someone else would likely have filled the same role in the politics of the time. We don’t even have to consider what might have happened if the nation that birthed Frederick the Great and Otto von Bismarck had come up with a third statesman of the same caliber. If the German head of state in 1939 had been merely a capable pragmatist with adequate government and military experience, and guided Germany’s actions by a logic less topsy-turvy than Hitler’s, the trajectory of those years would have been far different.
The second contingent factor that defined the outcome of the great wars of the twentieth century is broader in focus than the quirks of a single personality, but it was just as subject to those vagaries that make hash out of attempts at precise historical prediction. As discussed in an earlier post on this blog, it was by no means certain that America would be Britain’s ally when war finally came. From the Revolution onward, Britain was in many Americans’ eyes the national enemy; as late as the 1930s, when the US Army held its summer exercises, the standard scenario involved a British invasion of US territory.
All along, there was an Anglophile party in American cultural life, and its ascendancy in the years after 1900 played a major role in bringing the United States into two world wars on Britain’s side. Still, there was a considerably more important factor in play, which was a systematic British policy of conciliating the United States. From the American Civil War on, Britain allowed the United States liberties it would never have given any other power. When the United States expanded its influence in Latin America and the Caribbean, Britain allowed itself to be upstaged there; when the United States shook off its isolationism and built a massive blue-water navy, the British even allowed US naval vessels to refuel at British coaling stations during the global voyage of the “Great White Fleet” in 1907-9.
This was partly a reflection of the common cultural heritage that made many British politicians think of the United States as a sort of boisterous younger brother of theirs, and partly a cold-eyed recognition, in the wake of the Civil War, that war between Britain and the United States would almost certainly lead to a US invasion of Canada that Britain was very poorly positioned to counter. Still, there was another issue of major importance. To an extent few people realized at the time, the architecture of European peace after Waterloo depended on political arrangements that kept the German-speaking lands of the European core splintered into a diffuse cloud of statelets too small to threaten any of the major powers.
The great geopolitical fact of the 1860s was the collapse of that cloud into the nation of Germany, under the leadership of the dour northeastern kingdom of Prussia. In 1866, the Prussians pounded the stuffing out of Austria and brought the north German states into a federation; in 1870-1871, the Prussians and their allies did the same thing to France, which was a considerably tougher proposition—this was the same French nation, remember, which brought Europe to its knees in Napoleon’s day—and the federation became the German Empire. Austria was widely considered the third great power in Europe until 1866; until 1870, France was the second; everybody knew that sooner or later the Germans were going to take on great power number one.
British policy toward the United States from 1871 onward was thus tempered by the harsh awareness that Britain could not afford to alienate a rising power who might become an ally, or at least a friendly neutral, when the inevitable war with Germany arrived. Above all, an alliance between Germany and the United States would have been Britain’s death warrant, and everyone in the Foreign Office and the Admiralty in London had to know that. The thought of German submarines operating out of US ports, German and American fleets combining to take on the Royal Navy, and American armies surging into Canada and depriving Britain of a critical source of raw materials and recruits while the British Army was pinned down elsewhere, must have given British planners many sleepless nights.
After 1918, that recognition must have been even more sharply pointed, because US loans and munitions shipments played a massive role in saving the western Allies from collapse in the face of the final German offensive in the spring of 1918, and turned the tide in a war that, until then, had largely gone Germany’s way. During the two decades leading up to 1939, as Germany recovered and rearmed, British governments did everything they could to keep the United States on their side, with results that paid off handsomely when the Second World War finally came.
Let’s imagine, though, an alternative timeline in which the Foreign Office and the Admiralty from 1918 on are staffed by idiots. Let’s further imagine that Parliament is packed with clueless ideologues whose sole conception of foreign policy is that everyone, everywhere, ought to be bludgeoned into compliance with Britain’s edicts, no matter how moronic those happen to be. Let’s say, in particular, that one British government after another conducts its policy toward the United States on the basis of smug self-centered arrogance, and any move the US makes to assert itself on the international stage can count on an angry response from London. The United States launches an aircraft carrier? A threat to world peace, the London Times roars.  The United States exerts diplomatic pressure on Mexico, and builds military bases in Panama? British diplomats head for the Caribbean and Latin America to stir up as much opposition to America’s agenda as possible.
Let’s say, furthermore, that in this alternative timeline, Adolf Hitler did indeed take one too many deep breaths on the Western Front, and lies in a military cemetery, one more forgotten casualty of the Great War. In his absence, the German Workers Party remains a fringe group, and the alliance between the nationalist Right and the middle classes is built instead by the Deutsche Volksfreiheitspartei (DVFP), which seizes power in 1934. Ulrich von Hassenstein, the new Chancellor, is a competent insider who knows how to listen to his diplomats and General Staff, and German foreign and military policy under his leadership pursues the goal of restoring Germany to world-power status using considerably less erratic means than those used by von Hassenstein’s equivalent in our timeline.
Come 1939, finally, as rising tensions between Germany and the Anglo-French alliance over Poland’s status move toward war, Chancellor von Hassenstein welcomes US President Charles Lindbergh to Berlin, where the two heads of state sign a galaxy of treaties and trade agreements and talk earnestly to the media about the need to establish a multipolar world order to replace Britain’s global hegemony. A second world war is in the offing, but the shape of that war will be very different from the one that broke out in our version of 1939, and while the United States almost certainly will be among the victors, Britain almost certainly will not.
Does all this sound absurd? Let’s change the names around and see.
Just as the great rivalry of the first half of the twentieth century was fought out between Britain and Germany, the great rivalry of the century’s second half was between the United States and Russia. If nuclear weapons hadn’t been invented, it’s probably a safe bet that at some point the rivalry would have ended in another global war.  As it was, the threat of mutual assured destruction meant that the struggle for global power had to be fought out less directly, in a flurry of proxy wars, sponsored insurgencies, economic warfare, subversion, sabotage, and bare-knuckle diplomacy. In that war, the United States came out on top, and Soviet Russia went the way of Imperial Germany, plunging into the same sort of political and economic chaos that beset the Weimar Republic in its day.
The supreme strategic imperative of the United States in that war was finding ways to drive as deep a wedge as possible between Russia and China, in order to keep them from taking concerted action against the US. That wasn’t all that difficult a task, since the two nations have very little in common and many conflicting interests. Nixon’s 1972 trip to China was arguably the defining moment in the Cold War, the point at which China’s separation from the Soviet bloc became total and Chinese integration with the American economic order began. From that point on, for Russia, it was basically all downhill.
In the aftermath of Russia’s defeat, the same strategic imperative remained, but the conditions of the post-Cold War world made it almost absurdly easy to carry out. All that would have been needed were American policies that gave Russia and China meaningful, concrete reasons to think that their national interests and aspirations would be easier to achieve in cooperation with a US-led global order than in opposition to it. Granting Russia and China the same position of regional influence that the US accords to Germany and Japan as a matter of course probably would have been enough. A little forbearance, a little foreign aid, a little adroit diplomacy, and the United States would have been in the catbird’s seat, with Russia and China glaring suspiciously at each other across their long and problematic mutual border, and bidding against each other for US support in their various disagreements.
But that’s not what happened, of course.
What happened instead was that the US embraced a foreign policy so astonishingly stupid that I’m honestly not sure the English language has adequate resources to describe it. Since 1990, one US administration after another, with the enthusiastic bipartisan support of Congress and the capable assistance of bureaucrats across official Washington from the Pentagon and the State Department on down, has pursued policies guaranteed to force Russia and China to set aside their serious mutual differences and make common cause against us. Every time the US faced a choice between competing policies, it’s consistently chosen the option most likely to convince Russia, China, or both nations at once that they had nothing to gain from further cooperation with American agendas.
What’s more, the US has more recently managed the really quite impressive feat of bringing Iran into rapprochement with the emerging Russo-Chinese alliance. It’s hard to think of another nation on Earth that has fewer grounds for constructive engagement with Russia or China than the Islamic Republic of Iran, but several decades of cluelessly hamfisted American blundering and bullying finally did the job. My American readers can now take pride in the state-of-the-art Russian air defense systems around Tehran, the bustling highways carrying Russian and Iranian products to each other’s markets, and the Russian and Chinese intelligence officers who are doubtless settling into comfortable digs on the north shore of the Persian Gulf, where they can snoop on the daisy chain of US bases along the south shore. After all, a quarter century of US foreign policy made those things happen.
It’s one thing to engage in this kind of serene disregard for reality when you’ve got the political unity, the economic abundance, and the military superiority to back it up. The United States today, like the British Empire in 1939, no longer has those. We’ve got an impressive fleet of aircraft carriers, sure, but Britain had an equally impressive fleet of battleships in 1939, and you’ll notice how much good those did them. Like Britain in 1939, the United States today is perfectly prepared for a kind of war that nobody fights any more, while rival nations less constrained by the psychology of previous investment and less riddled with institutionalized graft are fielding novel weapons systems designed to do end runs around our strengths and focus with surgical precision on our weaknesses.
Meanwhile, inside the baroque carapace of carriers, drones, and all the other high-tech claptrap of an obsolete way of war, the United States is a society in freefall, far worse off than Britain was during its comparatively mild 1930s downturn. Its leaders have forfeited the respect of a growing majority of its citizens; its economy has morphed into a Potemkin-village capitalism in which the manipulation of unpayable IOUs in absurd and rising amounts has all but replaced the actual production of goods and services; its infrastructure is so far fallen into decay that many US counties no longer pave their roads; most Americans these days think of their country’s political institutions as the enemy and its loudly proclaimed ideals as some kind of sick joke—and in both cases, not without reason. The national unity that made victory in two world wars and the Cold War possible went by the boards a long time ago, drowned in a tub by Tea Party conservatives who thought they were getting rid of government and limousine liberals who were going through the motions of sticking it to the Man.
I could go on tracing parallels for some time—in particular, despite a common rhetorical trope of US Russophobes, Vladimir Putin is not an Adolf Hitler but a fair equivalent of the Ulrich von Hassenstein of my alternate-history narrative—but here again, my readers can do the math themselves. The point I want to make is that all the signs suggest we are entering an era of international conflict in which the United States has thrown away nearly all its potential strengths, and handed its enemies advantages they would never have had if our leaders had the brains the gods gave geese. Since nuclear weapons still foreclose the option of major wars between the great powers, the conflict in question will doubtless be fought using the same indirect methods as the Cold War; in fact, it’s already being fought by those means, as the victims of proxy wars in Ukraine, Syria, and Yemen already know. The question in my mind is simply how soon those same methods get applied on American soil.
We thus stand at the beginning of a long, brutal epoch, as unforgiving as the one that dawned in 1939. Those who pin Utopian hopes on the end of American hegemony will get to add disappointment to that already bitter mix, since hegemony remains the same no matter who happens to be perched temporarily in the saddle. (I also wonder how many of the people who think they’ll rejoice at the end of American hegemony have thought through the impact on their hopes of collective betterment, not to mention their own lifestyles, once the 5% of the world’s population who live in the US can no longer claim a quarter or so of the world’s resources and wealth.) If there’s any hope possible at such a time, to my mind, it’s the one W.H. Auden proposed as the conclusion of his bleak and brilliant poem “September 1, 1939”:
Defenceless under the night,
Our world in stupor lies;
Yet, dotted everywhere,
Ironic points of light
Flash out wherever the just
Exchange their messages:
May I, composed like them
Of Eros and of dust,
Beleaguered by the same
Negation and despair,
Show an affirming flame.

The Era of Dissolution

Wed, 2015-06-10 20:06
The last of the five phases of the collapse process we’ve been discussing here in recent posts is the era of dissolution. (For those who haven’t been keeping track, the first four are the eras of pretense, impact, response, and breakdown.) I suppose you could call the era of dissolution the Rodney Dangerfield of collapse, though it’s not so much that it gets no respect; it generally doesn’t even get discussed.
To some extent, of course, that’s because a great many of the people who talk about collapse don’t actually believe that it’s going to happen. That lack of belief stands out most clearly in the rhetorical roles assigned to collapse in so much of modern thinking. People who actually believe that a disaster is imminent generally put a lot of time and effort into getting out of its way in one way or another; it’s those who treat it as a scarecrow to elicit predictable emotional reactions from other people, or from themselves, who never quite manage to walk their talk.
Interestingly, the factor that determines the target of scarecrow-tactics of this sort seems to be political in nature. Groups that think they have a chance of manipulating the public into following their notion of good behavior tend to use the scarecrow of collapse to affect other people; for them, collapse is the horrible fate that’s sure to gobble us up if we don’t do whatever it is they want us to do. Those who’ve given up any hope of getting a response from the public, by contrast, turn the scarecrow around and use it on themselves; for them, collapse is a combination of Dante’s Inferno and the Big Rock Candy Mountain, the fantasy setting where the wicked get the walloping they deserve while they themselves get whatever goodies they’ve been unsuccessful at getting  in the here and now.
Then, of course, you get the people for whom collapse is less scarecrow than teddy bear, the thing that allows them to drift off comfortably to sleep in the face of an unwelcome future. It’s been my repeated observation that many of those who insist that humanity will become totally extinct in the very near future fall into this category. Most people, faced with a serious threat to their own lives, will take drastic action to save themselves; faced with a threat to the survival of their family or community, a good many people will take actions so drastic as to put their own lives at risk in an effort to save others they care about. The fact that so many people who insist that the human race is doomed go on to claim that the proper reaction is to sit around feeling very, very sad about it all does not inspire confidence in the seriousness of that prediction—especially when feeling very, very sad seems mostly to function as an excuse to keep enjoying privileged lifestyles for just a little bit longer.
So we have the people for whom collapse is a means of claiming unearned power, the people for whom it’s a blank screen on which to project an assortment of self-regarding fantasies, and the people for whom it’s an excuse to do nothing in the face of a challenging future. All three of those are popular gimmicks with an extremely long track record, and they’ll doubtless all see plenty of use millennia after industrial civilization has taken its place in the list of failed civilizations. The only problem with them is that they don’t happen to provide any useful guidance for those of us who have noticed that collapse isn’t merely a rhetorical gimmick meant to get emotional reactions—that it’s something that actually happens, to actual civilizations, and that it’s already happening to ours.
From the three perspectives already discussed, after all, realistic questions about what will come after the rubble stops bouncing are entirely irrelevant. If you’re trying to use collapse as a boogeyman to scare other people into doing what you tell them, your best option is to combine a vague sense of dread with an assortment of cherrypicked factoids intended to make a worst-case scenario look not nearly bad enough; if you’re trying to use collapse as a source of revenge fantasies where you get what you want and the people you don’t like get what’s coming to them, daydreams of various levels and modes of dampness are far more useful to you than sober assessments; while if you’re trying to use collapse as an excuse to maintain an unsustainable and planet-damaging SUV lifestyle, your best bet is to insist that everyone and everything dies all at once, so nothing will ever matter again to anybody.
On the other hand, there are also those who recognize that collapse happens, that we’re heading toward one, and that it might be useful to talk about what the world might look like on the far side of that long and difficult process. I’ve tried to sketch out a portrait of the postcollapse world in last year’s series of posts here on Dark Age America, and I haven’t yet seen any reason to abandon that portrait of a harsh but livable future, in which a sharply reduced global population returns to agrarian or nomadic lives in those areas of the planet not poisoned by nuclear or chemical wastes or rendered uninhabitable by prolonged drought or the other impacts of climate change, and in which much or most of today’s scientific and technological knowledge is irretrievably lost.
The five phases of collapse discussed in this latest sequence of posts simply describe how we get there—or, more precisely, one of the steps by which we get there. That latter point’s a detail that a good many of my readers, and an even larger fraction of my critics, seem to have misplaced. The five-stage model is a map of how human societies shake off an unsustainable version of business as usual and replace it with something better suited to the realities of the time. It applies to a very wide range of social transformations, reaching in scale from the local to the global and in intensity from the relatively modest to the cataclysmic. To insist that it’s irrelevant because the current example of the species covers more geographical area than any previous example, or has further to fall than most, is like insisting that a law of physics that governs the behavior of marbles and billiards must somehow stop working just because you’re trying to do the same thing with bowling balls.
A difference of scale is not a difference of kind. Differences of scale have their own implications, which we’ll discuss a little later on in this post, but the complex curve of decline is recognizably the same in small things as in big ones, in the most as in the least catastrophic examples. That’s why I’ve used a relatively modest example—the collapse of the economic system of 1920s America and the resulting Great Depression—and an example from the midrange—the collapse of the French monarchy and the descent of 18th-century Europe into the maelstrom of the Napoleonic Wars—to provide convenient outlines for something toward the upper end of the scale—the decline and fall of modern industrial civilization and the coming of a deindustrial dark age. Let’s return to those examples, and see how the thread of collapse winds to its end.
As we saw in last week’s thrilling episode, the breakdown stage of the Great Depression came when the newly inaugurated Roosevelt administration completely redefined the US currency system. Up to that time, US dollar bills were in effect receipts for gold held in banks; after that time, those receipts could no longer be exchanged for gold, and the gold held by the US government became little more than a public relations gimmick. That action succeeded in stopping the ghastly credit crunch that shuttered every bank and most businesses in the US in the spring of 1933.
Roosevelt’s policies didn’t get rid of the broader economic dysfunction the 1929 crash had kickstarted. That was inherent in the industrial system itself, and remains a massive issue today, though its effects were papered over for a while by a series of temporary military, political, and economic factors that briefly enabled the United States to prosper at the expense of the rest of the world. The basic issue is simply that replacing human labor with machines powered by fossil fuel results in unemployment, and no law of nature or economics requires that new jobs be found or created to replace the ones that are eliminated by mechanization. The history of the industrial age has been powerfully shaped by a whole series of attempts to ignore, evade, or paper over that relentless arithmetic.
Until 1940, the Roosevelt administration had no more luck with that project than the governments of most other nations. It wasn’t until the Second World War made the lesson inescapable that anyone realized that the only way to provide full employment in an industrial society was to produce far more goods than consumers could consume, and let the military and a variety of other gimmicks take up the slack. That was a temporary expedient, due to stark limitations in the resource base needed to support the mass production of useless goods, but in 1940, and even more so in 1950, few people recognized that and fewer cared. It’s our bad luck to be living at the time when that particular bill is coming due.
The first lesson to learn from the history of collapse, then, is that the breakdown phase doesn’t necessarily solve all the problems that brought it about. It doesn’t even necessarily take away every dysfunctional feature of the status quo. What it does with fair reliability is eliminate enough of the existing order of things that the problems being caused by that order decline to a manageable level. The more deeply rooted the problematic features of the status quo are in the structure of society and daily life, the harder it will be to change them, and the more likely other features are to be changed: in the example just given, it was much easier to break the effective link between the US currency and gold, and expand the money supply enough to get the economy out of cardiac arrest, than it was to break a link between mechanization and unemployment that’s hardwired into the basic logic of industrialism.
What this implies in turn is that it’s entirely possible for one collapse to cycle through the five stages we’ve explored, and then to have the era of dissolution morph straight into a new era of pretense in which the fact that all society’s problems haven’t been solved is one of the central things nobody in any relation to the centers of power wants to discuss. If the Second World War, the massive expansion of the petroleum economy, the invention of suburbia, the Cold War, and a flurry of other events hadn’t ushered in the immensely wasteful but temporarily prosperous boomtime of late 20th century America, there might well have been another vast speculative bubble in the mid- to late 1940s, resulting in another crash, another depression, and so on. This is after all what we’ve seen over the last twenty years: the tech stock bubble and bust, the housing bubble and bust, the fracking bubble and bust, each one hammering the economy further down the slope of decline.
With that in mind, let’s turn to our second example, the French Revolution. This is particularly fascinating since the aftermath of that particular era of breakdown saw a nominal return to the conditions of the era of pretense. After Napoleon’s final defeat in 1815, the Allied powers found an heir of the Bourbon line and plopped him onto the French throne as Louis XVIII to well-coached shouts of “Vive le Roi!” On paper, nothing had changed.
In reality, everything had changed, and the monarchy of post-Napoleonic France had roots about as deep and sturdy as the democracy of post-Saddam Iraq. Louis XVIII was clever enough to recognize this, and so managed to end his reign in the traditional fashion, feet first from natural causes. His heir Charles X was nothing like so clever, and got chucked off the throne after six years on it by another revolution in 1830. King Louis-Philippe went the same way in 1848—the French people were getting very good at revolution by that point. There followed a Republic, an Empire headed by Napoleon’s nephew, and finally another Republic which lasted out the century. All in all, French politics in the 19th century was the sort of thing you’d expect to see in an unusually excitable banana republic.
The lesson to learn from this example is that it’s very easy, and very common, for a society in the dissolution phase of collapse to insist that nothing has changed and pretend to turn back the clock. Depending on just how traumatic the collapse has been, everybody involved may play along with the charade, the way everyone in Rome nodded and smiled when Augustus Caesar pretended to uphold the legal forms of the defunct Roman Republic, and their descendants did exactly the same thing centuries later when Theodoric the Ostrogoth pretended to uphold the legal forms of the defunct Roman Empire. Those who recognize the charade as charade and play along without losing track of the realities, like Louis XVIII, can quite often navigate such times successfully; those who mistake charade for reality, like Charles X, are cruising for a bruising and normally get it in short order.
Combine these two lessons and you’ll get what I suspect will turn out to be a tolerably good sketch of the American future. Whatever comes out of the impact, response, and breakdown phases of the crisis looming ahead of the United States just now, whether it’s a fragmentary mess of successor states, a redefined nation beginning to recover from a period of personal rule by some successful demagogue, or, just possibly, a battered and weary republic facing a long trudge back to its foundational principles, it seems very likely that everyone involved will do their level best to insist that nothing has really changed. If the current constitution has been abolished, it may be officially reinstated with much fanfare; there may be new elections, and some shuffling semblance of the two-party system may well come lurching out of the crypts for one or two more turns on the stage.
None of that will matter. The nation will have changed decisively in ways we can only begin to envision at this point, and the forms of twentieth-century American politics will cover a reality that has undergone drastic transformations, just as the forms of nineteenth-century French monarchy did. In due time, by some combination of legal and extralegal means, the forms will be changed to reflect the new realities, and the territory we now call the United States of America—which will almost certainly have a different name, and may well be divided into several different and competing nations by then—will be as prepared to face the next round of turmoil as it’s going to get.
Yes, there will be a next round of turmoil. That’s the thing that most people miss when thinking about the decline and fall of a civilization: it’s not a single event, or even a single linear process. It’s a whole series of cascading events that vary drastically in their importance, geographical scope, and body count. That’s true of every process of historic change.
It was true even of so simple an event as the 1929 crash and Great Depression: 1929 saw the crash, 1930 the suckers’ rally, 1931 the first wave of European bank failures, 1932 the unraveling of the US banking system, and so on until bombs falling on Pearl Harbor ushered in a different era. It was even more true of the French Revolution: between 1789 and 1815 France basically didn’t have a single year without dramatic events and drastic changes of one kind or another, and the echoes of the Revolution kept things stirred up for decades to come. Check out the fall of civilizations and you’ll see the same thing unfolding on a truly vast scale, with crisis after crisis along an arc centuries in length.
The process that’s going on around us is the decline and fall of industrial civilization. Everything we think of as normal and natural, modern and progressive, solid and inescapable is going to melt away into nothingness in the years, decades, and centuries ahead, to be replaced first by the very different but predictable institutions of a dark age, and then by the new and wholly unfamiliar forms of the successor societies of the far future. There’s nothing inevitable about the way we do things in today’s industrial world; our political arrangements, our economic practices, our social institutions, our cultural habits, our sciences and our technologies all unfold from industrial civilization’s distinctive and profoundly idiosyncratic worldview. So does the central flaw in the entire baroque edifice, our lethally muddleheaded inability to understand our inescapable dependence on the biosphere that supports our lives. All that is going away in the time before us—but it won’t go away suddenly, or all at once.
Here in the United States, we’re facing one of the larger downward jolts in that prolonged process, the end of American global empire and of the robust economic benefits that the machinery of empire pumps from the periphery to the imperial center. Until recently, the five per cent of us who lived here got to enjoy a quarter of the world’s energy supply and raw materials and a third of its manufactured products. Those figures have already decreased noticeably, with consequences that are ringing through every corner of our society; in the years to come they’re going to decrease much further still, most likely to something like a five per cent share of the world’s wealth or even a little less. That’s going to impact every aspect of our lives in ways that very few Americans have even begun to think about.
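For anyone who wants that arithmetic spelled out, here is a back-of-the-envelope version using the rough shares just cited rather than precise statistics; the figures are illustrative, and the point is the size of the step down rather than the decimal places:

\[
\frac{25\%\ \text{of world energy and raw materials}}{5\%\ \text{of world population}} \approx 5\times \text{the average per-capita share},
\qquad
\frac{33\%\ \text{of manufactured goods}}{5\%} \approx 6.7\times.
\]

\[
\text{A decline to a roughly } 5\% \text{ share of world wealth} \;\Longrightarrow\; \frac{5\%}{5\%} = 1\times \text{the average,}
\]

that is, something like an 80 to 85 per cent drop in American per-capita consumption from the old imperial peak.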
All of that is taking place in a broader context, to be sure. Other countries will have their own trajectories through the arc of industrial civilization’s decline and fall, and some of those trajectories will be considerably less harsh in the short term than ours. In the long run, the human population of the globe is going to decline sharply; the population bubble that’s causing so many destructive effects just now will be followed in due time by a population bust, in which those four guys on horseback will doubtless play their usual roles. In the long run, furthermore, the vast majority of today’s technologies are going to go away as the resource base needed to support them gets used up, or stops being available due to other bottlenecks. Those are givens—but the long run is not the only scale that matters.
It’s not at all surprising that the foreshocks of that immense change are driving the kind of flight to fantasy criticized in the opening paragraphs of this essay. That’s business as usual when empires go down; pick up a good cultural history of the decline and fall of any empire in the last two millennia or so and you’ll find plenty of colorful prophecies of universal destruction. I’d like to encourage my readers, though, to step back from those fantasies—entertaining as they are—and try to orient themselves instead to the actual shape of the future ahead of us. That shape’s not only a good deal less gaseous than the current offerings of the Apocalypse of the Month Club (internet edition), it also offers an opportunity to do something about the future—a point we’ll be discussing further in posts to come.

The Era of Breakdown

Wed, 2015-06-03 16:49
The fourth of the stages in the sequence of collapse we’ve been discussing is the era of breakdown. (For those who haven’t been keeping track, the first three phases are the eras of pretense, impact, and response; the final phase, which we’ll be discussing next week, is the era of dissolution.) The era of breakdown is the phase that gets most of the press, and thus inevitably no other stage has attracted anything like the crop of misperceptions, misunderstandings, and flat-out hokum that this one has.
The era of breakdown is the point along the curve of collapse at which business as usual finally comes to an end. That’s where the confusion comes in. It’s one of the central articles of faith in pretty much every human society that business as usual functions as a bulwark against chaos, a defense against whatever problems the society might face. That’s exactly where the difficulty slips in, because in pretty much every human society, what counts as business as usual—the established institutions and familiar activities on which everyone relies day by day—is the most important cause of the problems the society faces, and the primary cause of collapse is thus quite simply that societies inevitably attempt to solve their problems by doing all the things that make their problems worse.
The phase of breakdown is the point at which this exercise in futility finally grinds to a halt. The three previous phases are all attempts to avoid breakdown: in the phase of pretense, by making believe that the problems don’t exist; in the phase of impact, by making believe that the problems will go away if only everyone doubles down on whatever’s causing them; and in the phase of response, by making believe that changing something other than the things that are causing the problems will fix the problems. Finally, after everything else has been tried, the institutions and activities that define business as usual either fall apart or are forcibly torn down, and then—and only then—it becomes possible for a society to do something about its problems.
It’s important not to mistake the possibility of constructive action for the inevitability of a solution. The collapse of business as usual in the breakdown phase doesn’t solve a society’s problems; it doesn’t even prevent those problems from being made worse by bad choices. It merely removes the primary obstacle to a solution, which is the wholly fictitious aura of inevitability that surrounds the core institutions and activities that are responsible for the problems. Once people in a society realize that no law of God or nature requires them to maintain a failed status quo, they can then choose to dismantle whatever fragments of business as usual haven’t yet fallen down of their own weight.
That’s a more important action than it might seem at first glance. It doesn’t just put an end to the principal cause of the society’s problems. It also frees up resources that have been locked up in the struggle to keep business as usual going at all costs, and those newly freed resources very often make it possible for a society in crisis to transform itself drastically in a remarkably short period of time. Whether those transformations are for good or ill, or as usually happens, a mixture of the two, is another matter, and one I’ll address a little further on.
Stories in the media, some recent, some recently reprinted, happen to have brought up a couple of first-rate examples of the way that resources get locked up in unproductive activities during the twilight years of a failing society. A California newspaper, for example, recently mentioned that Elon Musk’s large and much-ballyhooed fortune is almost entirely a product of government subsidies. Musk is a smart guy; he obviously realized a good long time ago that federal and state subsidies for technology were where the money was at, and he’s constructed an industrial empire funded by US taxpayers to the tune of many billions of dollars. None of his publicly traded firms has ever made a profit, and as long as the subsidies keep flowing, none of them ever has to; between an overflowing feed trough of government largesse and the longstanding eagerness of fools to be parted from their money by way of the stock market, he’s pretty much set for life.
This is business as usual in today’s America. An article from 2013 pointed out, along the same lines, that the profits made by the five largest US banks were almost exactly equal to the amount of taxpayer money those same five banks got from the government. Like Elon Musk, the banks in question have figured out where the money is, and have gone after it with their usual verve; the revolving door that allows men in suits to shuttle back and forth between those same banks and the financial end of the US government doesn’t exactly hinder that process. It’s lucrative, it’s legal, and the mere fact that it’s bankrupting the real economy of goods and services in order to further enrich an already glutted minority of kleptocrats is nothing anyone in the citadels of power worries about.
A useful light on a different side of the same process comes from an editorial (in PDF) which claims that something like half of all current scientific papers are unreliable junk. Is this the utterance of an archdruid, or some other wild-eyed critic of science? No, it comes from the editor of Lancet, one of the two or three most reputable medical journals on the planet. The managing editor of The New England Journal of Medicine, which has a comparable ranking to Lancet, expressed much the same opinion of the shoddy experimental design, dubious analysis, and blatant conflicts of interest that pervade contemporary scientific research.
Notice that what’s happening here affects the flow of information in the same way that misplaced government subsidies affect the flow of investment. The functioning of the scientific process, like that of the market, depends on the presupposition that everyone who takes part abides by certain rules. When those rules are flouted, individual actors profit, but they do so at the expense of the whole system: the results of scientific research are distorted so that (for example) pharmaceutical firms can profit from drugs that don’t actually have the benefits claimed for them, just as the behavior of the market is distorted so that (for example) banks that would otherwise struggle for survival, and would certainly not be able to pay their CEOs gargantuan bonuses, can continue on their merry way.
The costs imposed by these actions are real, and they fall on all other participants in science and the economy respectively. Scientists these days, especially but not only in such blatantly corrupt fields as pharmaceutical research, face a lose-lose choice between basing their own investigations on invalid studies, on the one hand, or having to distrust any experimental results they don’t replicate themselves, on the other. Meanwhile the consumers of the products of scientific research—yes, that would be all of us—have to contend with the fact that we have no way of knowing whether any given claim about the result of research is the product of valid science or not. Similarly, the federal subsidies that direct investment toward politically savvy entrepreneurs like Elon Musk, and politically well-connected banks such as Goldman Sachs, and away from less parasitic and more productive options distort the entire economic system by preventing the normal workings of the market from weeding out nonviable projects and firms, and rewarding the more viable ones.
Turn to the  historical examples we’ve been following for the last three weeks, and distortions of the same kind are impossible to miss. In the US economy before and during the stock market crash of 1929 and its long and brutal aftermath, a legal and financial system dominated by a handful of very rich men saw to it that the bulk of the nation’s wealth flowed uphill, out of productive economic activities and into speculative ventures increasingly detached from the productive economy. When the markets imploded, in turn, the same people did their level best to see to it that their lifestyles weren’t affected even though everyone else’s was. The resulting collapse in consumer expenditures played a huge role in driving the cascading collapse of the US economy that, by the spring of 1933, had shuttered every consumer bank in the nation and driven joblessness and impoverishment to record highs.
That’s what Franklin Roosevelt fixed. It’s always amused me that the people who criticize FDR—and of course there’s plenty to criticize in a figure who, aside from his far greater success as a wartime head of state, can best be characterized as America’s answer to Mussolini—always talk about the very mixed record of the economic policies of his second term. They rarely bother to mention the Hundred Days, in which FDR stopped a massive credit collapse in its tracks. The Hundred Days and their aftermath are the part of FDR’s presidency that mattered most; it was in that brief period that he slapped shock paddles on an economy in cardiac arrest and got a pulse going, by violating most of the rules that had guided the economy up to that time. That casual attitude toward economic dogma is one of the two things his critics have never been able to forgive; the other is that it worked.
In the same way, France before, during, and immediately after the Revolution was for all practical purposes a medieval state that had somehow staggered its way to the brink of the nineteenth century. The various revolutionary governments that followed one another in quick succession after 1789 made some badly needed changes, but it was left to Napoléon Bonaparte to drag France by the scruff of its collective neck out of the late Middle Ages. Napoléon has plenty of critics—and of course there’s plenty to criticize in a figure who was basically what Mussolini wanted to be when he grew up—but the man’s domestic policies were by and large inspired. To name only two of his most important changes, he replaced the sprawling provinces of medieval France with a system of smaller and geographically meaningful départements, and abolished the entire body of existing French law in favor of a newly created legal system, the Code Napoléon. When he was overthrown, those stayed; in fact, a great many other countries in Europe and elsewhere proceeded to adopt the Code Napoléon in place of their existing legal systems. There were several reasons for this, but one of the most important was that the new Code simply made that much more sense.

Both men were able to accomplish what they did, in turn, because abolishing the political, economic, and cultural distortions imposed on their respective countries by a fossilized status quo freed up all the resources that had been locked up in maintaining those distortions. Slapping a range of legal barriers and taxes on the more egregious forms of speculative excess—another of the major achievements of the Roosevelt era—drove enough wealth back into the productive economy to lay the foundations of America’s postwar boom; in the same way, tipping a galaxy of feudal customs into history’s compost bin transformed France from the economic basket case it was in 1789 to the conqueror of Europe twenty years later, and the successful and innovative economic and cultural powerhouse it became during most of the nineteenth century thereafter.

That’s one of the advantages of revolutionary change. By breaking down existing institutions and the encrusted layers of economic parasitism that inevitably build up around them over time, it reliably breaks loose an abundance of resources that were not available in the prerevolutionary period. Here again, it’s crucial to remember that the availability of resources doesn’t guarantee that they’ll be used wisely; they may be thrown away on absurdities of one kind or another. Nor, even more critically, does it mean that the same abundance of resources will be available indefinitely. The surge of additional resources made available by catabolizing old and corrupt systems is a temporary jackpot, not a permanent state of affairs. That said, when you combine the collapse of fossilized institutions that stand in the way of change, and a sudden rush of previously unavailable resources of various kinds, quite a range of possibilities previously closed to a society suddenly come open.

Applying this same pattern to the crisis of modern industrial civilization, though, requires attention to certain inescapable but highly unwelcome realities. In 1789, the problem faced by France was the need to get rid of a thousand years of fossilized political, economic, and social institutions at a time when the coming of the industrial age had made them hopelessly dysfunctional. In 1929, the problem faced by the United States was the need to pry the dead hand of an equally dysfunctional economic orthodoxy off the throat of the nation so that its economy would actually function again. In both cases, the era of breakdown was catalyzed by a talented despot, and was followed, after an interval of chaos and war, by a period of relative prosperity.

We may well get the despot this time around, too, not to mention the chaos and war, but the period of prosperity is probably quite another matter. The problem we face today, in the United States and more broadly throughout the world’s industrial societies, is that all the institutions of industrial civilization presuppose limitless economic growth, but the conditions that provided the basis for continued economic growth simply aren’t there any more. The 300-year joyride of industrialism was made possible by vast and cheaply extractable reserves of highly concentrated fossil fuels and other natural resources, on the one hand, and a biosphere sufficiently undamaged that it could soak up the wastes of human industry without imposing burdens on the economy, on the other. We no longer have either of those requirements.

With every passing year, more and more of the world’s total economic output has to be diverted from other activities to keep fossil fuels and other resources flowing into the industrial world’s power plants, factories, and fuel tanks; with every passing year, in turn, more and more of the world’s total economic output has to be diverted from other activities to deal with the rising costs of climate change and other ecological disruptions. These are the two jaws of the trap sketched out more than forty years ago in the pages of The Limits to Growth, still the most accurate (and thus inevitably the most savagely denounced) map of the predicament we face. The consequences of that trap can be summed up neatly: on a finite planet, after a certain point—the point of diminishing returns, which we’ve already passed—the costs of growth rise faster than the benefits, and finally force the global economy to its knees.
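To put that summary in compact form (this is my own shorthand, not a formula from The Limits to Growth itself), write the net benefit of a given level of gross economic activity as its benefits minus the two cost streams just described:

\[
N(G) \;=\; B(G) \;-\; C_{\mathrm{res}}(G) \;-\; C_{\mathrm{eco}}(G),
\]

where \(G\) is gross economic activity, \(B\) its benefits, \(C_{\mathrm{res}}\) the cost of keeping resources flowing, and \(C_{\mathrm{eco}}\) the cost of coping with ecological disruption. The point of diminishing returns is the level of activity past which

\[
\frac{dC_{\mathrm{res}}}{dG} + \frac{dC_{\mathrm{eco}}}{dG} \;>\; \frac{dB}{dG},
\]

so that each further increment of growth leaves the system with less net benefit than before, not more.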

The task ahead of us is thus in some ways the opposite of the one that France faced in the aftermath of 1789. Instead of replacing a sclerotic and failing medieval economy with one better suited to a new era of industrial expansion, we need to replace a sclerotic and failing industrial economy with one better suited to a new era of deindustrial contraction. That’s a tall order, no question, and it’s not something that can be achieved easily, or in a single leap. In all probability, the industrial world will have to pass through the whole sequence of phases we’ve been discussing several times before things finally bottom out in the deindustrial dark ages to come.

Still, I’m going to shock my fans and critics alike here by pointing out that there’s actually some reason to think that positive change on more than an individual level will be possible as the industrial world slams facefirst into the limits to growth. Two things give me that measured sense of hope. The first is the sheer scale of the resources locked up in today’s spectacularly dysfunctional political, economic, and social institutions, which will become available for other uses when those institutions come apart. The $83 billion a year currently being poured down the oversized rathole of the five biggest US banks, just for starters, could pay for a lot of solar water heaters, training programs for organic farmers, and other things that could actually do some good.

Throw in the resources currently being chucked into all the other attempts under way to prop up a failing system, and you’ve got quite the jackpot that could, in an era of breakdown, be put to work doing worthwhile things. It’s by no means certain, as already noted, that these resources will go to the best possible use, but it’s all but certain that they’ll go to something less stunningly pointless than, say, handing Elon Musk his next billion dollars.

The second thing that gives me a measured sense of hope is at once subtler and far more profound. These days, despite a practically endless barrage of rhetoric to the contrary, the great majority of Americans are getting fewer and fewer benefits from the industrial system, and are being forced to pay more and more of its costs, so that a relatively small fraction of the population can monopolize an ever-increasing fraction of the national wealth and contribute less and less in exchange. What’s more, a growing number of Americans are aware of this fact. The traditional schism of a collapsing society into a dominant minority and an internal proletariat, to use Arnold Toynbee’s terms, is a massive and accelerating social reality in the United States today.

As that schism widens, and more and more Americans are forced into the Third World poverty that’s among the unmentionable realities of public life in today’s United States, several changes of great importance are taking place. The first, of course, is precisely that a great many Americans are perforce learning to live with less—not in the playacting style popular just now on the faux-green end of the privileged classes, but really, seriously living with much less, because that’s all there is. That’s a huge shift and a necessary one, since the absurd extravagance many Americans consider to be a normal lifestyle is among the most important things that will be landing in history’s compost heap in the not too distant future.
At the same time, the collective consensus that keeps the hopelessly dysfunctional institutions of today’s status quo glued in place is already coming apart, and can be expected to dissolve completely in the years ahead. What sort of consensus will replace it, after the inevitable interval of chaos and struggle, is anybody’s guess at this point—though it’s vanishingly unlikely to have anything to do with the current political fantasies of left and right. It’s just possible, given luck and a great deal of hard work, that whatever new system gets cobbled together during the breakdown phase of our present crisis will embody at least some of the values that will be needed to get our species back into some kind of balance with the biosphere on which our lives depend. A future post will discuss how that might be accomplished—after, that is, we explore the last phase of the collapse process: the era of dissolution, which will be the theme of next week’s post.

The Era of Response

Wed, 2015-05-27 17:21
The third stage of the process of collapse, following what I’ve called the eras of pretense and impact, is the era of response. It’s easy to misunderstand what this involves, because both of the previous eras have their own kinds of response to whatever is driving the collapse; it’s just that those kinds of response are more precisely nonresponses, attempts to make the crisis go away without addressing any of the things that are making it happen.
If you want a first-rate example of the standard nonresponse of the era of pretense, you’ll find one in the sunny streets of Miami, Florida right now. As a result of global climate change, sea level has gone up and the Gulf Stream has slowed down. One consequence is that these days, whenever Miami gets a high tide combined with a stiff onshore wind, salt water comes boiling up through the storm sewers of the city all over the low-lying parts of town. The response of the Florida state government has been to issue an order to all state employees that they’re not allowed to utter the phrase “climate change.”
That sort of thing is standard practice in an astonishing range of subjects in America these days. Consider the role that the essentially nonexistent recovery from the housing-bubble crash of 2008-9 has played in political rhetoric since that time. The current inmate of the White House has been insisting through most of two terms that happy days are here again, and the usual reams of doctored statistics have been churned out in an effort to convince people who know better that they’re just imagining that something is wrong with the economy. We can expect to hear that same claim made in increasingly loud and confident tones right up until the day the bottom finally drops out.
With the end of the era of pretense and the arrival of the era of impact comes a distinct shift in the standard mode of nonresponse, which can be used quite neatly to time the transition from one era to another. Where the nonresponses of the era of pretense insist that there’s nothing wrong and nobody has to do anything outside the realm of business as usual, the nonresponses of the era of impact claim just as forcefully that whatever’s gone wrong is a temporary difficulty and everything will be fine if we all unite to do even more of whatever activity defines business as usual. That this normally amounts to doing more of whatever made the crisis happen in the first place, and thus reliably makes things worse, is just one of the little ironies history has to offer.
What unites the era of pretense with the era of impact is the unshaken belief that in the final analysis, there’s nothing essentially wrong with the existing order of things. Whatever little difficulties may show up from time to time may be ignored as irrelevant or talked out of existence, or they may have to be shoved aside by some concerted effort, but it’s inconceivable to most people in these two eras that the existing order of things is itself the source of society’s problems, and has to be changed in some way that goes beyond the cosmetic dimension. When the inconceivable becomes inescapable, in turn, the second phase gives way to the third, and the era of response has arrived.
This doesn’t mean that everyone comes to grips with the real issues, and buckles down to the hard work that will be needed to rebuild society on a sounder footing. Winston Churchill once noted with his customary wry humor that the American people can be counted on to do the right thing, once they have exhausted every other possibility. He was of course quite correct, but the same rule can be applied with equal validity to every other nation this side of Utopia, too. The era of response, in practice, generally consists of a desperate attempt to find something that will solve the crisis du jour, other than the one thing that everyone knows will solve the crisis du jour but nobody wants to do.
Let’s return to the two examples we’ve been following so far, the outbreak of the Great Depression and the coming of the French Revolution. In the aftermath of the 1929 stock market crash, once the initial impact was over and the “sucker’s rally” of early 1930 had come and gone, the federal government and the various power centers and pressure groups that struggled for influence within its capacious frame were united in pursuit of a single goal: finding a way to restore prosperity without doing either of the things that had to be done in order to restore prosperity.  That task occupied the best minds in the US elite from the summer of 1930 straight through until April of 1933, and the mere fact that their attempts to accomplish this impossibility proved to be a wretched failure shouldn’t blind anyone to the Herculean efforts that were involved in the attempt.
The first of the two things that had to be tackled in order to restore prosperity was to do something about the drastic imbalance in the distribution of income in the United States. As noted in previous posts, an economy dependent on consumer expenditures can’t thrive unless consumers have plenty of money to spend, and in the United States in the late 1920s, they didn’t—well, except for the very modest number of those who belonged to the narrow circles of the well-to-do. It’s not often recalled these days just how ghastly the slums of urban America were in 1929, or how many rural Americans lived in squalid one-room shacks of the sort you pretty much have to travel to the Third World to see these days. Labor unions were weak and strikes were routinely broken by court injunction and company muscle in 1920s America; concepts such as a minimum wage, sick pay, and health benefits didn’t exist, and the legal system was slanted savagely against the poor.
You can’t build prosperity in a consumer society when a good half of your citizenry can’t afford more than the basic necessities of life. That’s the predicament that America found clamped to the tender parts of its economic anatomy at the end of the 1920s. In that decade, as in our time, the temporary solution was to inflate a vast speculative bubble, under the endearing delusion that this would flood the economy with enough unearned cash to make the lack of earned income moot. That worked over the short term and then blew up spectacularly, since a speculative bubble is simply a Ponzi scheme that the legal authorities refuse to prosecute as such, and inevitably ends the same way.
There were, of course, effective solutions to the problem of inadequate consumer income. They were exactly those measures that were taken once the era of response gave way to the era of breakdown; everyone knew what they were, and nobody with access to political or economic power was willing to see them put into effect, because those measures would require a modest decline in the relative wealth and political dominance of the rich as compared to everyone else. Thus, as usually happens, they were postponed until the arrival of the era of breakdown made it impossible to avoid them any longer.
The second thing that had to be changed in order to restore prosperity was even more explosive, and I’m quite certain that some of my readers will screech like banshees the moment I mention it. The United States in 1929 had a precious metal-backed currency in the most literal sense of the term. Paper bills in those days were quite literally receipts for a certain quantity of gold—1.5 grams, for much of the time the US spent on the gold standard. That sort of arrangement was standard in most of the world’s industrial nations; it was backed by a dogmatic orthodoxy all but universal among respectable economists; and it was strangling the US economy.
It’s fashionable among certain sects on the economic fringes these days to look back on the era of the gold standard as a kind of economic Utopia in which there were no booms and busts, just a warm sunny landscape of stability and prosperity until the wicked witches of the Federal Reserve came along and spoiled it all. That claim flies in the face of economic history. During the entire period that the United States was on the gold standard, from 1873 to 1933, the US economy was a moonscape cratered by more than a dozen significant depressions. There’s a reason for that, and it’s relevant to our current situation—in a backhanded manner, admittedly.
Money, let us please remember, is not wealth. It’s a system of arbitrary tokens that represent real wealth—that is, actual, nonfinancial goods and services. Every society produces a certain amount of real wealth each year, and those societies that use money thus need to have enough money in circulation to more or less correspond to the annual supply of real wealth. That sounds simple; in practice, though, it’s anything but. Nowadays, for example, the amount of real wealth being produced in the United States each year is contracting steadily as more and more of the nation’s economic output has to be diverted into the task of keeping it supplied with fossil fuels. That’s happening, in turn, because of the limits to growth—the awkward but inescapable reality that you can’t extract infinite resources, or dump limitless wastes, on a finite planet.
The gimmick currently being used to keep fossil fuel extraction funded and cover the costs of the rising impact of environmental disruptions, without cutting into a culture of extravagance that only cheap abundant fossil fuel and a mostly intact biosphere can support, is to increase the money supply ad infinitum. That’s become the bedrock of US economic policy since the 2008-9 crash. It’s not a gimmick with a long shelf life; as the mismatch between real wealth and the money supply balloons, distortions and discontinuities are surging out through the crawlspaces of our economic life, and crisis is the most likely outcome.
In the United States in the first half or so of the twentieth century, by contrast, the amount of real wealth being produced each year soared, largely because of the steady increases in fossil fuel energy being applied to every sphere of life. While the nation was on the gold standard, though, the total supply of money could only grow as fast as gold could be mined out of the ground, which wasn’t even close to fast enough. So you had more goods and services being produced than there was money to pay for them; people who wanted goods and services couldn’t buy them because there wasn’t enough money to go around; businesses that wanted to expand and hire workers were unable to do so for the same reason. The result was that moonscape of economic disasters I mentioned a moment ago.
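A rough way to see why that shortage was baked in comes from the textbook equation of exchange; the growth rates below are purely illustrative, not historical measurements:

\[
MV = PQ
\quad\Longrightarrow\quad
\%\Delta P \;\approx\; \%\Delta M + \%\Delta V - \%\Delta Q,
\]

where \(M\) is the money supply, \(V\) the velocity of circulation, \(P\) the price level, and \(Q\) real output. With velocity roughly steady, gold-limited money growing at, say, 2% a year, and fossil-fueled output growing at, say, 4% a year, prices have to fall by roughly 2% a year: chronic deflation, scarce money, and idle capacity, which is exactly the combination just described.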
The necessary response at that time was to go off the gold standard. Nobody in power wanted to do this, partly because of the dogmatic economic orthodoxy noted earlier, and partly because a money shortage paid substantial benefits to those who had guaranteed access to money. The rentier class—those people who lived off income from their investments—could count on stable or falling prices as long as the gold standard stayed in place, and the mere fact that the same stable or falling prices meant low wages, massive unemployment, and widespread destitution troubled them not at all. Since the rentier class included the vast majority of the US economic and political elite, in turn, going off the gold standard was unthinkable until it became unavoidable.
The period of the French revolution from the fall of the Bastille in 1789 to the election of the National Convention in 1792 was a period of the same kind, though driven by different forces. Here the great problem was how to replace the Old Regime—not just the French monarchy, but the entire lumbering mass of political, economic, and social laws, customs, forms, and institutions that France had inherited from the Middle Ages and never quite gotten around to adapting to drastically changed conditions—with something that would actually work. It’s among the more interesting features of the resulting era of response that nearly every detail differed from the American example just outlined, and yet the results were remarkably similar.
Thus the leaders of the National Assembly who suddenly became the new rulers of France in the summer of 1789 had no desire whatsoever to retain the traditional economic arrangements that gave France’s former elites their stranglehold on an oversized share of the nation’s wealth. The abolition of manorial rights that summer, together with the explosive rural uprisings against feudal landlords and their chateaux in the wake of the Bastille’s fall, gutted the feudal system and left most of its former beneficiaries the choice between fleeing into exile and trying to find some way to make ends meet in a society that had no particular market for used aristocrats. The problem faced by the National Assembly wasn’t that of prying the dead fingers of a failed system off the nation’s throat; it was that of trying to find some other basis for national unity and effective government.
It’s a surprisingly difficult challenge. Those of my readers who know their way around current events will already have guessed that an attempt was made to establish a copy of whatever system was most fashionable among liberals at the time, and that this attempt turned out to be an abject failure. What’s more, they’ll have been quite correct. The National Assembly moved to establish a constitutional monarchy along British lines, bring in British economic institutions, and the like; it was all very popular among liberal circles in France and, naturally, in Britain as well, and it flopped. Those who recall the outcome of the attempt to turn Iraq into a nice pseudo-American democracy in the wake of the US invasion will have a tolerably good sense of how the project unraveled.
One of the unwelcome but reliable facts of history is that democracy doesn’t transplant well. It thrives only where it grows up naturally, out of the civil institutions and social habits of a people; when liberal intellectuals try to impose it on a nation that hasn’t evolved the necessary foundations for it, the results are pretty much always a disaster. That latter was the situation in France at the time of the Revolution. What happened thereafter  is what almost always happens to a failed democratic experiment: a period of chaos, followed by the rise of a talented despot who’s smart and ruthless enough to impose order on a chaotic situation and allow new, pragmatic institutions to emerge to replace those destroyed by clueless democratic idealists. In many cases, though by no means all, those pragmatic institutions have ended up providing a bridge to a future democracy, but that’s another matter.
Here again, those of my readers who have been paying attention to current events already know this; the collapse of the Soviet Union was followed in classic form by a failed democracy, a period of chaos, and the rise of a talented despot. It’s a curious detail of history that the despots in question are often rather short. Russia has had the great good fortune to find, as its despot du jour, a canny realist who has successfully brought it back from the brink of collapse and reestablished it as a major power with a body count considerably smaller than usual. France was rather less fortunate; the despot it found, Napoleon Bonaparte, turned out to be a megalomaniac with an Alexander the Great complex who proceeded to plunge Europe into a quarter century of cataclysmic war. Mind you, things could have been even worse; when Germany ended up in a similar situation, what it got was Adolf Hitler.
Charismatic strongmen are a standard endpoint for the era of response, but they properly belong to the era that follows, the era of breakdown, which will be discussed next week. What I want to explore here is how an era of response might work out in the future immediately before us, as the United States topples from its increasingly unsteady imperial perch and industrial civilization as a whole slams facefirst into the limits to growth. The examples just cited outline the two most common patterns by which the era of response works itself out. In the first pattern, the old elite retains its grip on power, and fumbles around with increasing desperation for a response to the crisis. In the second, the old elite is shoved aside, and the new holders of power are left floundering in a political vacuum.
We could see either pattern in the United States. For what it’s worth, I suspect the latter is the more likely option; the spreading crisis of legitimacy that grips the country these days is exactly the sort of thing you saw in France before the Revolution, and in any number of other countries in the few decades just prior to revolutionary political and social change. Every time a government tries to cope with a crisis by claiming that it doesn’t exist, every time some member of the well-to-do tries to dismiss the collective burdens that their culture of executive kleptocracy imposes on the country by flinging abuse at critics, every time institutions that claim to uphold the rule of law defend the rule of entrenched privilege instead, the United States takes another step closer to the revolutionary abyss.
I use that last word advisedly. It’s a common superstition in every troubled age that any change must be for the better—that the overthrow of a bad system must by definition lead to the establishment of a better one. This simply isn’t true. The vast majority of revolutions have established governments that were far more abusive than the ones they replaced. The exceptions have generally been those that brought about a social upheaval without wrecking the political system: where, for example, an election rather than a coup d’etat or a mass rising put the revolutionaries in power, and the political institutions of an earlier time remained in place with only such reshaping as new necessities required.
We could still see that sort of transformation as the United States sees the end of its age of empire and has to find its way back to a less arrogant and extravagant way of functioning in the world. I don’t think it’s likely, but I think it’s possible, and it would probably be a good deal less destructive than the other alternative. It’s worth remembering, though, that history is under no obligation to give us the future we think we want.

The Era of Impact

Wed, 2015-05-20 15:03
Of all the wistful superstitions that cluster around the concept of the future in contemporary popular culture, the most enduring has to be the notion that somehow, sooner or later, something will happen to shake the majority out of its complacency and get it to take seriously the crisis of our age. Week after week, I field comments and emails that presuppose that belief. People want to know how soon I think the shock of awakening will finally hit, or wonder whether this or that event will do the trick, or simply insist that the moment has to come sooner or later.
To all such inquiries and expostulations I have no scrap of comfort to offer. Quite the contrary, what history shows is that a sudden awakening to the realities of a difficult situation is far and away the least likely result of what I’ve called the era of impact, the second of the five stages of collapse. (The first, for those who missed last week’s post, is the era of pretense; the remaining three, which will be covered in the coming weeks, are the eras of response, breakdown, and dissolution.)
The era of impact is the point at which it becomes clear to most people that something has gone wrong with the most basic narratives of a society—not just a little bit wrong, in the sort of way that requires a little tinkering here and there, but really, massively, spectacularly wrong. It arrives when an asset class that was supposed to keep rising in price forever stops rising, does its Wile E. Coyote moment of hang time, and then drops like a stone. It shows up when an apparently entrenched political system, bristling with soldiers and secret police, implodes in a matter of days or weeks and is replaced by a provisional government whose leaders look just as stunned as everyone else. It comes whenever a state of affairs that was assumed to be permanent runs into serious trouble—but somehow it never seems to succeed in getting people to notice just how temporary that state of affairs always was.
Since history is the best guide we’ve got to how such events work out in the real world, I want to take a couple of examples of the kind just outlined and explore them in a little more detail. The stock market bubble of the 1920s makes a good case study on a relatively small scale. In the years leading up to the crash of 1929, stock values on the US market quietly disconnected themselves from the economic fundamentals and began what was, for the time, an epic climb into la-la land. There were important if unmentionable reasons for that airy detachment from reality; the most significant was the increasingly distorted distribution of income in 1920s America, which put more and more of the national wealth in the hands of fewer and fewer people and thus gutted the national economy.
It’s one of the repeated lessons of economic history that money in the hands of the rich does much less good for the economy as a whole than money in the hands of the working classes and the poor. The reasoning here is as simple as it is inescapable. Industrial economies survive and thrive on consumer expenditures, but consumer expenditures are limited by the ability of consumers to buy the things they want and need. As money is diverted away from the lower end of the economic pyramid, you get demand destruction—the process by which those who can’t afford to buy things stop buying them—and consumer expenditures fall off. The rich, by contrast, divert a large share of their income out of the consumer economy into investments; the richer they get, the more of the national wealth ends up in investments rather than consumer expenditures; and as consumer expenditures falter, and investments linked to the consumer economy falter in turn, more and more money ends up in illiquid speculative vehicles that are disconnected from the productive economy and do nothing to stimulate demand.
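For those who would like to see that logic reduced to arithmetic, here is a minimal sketch in Python. The income shares and spending rates in it are invented round numbers chosen purely to illustrate the mechanism, not estimates of the actual US distribution in the 1920s or any other decade.

    # Illustrative figures only -- invented income shares and spending rates,
    # not actual data for any economy or era.
    national_income = 100.0  # arbitrary units

    def total_consumption(groups):
        """Sum consumer spending: each group's share of national income times
        the fraction of that income it actually spends on goods and services."""
        return sum(national_income * share * spend_rate for share, spend_rate in groups)

    # (income share, fraction spent on consumption) for rich, middle, and poor
    broad_distribution = [(0.20, 0.60), (0.50, 0.90), (0.30, 0.98)]
    skewed_distribution = [(0.45, 0.60), (0.35, 0.90), (0.20, 0.98)]

    print(round(total_consumption(broad_distribution), 1))   # 86.4
    print(round(total_consumption(skewed_distribution), 1))  # 78.1

Shift a quarter of the national income from households that spend nearly all of it to households that invest most of it, and total consumer demand drops by close to a tenth. The exact figures are beside the point; the direction of the change is what matters.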
That’s what happened in the 1920s. All through the decade in the US, the rich got richer and the poor got screwed, speculation took the place of productive investment throughout the US economy, and the well-to-do wallowed in the wretched excess chronicled in F. Scott Fitzgerald’s The Great Gatsby while most other people struggled to get by. The whole decade was a classic era of pretense, crowned by the delusional insistence—splashed all over the media of the time—that everyone in the US could invest in the stock market and, since the market was of course going to keep on rising forever, everyone in the US would thus inevitably become rich.
It’s interesting to note that there were people who saw straight through the nonsense and tried to warn their fellow Americans about the inevitable consequences. They were denounced six ways from Sunday by all right-thinking people, in language identical to that used more recently on those of us who’ve had the effrontery to point out that an infinite supply of oil can’t be extracted from a finite planet.  The people who insisted that the soaring stock values of the late 1920s were the product of one of history’s great speculative bubbles were dead right; they had all the facts and figures on their side, not to mention plain common sense; but nobody wanted to hear it.
When the stock market peaked just before the Labor Day weekend in 1929 and started trending down, therefore, the immediate response of all right-thinking people was to insist at the top of their lungs that nothing of the sort was happening, that the market was simply catching its breath before its next great upward leap, and so on. Each new downward lurch was met by a new round of claims along these lines, louder, more dogmatic, and more strident than the one that preceded it, and nasty personal attacks on anyone who didn’t support the delusional consensus filled the media of the time.
People were still saying those things when the bottom dropped out of the market.
Tuesday, October 29, 1929 can reasonably be taken as the point at which the era of pretense gave way once and for all to the era of impact. That’s not because it was the first day of the crash—there had been ghastly slumps on the previous Thursday and Monday, on the heels of two months of less drastic but still seriously ugly declines—but because, after that day, the pundits and the media pretty much stopped pretending that nothing was wrong. Mind you, next to nobody was willing to talk about what exactly had gone wrong, or why it had gone wrong, but the pretense that the good fairy of capitalism had promised Americans happy days forever was out the window once and for all.
It’s crucial to note, though, that what followed this realization was the immediate and all but universal insistence that happy days would soon be back if only everyone did the right thing. It’s even more crucial to note that what nearly everyone identified as “the right thing”—running right out and buying lots of stocks—was a really bad idea that bankrupted many of those who did it, and didn’t help the imploding US economy at all.
It’s probably necessary to talk about this in a little more detail, since it’s been an article of blind faith in the United States for many decades now that it’s always a good idea to buy and hold stocks. (I suspect that stockbrokers have had a good deal to do with the promulgation of this notion.) It’s been claimed that someone who bought stocks in 1929 at the peak of the bubble, and then held onto them, would have ended up in the black eventually, and for certain values of “eventually,” this is quite true—but it took the Dow Jones industrial average until the mid-1950s to return to its 1929 high, and so for a quarter of a century our investor would have been underwater on his stock purchases.
What’s more, the Dow isn’t necessarily a good measure of stocks generally; many of the darlings of the market in the 1920s either went bankrupt in the Depression or never again returned to their 1929 valuations. Nor did the surge of money into stocks in the wake of the 1929 crash stave off the Great Depression, or do much of anything else other than provide a great example of the folly of throwing good money after bad. The moral to this story? In an era of impact, the advice you hear from everyone around you may not be in your best interest.
That same moral can be shown just as clearly in the second example I have in mind, the French Revolution. We talked briefly in last week’s post about the way that the French monarchy and aristocracy blinded themselves to the convulsive social and economic changes that were pushing France closer and closer to a collective explosion on the grand scale, and pursued business as usual long past the point at which business as usual was anything but a recipe for disaster. Even when the struggle between the Crown and the aristocracy forced Louis XVI to convene the États-Généraux—the rarely-held national parliament of France, which had powers more or less equivalent to a constitutional convention in the US—next to nobody expected anything but long rounds of political horse-trading from which some modest shifts in the balance of power might result.
That was before the summer of 1789. On June 17, the deputies of the Third Estate—the representatives of the commoners—declared themselves a National Assembly and staged what amounted to a coup d’etat; on July 14, faced with the threat of a military response from the monarchy, the Parisian mob seized the Bastille, kickstarting a wave of revolt across the country that put government and military facilities in the hands of the revolutionary National Guard and broke the back of the feudal system; on August 4, the National Assembly abolished all feudal rights and legal distinctions between the classes. Over less than two months, a political and social system that had been welded firmly in place for a thousand years all came crashing to the ground.
Those two months marked the end of the era of pretense and the arrival of the era of impact. The immediate response, with a modest number of exceptions among the aristocracy and the inner circles of the monarchy’s supporters, was frantic cheering and an insistence that everything would soon settle into a wonderful new age of peace, prosperity, and liberty. All the overblown dreams of the philosophes about a future age governed by reason were trotted out and treated as self-evident fact. Of course that’s not what happened; once it was firmly in power, the National Assembly used its unchecked authority as abusively as the monarchy had once done; factional struggles spun out of control, and before long mob rule and the guillotine were among the basic facts of life in Revolutionary France. 
Among the most common symptoms of an era of impact, in other words, is the rise of what we may as well call “crackpot optimism”—the enthusiastic and all but universal insistence, in the teeth of the evidence, that the end of business as usual will turn out to be the door to a wonderful new future. In the wake of the 1929 stock market crash, people were urged to pile back into the market in the belief that this would cause the economy to boom again even more spectacularly than before, and most of the people who followed this advice proceeded to lose their shirts. In the wake of the revolution of 1789, likewise, people across France were encouraged to join with their fellow citizens in building the shining new utopia of reason, and a great many of those who followed that advice ended up decapitated or, a little later, dying of gunshot or disease in the brutal era of pan-European warfare that extended almost without a break from the cannonade of Valmy in 1792 to the battle of Waterloo in 1815.
And the present example? That’s a question worth exploring, if only for the utterly pragmatic reason that most of my readers are going to get to see it up close and personal.
That the United States and the industrial world generally are deep in an era of pretense is, I think, pretty much beyond question at this point. We’ve got political authorities, global bankers, and a galaxy of pundits insisting at the top of their lungs that nothing is wrong, everything is fine, and we’ll be on our way to the next great era of prosperity if we just keep pursuing a set of boneheaded policies that have never—not once in the entire span of human history—brought prosperity to the countries that pursued them. We’ve got shelves full of books for sale in upscale bookstores insisting, in the strident language usual to such times, that life is wonderful in this best of all possible worlds, and it’s going to get better forever because, like, we have technology, dude! Across the landscape of the cultural mainstream, you’ll find no shortage of cheerleaders insisting at the top of their lungs that everything’s going to be fine, that even though they said ten years ago that we only have ten years to do something before disaster hits, why, we still have ten years before disaster hits, and when ten more years pass by, why, you can be sure that the same people will be insisting that we have ten more.
This is the classic rhetoric of an era of pretense. Over the last few years, though, it’s seemed to me that the voices of crackpot optimism have gotten more shrill, the diatribes more fact-free, and the logic even shoddier than it was in Bjorn Lomborg’s day, which is saying something. We’ve reached the point that state governments are making it a crime to report on water quality and forbidding officials from using such unwelcome phrases as “climate change.” That’s not the action of people who are confident in their beliefs; it’s the action of a bunch of overgrown children frantically clenching their eyes shut, stuffing their fingers in their ears, and shouting “La, la, la, I can’t hear you.”
That, in turn, suggests that the transition to the era of impact may be fairly close. Exactly when it’s likely to arrive is a complex question, and exactly what’s going to land the blow that will crack the crackpot optimism and make it impossible to ignore the arrival of real trouble is an even more complex one. In 1929, those who hadn’t bought into the bubble could be perfectly sure—and in fact, a good many of them were perfectly sure—that the usual mechanism that brings bubbles to a catastrophic end was about to terminate the boom of the 1920s with extreme prejudice, as indeed it did. In the last decades of the French monarchy, it was by no means clear exactly what sequence of events would bring the Ancien Régime crashing down, but such thoughtful observers as Talleyrand knew that something of the sort was likely to follow the crisis of legitimacy then under way.
The problem with trying to predict the trigger that will bring our current situation to a sudden stop is that we’re in such a target-rich environment. Looking over the potential candidates for the sudden shock that will stick a fork in the well-roasted corpse of business as usual, I’m reminded of the old board game Clue. Will Mr. Boddy’s killer turn out to be Colonel Mustard in the library with a lead pipe, Professor Plum in the conservatory with a candlestick, or Miss Scarlet in the dining room with a rope?
In much the same sense, we’ve got a global economy burdened to the breaking point with more than a quadrillion dollars of unpayable debt; we’ve got a global political system coming apart at the seams as the United States slips toward the usual fate of empires and its rivals circle warily, waiting for the kill; we’ve got a domestic political system here in the US entering a classic prerevolutionary condition under the impact of a textbook crisis of legitimacy; we’ve got a global climate that’s hammered by our rank stupidity in treating the atmosphere as a gaseous sewer for our wastes; we’ve got a global fossil fuel industry that’s frantically trying to pretend that scraping the bottom of the barrel means that the barrel is full, and the list goes on. It’s as though Colonel Mustard, Professor Plum, Miss Scarlet, and the rest of them all ganged up on Mr. Boddy at once, and only the most careful autopsy will be able to determine which of them actually dealt the fatal blow.
In the midst of all this uncertainty, there are three things that can, I think, be said for certain about the end of the current era of pretense and the coming of the era of impact. The first is that it’s going to happen. When something is unsustainable, it’s a pretty safe bet that it won’t be sustained indefinitely, and a society that keeps on embracing policies that swap short-term gains for long-term problems will sooner or later end up awash in the consequences of those policies. Timing such transitions is difficult at best; it’s an old adage among stock traders that the market can stay irrational longer than you can stay solvent. Still, points made above—especially the increasingly shrill tone of the defenders of the existing order—suggest to me that the era of impact may be here within a decade or so at the outside.
The second thing that can be said for certain about the coming era of impact is that it’s not the end of the world. Apocalyptic fantasies are common and popular in eras of pretense, and for good reason; fixating on the supposed imminence of the Second Coming, human extinction, or what have you, is a great way to distract yourself from the real crisis that’s breathing down your neck. If the real crisis in question is partly or wholly a result of your own actions, while the apocalyptic fantasy can be blamed on someone or something else, that adds a further attraction to the fantasy.
The end of industrial civilization will be a long, bitter, painful cascade of conflicts, disasters, and  accelerating decline in which a vast number of people are going to die before they otherwise would, and a great many things of value will be lost forever. That’s true of any falling civilization, and the misguided decisions of the last forty years have pretty much guaranteed that the current example is going to have an extra helping of all these unwelcome things. I’ve discussed at length, in earlier posts in the Dark Age America sequence here and in other sequences as well, why the sort of apocalyptic sudden stop beloved of Hollywood scriptwriters is the least likely outcome of the predicament of our time; still, insisting on the imminence and inevitability of some such game-ending event will no doubt be as popular as usual in the years immediately ahead.
The third thing that I think can be said for certain about the coming era of impact, though, is the one that counts. If it follows the usual pattern, as I expect it to do, once the crisis hits there will be serious, authoritative, respectable figures telling everyone exactly what they need to do to bring an end to the troubles and get the United States and the world back on track to renewed peace and prosperity. Taking these pronouncements seriously and following their directions will be extremely popular, and it will almost certainly also be a recipe for unmitigated disaster. If forewarned is forearmed, as the saying has it, this is a piece of firepower to keep handy as the era of pretense winds down. In next week’s post, we’ll talk about comparable weaponry relating to the third stage of collapse—the era of response.

The Era of Pretense

Wed, 2015-05-13 17:00
I've mentioned in previous posts here on The Archdruid Report the educational value of the comments I receive from readers in the wake of each week’s essay. My post two weeks ago on the death of the internet was unusually productive along those lines. One of the comments I got in response to that post gave me the theme for last week’s essay, but there was at least one other comment calling for the same treatment. Like the one that sparked last week’s post, it appeared on one of the many other internet forums on which The Archdruid Report’s posts appear, and it unintentionally pointed up a common and crucial failure of imagination that shapes, or rather misshapes, the conventional wisdom about our future.
Curiously enough, the point that set off the commenter in question was the same one that incensed the author of the denunciation mentioned in last week’s post: my suggestion in passing that fifty years from now, most Americans may not have access to electricity or running water. The commenter pointed out angrily that I’d claimed that the twilight of industrial civilization would be a ragged arc of decline over one to three centuries. Now, he claimed, I was saying that it was going to take place in the next fifty years, and this apparently convinced him that everything I said ought to be dismissed out of hand.
I run into this sort of confusion all the time. If I suggest that the decline and fall of a civilization usually takes several centuries, I get accused of inconsistency if I then note that one of the sharper downturns included in that process may be imminent.  If I point out that the United States is likely within a decade or two of serious economic and political turmoil, driven partly by the implosion of its faltering global hegemony and partly by a massive crisis of legitimacy that’s all but dissolved the tacit contract between the existing order of US society and the masses who passively support it, I get accused once again of inconsistency if I then say that whatever comes out the far side of that crisis—whether it’s a battered and bruised United States or a patchwork of successor states—will then face a couple of centuries of further decline and disintegration before the deindustrial dark age bottoms out.
Now of course there’s nothing inconsistent about any of these statements. The decline and fall of a civilization isn’t a single event, or even a single linear process; it’s a complex fractal reality composed of many different events on many different scales in space and time. If it takes one to three centuries, as usual, those centuries are going to be taken up by an uneven drumbeat of wars, crises, natural disasters, and assorted breakdowns on a variety of time frames with an assortment of local, regional, national, or global effects. The collapse of US global hegemony is one of those events; the unraveling of the economic and technological framework that currently provides most Americans with electricity and running water is another, but neither of those is anything like the whole picture.
It’s probably also necessary to point out that any of my readers who think that being deprived of electricity and running water is the most drastic kind of collapse imaginable have, as the saying goes, another think coming. Right now, in our oh-so-modern world, there are billions of people who get by without regular access to electricity and running water, and most of them aren’t living under dark age conditions. A century and a half ago, when railroads, telegraphs, steamships, and mechanical printing presses were driving one of history’s great transformations of transport and information technology, next to nobody had electricity or running water in their homes. The technologies of 1865 are not dark age technologies; in fact, the gap between 1865 technologies and dark age technologies is considerably greater, by most metrics, than the gap between 1865 technologies and the ones we use today.
Furthermore, whether or not Americans have access to running water and electricity may not have as much to say about the future of industrial society everywhere in the world as the conventional wisdom would suggest.  I know that some of my American readers will be shocked out of their socks to hear this, but the United States is not the whole world. It’s not even the center of the world. If the United States implodes over the next two decades, leaving behind a series of bankrupt failed states to squabble over its territory and the little that remains of its once-lavish resource base, that process will be a great source of gaudy and gruesome stories for the news media of the world’s other continents, but it won’t affect the lives of the readers of those stories much more than equivalent events in Africa and the Middle East affect the lives of Americans today.
As it happens, over the next one to three centuries, the benefits of industrial civilization are going to go away for everyone. (The costs will be around a good deal longer—in the case of the nuclear wastes we’re so casually heaping up for our descendants, a good quarter of a million years, but those and their effects are rather more localized than some of today’s apocalyptic rhetoric likes to suggest.) The reasoning here is straightforward. White’s Law, one of the fundamental principles of human ecology, states that economic development is a function of energy per capita; the immense treasure trove of concentrated energy embodied in fossil fuels, and that alone, made possible the sky-high levels of energy per capita that gave the world’s industrial nations their brief era of exuberance; as fossil fuels deplete, and remaining reserves require higher and higher energy inputs to extract, the levels of energy per capita the industrial nations are used to having will go away forever.
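For readers who like their principles in compact form, White’s Law is conventionally summarized (the shorthand is the standard textbook version, not Leslie White’s own wording) as C = E × T, where C stands for the degree of cultural and economic development, E for the energy harnessed per capita per year, and T for the efficiency of the technology that puts that energy to work. Efficiency gains run into hard physical limits long before they can substitute for a vanished energy subsidy, so a sustained decline in E drags C down with it.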
It’s important to be clear about this. Fossil fuels aren’t simply one energy source among others; in terms of concentration, usefulness, and fungibility—that is, the ability to be turned into any other form of energy that might be required—they’re in a category all by themselves. Repeated claims that fossil fuels can be replaced with nuclear power, renewable energy resources, or what have you sound very good on paper, but every attempt to put those claims to the test so far has either gone belly up in short order, or become a classic subsidy dumpster surviving purely on a diet of government funds and mandates.
Three centuries ago, the earth’s fossil fuel reserves were the largest single deposit of concentrated energy in this part of the universe; now we’ve burnt through nearly all the easily accessible reserves, and we’re scrambling to keep the tottering edifice of industrial society going by burning through the dregs that remain. As those run out, the remaining energy resources—almost all of them renewables—will certainly sustain a variety of human societies, and some of those will be able to achieve a fairly high level of complexity and maintain some kinds of advanced technologies. The kind of absurd extravagance that passes for a normal standard of living among the more privileged inmates of the industrial nations is another matter, and as the fossil fuel age sunsets out, it will end forever.
The fractal trajectory of decline and fall mentioned earlier in this post is simply the way this equation works out on the day-to-day scale of ordinary history. Still, those of us who happen to be living through a part of that trajectory might reasonably be curious about how it’s likely to unfold in our lifetimes. I’ve discussed in a previous series of posts, and in my book Decline and Fall: The End of Empire and the Future of Democracy in 21st Century America, how the end of US global hegemony is likely to unfold, but as already noted, that’s only a small portion of the broader picture. Is a broader view possible?
Fortunately history, the core resource I’ve been using to try to make sense of our future, has plenty to say about the broad patterns that unfold when civilizations decline and fall. Now of course I know all I have to do is mention that history might be relevant to our present predicament, and a vast chorus of voices across the North American continent and around the world will bellow at rooftop volume, “But it’s different this time!” With apologies to my regular readers, who’ve heard this before, it’s probably necessary to confront that weary thoughtstopper again before we proceed.
As I’ve noted before, claims that it’s different this time are right where it doesn’t matter and wrong where it counts.  Predictions made on the basis of history—and not just by me—have consistently predicted events over the last decade or so far more accurately than predictions based on the assumption that history doesn’t matter. How many times, dear reader, have you heard someone insist that industrial civilization is going to crash to ruin in the next six months, and then watched those six months roll merrily by without any sign of the predicted crash? For that matter, how many times have you heard someone insist that this or that policy that’s never worked any other time that it’s been tried, or this or that piece of technological vaporware that’s been the subject of failed promises for decades, will inevitably put industrial society back on its alleged trajectory to the stars—and how many times has the policy or the vaporware been quietly shelved, and something else promoted using the identical rhetoric, when it turned out not to perform as advertised?
It’s been a source of wry amusement to me to watch the same weary, dreary, repeatedly failed claims of imminent apocalypse and inevitable progress being rehashed year after year, varying only in the fine details of the cataclysm du jour and the techno-savior du jour, while the future nobody wants to talk about is busily taking shape around us. Decline and fall isn’t something that will happen sometime in the conveniently distant future; it’s happening right now in the United States and around the world. The amusement, though, is tempered with a sense of familiarity, because the period in which decline is under way but nobody wants to admit that fact is one of the recurring features of the history of decline.
There are, very generally speaking, five broad phases in the decline and fall of a civilization. I know it’s customary in historical literature to find nice dull labels for such things, but I’m in a contrary mood as I write this, so I’ll give them unfashionably colorful names: the eras of pretense, impact, response, breakdown, and dissolution. Each of these is complex enough that it’ll need a discussion of its own; this week, we’ll talk about the era of pretense, which is the one we’re in right now.
Eras of pretense are by no means limited to the decline and fall of civilizations. They occur whenever political, economic, or social arrangements no longer work, but the immediate costs of admitting that those arrangements don’t work loom considerably larger in the collective imagination than the future costs of leaving those arrangements in place. It’s a curious but consistent wrinkle of human psychology that this happens even if those future costs soar right off the scale of frightfulness and lethality; if the people who would have to pay the immediate costs don’t want to do so, in fact, they will reliably and cheerfully pursue policies that lead straight to their own total bankruptcy or violent extermination, and never let themselves notice where they’re headed.
Speculative bubbles are a great setting in which to watch eras of pretense in full flower. In the late phases of a bubble, when it’s clear to anyone who has two spare neurons to rub together that the boom du jour is cobbled together of equal parts delusion and chicanery, the people who are most likely to lose their shirts in the crash are the first to insist at the top of their lungs that the bubble isn’t a bubble and their investments are guaranteed to keep on increasing in value forever. Those of my readers who got the chance to watch some of their acquaintances go broke in the real estate bust of 2008-9, as I did, will have heard this sort of self-deception at full roar; those who missed the opportunity can make up for the omission by checking out the ongoing torrent of claims that the soon-to-be-late fracking bubble is really a massive energy revolution that will make America wealthy and strong again.
The history of revolutions offers another helpful glimpse at eras of pretense. France in the decades before 1789, to cite a conveniently well-documented example, was full of people who had every reason to realize that the current state of affairs was hopelessly unsustainable and would have to change. The things about French politics and economics that had to change, though, were precisely those things that the French monarchy and aristocracy were unwilling to change, because any such reforms would have cost them privileges they’d had since time out of mind and were unwilling to relinquish.
Louis XV, who finished up his long and troubled reign a supreme realist, is said to have muttered “Après moi, le déluge”—“Once I’m gone, this sucker’s going down” may not be a literal translation, but it catches the flavor of the utterance—but that degree of clarity was rare in his generation, and all but absent in that of his increasingly feckless successor. Thus the courtiers and aristocrats of the Old Regime amused themselves at the nation’s expense, dabbled in avant-garde thought, and kept their eyes tightly closed to the consequences of their evasions of looming reality, while the last opportunities to excuse themselves from a one-way trip to visit the guillotine and spare France the cataclysms of the Terror and the Napoleonic wars slipped silently away.
That’s the bitter irony of eras of pretense. Under most circumstances, they’re the last period when it would be possible to do anything constructive on the large scale about the crisis looming immediately ahead, but the mass evasion of reality that frames the collective thinking of the time stands squarely in the way of any such constructive action. In the era of pretense before a speculative bust, people who could have quietly cashed in their positions and pocketed their gains double down on their investments, and guarantee that they’ll be ruined once the market stops being liquid. In the era of pretense before a revolution, in the same way, those people and classes that have the most to lose reliably take exactly those actions that ensure that they will in fact lose everything. If history has a sense of humor, this is one of the places that it appears in its most savage form.
The same points are true, in turn, of the eras of pretense that precede the downfall of a civilization. In a good many cases, where too few original sources survive, the age of pretense has to be inferred from archeological remains. We don’t know what motives inspired the ancient Mayans to build their biggest pyramids in the years immediately before the Terminal Classic period toppled over into a savage political and demographic collapse, but it’s hard to imagine any such project being set in motion without the usual evasions of an era of pretense being involved. Where detailed records of dead civilizations survive, though, the sort of rhetorical handwaving common to bubbles before the bust and decaying regimes on the brink of revolution shows up with knobs on. Thus the panegyrics of the Roman imperial court waxed ever more lyrical and bombastic about Rome’s invincibility and her civilizing mission to the nations as the Empire stumbled deeper into its terminal crisis, echoing any number of other court poets in any number of civilizations in their final hours.
For that matter, a glance through classical Rome’s literary remains turns up the remarkable fact that those of her essayists and philosophers who expressed worries about her survival wrote, almost without exception, during the Republic and the early Empire; the closer the fall of Rome actually came, the more certainty Roman authors expressed that the Empire was eternal and the latest round of troubles was just one more temporary bump on the road to peace and prosperity. It took the outsider’s vision of Augustine of Hippo to proclaim that Rome really was falling—and even that could only be heard once the Visigoths sacked Rome and the era of pretense gave way to the age of impact.
The present case is simply one more example to add to an already lengthy list. In the last years of the nineteenth century, it was common for politicians, pundits, and mass media in the United States, the British empire, and other industrial nations to discuss the possibility that the advanced civilization of the time might be headed for the common fate of nations in due time. The intellectual history of the twentieth century is, among other things, a chronicle of how that discussion was shoved to the margins of our collective discourse, just as the ecological history of the same century is among other things a chronicle of how the worries of the previous era became the realities of the one we’re in today. The closer we’ve moved toward the era of impact, that is, the more unacceptable it has become for anyone in public life to point out that the problems of the age are not just superficial.
Listen to the pablum that passes for political discussion in Washington DC or the mainstream US media these days, or the even more vacuous noises being made by party flacks as the country stumbles wearily toward yet another presidential election. That the American dream of upward mobility has become an American nightmare of accelerating impoverishment outside the narrowing circle of the kleptocratic rich, that corruption and casual disregard for the rule of law are commonplace in political institutions from local to Federal levels, that our medical industry charges more than any other nation’s and still provides the worst health care in the industrial world, that our schools no longer teach anything but contempt for learning, that the national infrastructure and built environment are plunging toward Third World conditions at an ever-quickening pace, that a brutal and feckless foreign policy embraced by both major parties is alienating our allies while forcing our enemies to set aside their mutual rivalries and make common cause against us: these are among the issues that matter, but they’re not the issues you’ll hear discussed as the latest gaggle of carefully airbrushed candidates go through their carefully scripted elect-me routines on their way to the 2016 election.
If history teaches anything, though, it’s that eras of pretense eventually give way to eras of impact. That doesn’t mean that the pretense will go away—long after Alaric the Visigoth sacked Rome, for example, there were still plenty of rhetors trotting out the same tired clichés about Roman invincibility—but it does mean that a significant number of people will stop finding the pretense relevant to their own lives. How that happens in other historical examples, and how it might happen in our own time, will be the theme of next week’s post.

The Whisper of the Shutoff Valve

Wed, 2015-05-06 18:35
Last week’s post on the impending decline and fall of the internet fielded a great many responses. That was no surprise, to be sure; nor was I startled in the least to find that many of them rejected the thesis of the post with some heat. Contemporary pop culture’s strident insistence that technological progress is a clock that never runs backwards made such counterclaims inevitable.
Still, it’s always educational to watch the arguments fielded to prop up the increasingly shaky edifice of the modern mythology of progress, and the last week was no exception. A response I found particularly interesting from that standpoint appeared on one of the many online venues where Archdruid Report posts appear. One of the commenters insisted that my post should be rejected out of hand as mere doom and gloom; after all, he pointed out, it was ridiculous for me to suggest that fifty years from now, a majority of the population of the United States might be without reliable electricity or running water.
I’ve made the same prediction here and elsewhere a good many times. Each time, most of my readers or listeners seem to have taken it as a piece of sheer rhetorical hyperbole. The electrical grid and the assorted systems that send potable water flowing out of faucets are so basic to the rituals of everyday life in today’s America that their continued presence is taken for granted.  At most, it’s conceivable that individuals might choose not to connect to them; there’s a certain amount of talk about off-grid living here and there in the alternative media, for example.  That people who want these things might not have access to them, though, is pretty much unthinkable.
Meanwhile, in Detroit and Baltimore, tens of thousands of residents are in the process of losing their access to water and electricity.
The situation in both cities is much the same, and there’s every reason to think that identical headlines will shortly appear in reference to other cities around the nation. Not that many decades ago, Detroit and Baltimore were important industrial centers with thriving economies. Along with more than a hundred other cities in America’s Rust Belt, they were thrown under the bus with the first wave of industrial offshoring in the 1970s.  The situation for both cities has only gotten worse since that time, as the United States completed its long transition from a manufacturing economy producing goods and services to a bubble economy that mostly produces unpayable IOUs.
These days, the middle-class families whose tax payments propped up the expansive urban systems of an earlier day have long since moved out of town. Most of the remaining residents are poor, and the ongoing redistribution of wealth in America toward the very rich and away from everyone else has driven down the income of the urban poor to the point that many of them can no longer afford to pay their water and power bills. City utilities in Detroit and Baltimore have been sufficiently sensitive to political pressures that large-scale utility shutoffs have been delayed, but shifts in the political climate in both cities are bringing the delays to an end; water bills have increased steadily, more and more people have been unable to pay them, and the result is as predictable as it is brutal.
The debate over the Detroit and Baltimore shutoffs has followed the usual pattern, as one side wallows in bash-the-poor rhetoric while the other side insists plaintively that access to utilities is a human right. Neither side seems to be interested in talking about the broader context in which these disputes take shape. There are two aspects to that broader context, and it’s a tossup which is the more threatening.
The first aspect is the failure of the US economy to recover in any meaningful sense from the financial crisis of 2008. Now of course politicians from Obama on down have worked overtime grandstanding about the alleged recovery we’re in. I invite any of my readers who bought into that rhetoric to try the following simple experiment. Go to your favorite internet search engine and look up how much the fracking industry has added to the US gross domestic product each year from 2009 to 2014. Now subtract that figure from the US gross domestic product for each of those years, and see how much growth there’s actually been in the rest of the economy since the real estate bubble imploded.
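For those who would rather let a few lines of code do the bookkeeping, the sketch below shows the shape of the exercise. Every number in it is a placeholder rather than a figure from the Bureau of Economic Analysis or anyone else; swap in whatever values your own search turns up.

    # Placeholder numbers only -- replace with the GDP and fracking-contribution
    # figures your own research turns up (all values in billions of dollars).
    gdp = {2009: 14400, 2010: 15000, 2011: 15500, 2012: 16200, 2013: 16700, 2014: 17400}
    fracking = {2009: 100, 2010: 150, 2011: 220, 2012: 300, 2013: 380, 2014: 450}

    previous = None
    for year in sorted(gdp):
        rest_of_economy = gdp[year] - fracking[year]
        if previous is not None:
            growth = 100 * (rest_of_economy - previous) / previous
            print(f"{year}: ex-fracking GDP {rest_of_economy}, growth {growth:.1f}%")
        previous = rest_of_economy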
What you’ll find, if you take the time to do that, is that the rest of the US economy has been flat on its back gasping for air for the last five years. What makes this even more problematic, as I’ve noted in several previous posts here, is that the great fracking boom about which we’ve heard so much for the last five years was never actually the game-changing energy revolution its promoters claimed; it was simply another installment in the series of speculative bubbles that has largely replaced constructive economic activity in this country over the last two decades or so.
What’s more, it’s not the only bubble currently being blown, and it may not even be the largest. We’ve also got a second tech-stock bubble, with money-losing internet corporations racking up absurd valuations in the stock market while they burn through millions of dollars of venture capital; we’ve got a student loan bubble, in which billions of dollars of loans that will never be paid back have been bundled, packaged, and sold to investors just like all those no-doc mortgages were a decade ago; car loans are getting the same treatment; the real estate market is fizzing again in many urban areas as investors pile into another round of lavishly marketed property investments—well, I could go on for some time. It’s entirely possible that if all the bubble activity were to be subtracted from the last five years or so of GDP, the result would show an economy in freefall.
Certainly that’s the impression that emerges if you take the time to check out those economic statistics that aren’t being systematically jiggered by the US government for PR purposes. The number of long-term unemployed in America is at an all-time high; roads, bridges, and other basic infrastructure are falling to pieces; measurements of US public health—generally considered a good proxy for the real economic condition of the population—are well below those of other industrial countries, heading toward Third World levels; abandoned shopping malls litter the landscape while major retailers announce more than 6000 store closures. These are not things you see in an era of economic expansion, or even one of relative stability; they’re markers of decline.
The utility shutoffs in Detroit and Baltimore are further symptoms of the same broad process of economic unraveling. It’s true, as pundits in the media have been insisting since the story broke, that utilities get shut off for nonpayment of bills all the time. It’s equally true that shutting off the water supply of 20,000 or 30,000 people all at once is pretty much unprecedented. Both cities, please note, have had very large populations of poor people for many decades now.  Those who like to blame a “culture of poverty” for the tangled relationship between US governments and the American poor, and of course that trope has been rehashed by some of the pundits just mentioned, haven’t yet gotten around to explaining how the culture of poverty all at once inspired tens of thousands of people who had been paying their utility bills to stop doing so.
There are plenty of good reasons, after all, why poor people who used to pay their bills can’t do so any more. Standard business models in the United States used to take it for granted that the best way to run the staffing dimensions of any company, large or small, was to have as many full-time positions as possible and to use raises and other practical incentives to encourage employees who were good at their jobs to stay with the company. That approach has been increasingly unfashionable in today’s America, partly due to perverse regulatory incentives that penalize employers for offering full-time positions, partly to the emergence of attitudes in corner offices that treat employees as just another commodity. (I doubt it’s any kind of accident that most corporations nowadays refer to their employment offices as “human resource departments.” What do you do with a resource? You exploit it.)
These days, most of the jobs available to the poor are part-time, pay very little, and include nasty little clawbacks in the form of requirements that employees pay out of pocket for uniforms, equipment, and other things that employers used to provide as a matter of course. Meanwhile housing prices and rents are rising well above their post-2008 dip, and a great many other necessities are becoming more costly—inflation may be under control, or so the official statistics say, but anyone who’s been shopping at the same grocery store for the last eight years knows perfectly well that prices kept on rising anyway.
So you’ve got falling incomes running up against rising costs for food, rent, and utilities, among other things. In the resulting collision, something’s got to give, and for tens of thousands of poor Detroiters and Baltimoreans, what gave first was the ability to keep current on their water bills. Expect to see the same story playing out across the country as more people on the bottom of the income pyramid find themselves in the same situation. What you won’t hear in the media, though it’s visible enough if you know where to look and are willing to do so, is that people above the bottom of the income pyramid are also losing ground, being forced down toward economic nonpersonhood. From the middle classes down, everyone’s losing ground.
That process doesn’t reach any higher up the ladder than the middle class, to be sure. It’s been pointed out repeatedly that over the last four decades or so, the distribution of wealth in America has skewed further and further out of balance, with the top 20% of incomes taking a larger and larger share at the expense of everybody else. That’s an important factor in bringing about the collision just described. Some thinkers on the radical fringes of American society, which is the only place in the US you can talk about such things these days, have argued that the raw greed of the well-to-do is the sole reason why so many people lower down the ladder are being pushed further down still.
Scapegoating rhetoric of that sort is always comforting, because it holds out the promise—theoretically, if not practically—that something can be done about the situation. If only the thieving rich could be lined up against a convenient brick wall and removed from the equation in the time-honored fashion, the logic goes, people in Detroit and Baltimore could afford to pay their water bills!  I suspect we’ll hear such claims increasingly often as the years pass and more and more Americans find their access to familiar comforts and necessities slipping away.  Simple answers are always popular in such times, not least when the people being scapegoated go as far out of their way to make themselves good targets for such exercises as the American rich have done in recent decades.
John Kenneth Galbraith’s equation of the current US political and economic elite with the French aristocracy on the eve of revolution rings even more true than it did when he wrote it back in 1992, in the pages of The Culture of Contentment. The unthinking extravagances, the casual dismissal of the last shreds of noblesse oblige, the obsessive pursuit of personal advantages and private feuds without the least thought of the potential consequences, the bland inability to recognize that the power, privilege, wealth, and sheer survival of the aristocracy depended on the system the aristocrats themselves were destabilizing by their actions—it’s all there, complete with sprawling overpriced mansions that could just about double for Versailles. The urban mobs that played so large a role back in 1789 are warming up for their performances as I write these words; the only thing left to complete the picture is a few tumbrils and a guillotine, and those will doubtless arrive on cue.
The senility of the current US elite, as noted in a previous post here, is a massive political fact in today’s America. Still, it’s not the only factor in play here. Previous generations of wealthy Americans recognized without too much difficulty that their power, prosperity, and survival depended on the willingness of the rest of the population to put up with their antics. Several times already in America’s history, elite groups have allied with populist forces to push through reforms that sharply weakened the power of the wealthy elite, because they recognized that the alternative was a social explosion even more destructive to the system on which elite power depends.
I suppose it’s possible that the people currently occupying the upper ranks of the political and economic pyramid in today’s America are just that much more stupid than their equivalents in the Jacksonian, Progressive, and New Deal eras. Still, there’s at least one other explanation to hand, and it’s the second of the two threatening contextual issues mentioned earlier.
Until the nineteenth century, fresh running water piped into homes for everyday use was purely an affectation of the very rich in a few very wealthy and technologically adept societies. Sewer pipes to take dirty water and human wastes out of the house belonged in the same category. This wasn’t because nobody knew how plumbing works—the Romans had competent plumbers, for example, and water faucets and flush toilets were to be found in Roman mansions of the imperial age. The reason those same things weren’t found in every Roman house was economic, not technical.
Behind that economic issue lay an ecological reality.  White’s Law, one of the foundational principles of human ecology, states that economic development is a function of energy per capita. For a society before the industrial age, the Roman Empire had an impressive amount of energy per capita to expend; control over the agricultural economy of the Mediterranean basin, modest inputs from sunlight, water and wind, and a thriving slave industry fed by the expansion of Roman military power all fed into the capacity of Roman society to develop itself economically and technically. That’s why rich Romans had running water and iced drinks in summer, while their equivalents in ancient Greece a few centuries earlier had to make do without either one.
Fossil fuels gave industrial civilization a supply of energy many orders of magnitude greater than any previous human civilization has had—a supply vast enough that the difference remains huge even after the vast expansion of population that followed the industrial revolution. There was, however, a catch—or, more precisely, two catches. To begin with, fossil fuels are finite, nonrenewable resources; no matter how much handwaving is employed in the attempt to obscure this point—and whatever else might be in short supply these days, that sort of handwaving is not—every barrel of oil, ton of coal, or cubic foot of natural gas that’s burnt takes the world one step closer to the point at which there will be no economically extractable reserves of oil, coal, or natural gas at all.
That’s catch #1. Catch #2 is subtler, and considerably more dangerous. Oil, coal, and natural gas don’t leap out of the ground on command. They have to be extracted and processed, and this takes energy. Companies in the fossil fuel industries have always targeted the deposits that cost less to extract and process, for obvious economic reasons. What this means, though, is that over time, a larger and larger fraction of the energy yield of oil, coal, and natural gas has to be put right back into extracting and processing oil, coal, and natural gas—and this leaves less and less for all other uses.
That’s the vise that’s tightening around the American economy these days. The great fracking boom, to the extent that it wasn’t simply one more speculative gimmick aimed at the pocketbooks of chumps, was an attempt to make up for the ongoing decline of America’s conventional oilfields by going after oil that was far more expensive to extract. The fact that none of the companies at the heart of the fracking boom ever turned a profit, even when oil brought more than $100 a barrel, gives some sense of just how costly shale oil is to get out of the ground. The financial cost of extraction, though, is a proxy for the energy cost of extraction—the amount of energy, and of the products of energy, that had to be thrown into the task of getting a little extra oil out of marginal source rock.
Energy needed to extract energy, again, can’t be used for any other purpose. It doesn’t contribute to the energy surplus that makes economic development possible. As the energy industry itself takes a bigger bite out of each year’s energy production, every other economic activity loses part of the fuel that makes it run. That, in turn, is the core reason why the American economy is on the ropes, America’s infrastructure is falling to bits—and Americans in Detroit and Baltimore are facing a transition to Third World conditions, without electricity or running water.
I suspect, for what it’s worth, that the shutoff notices being mailed to tens of thousands of poor families in those two cities are a good working model for the way that industrial civilization itself will wind down. It won’t be sudden; for decades to come, there will still be people who have access to what Americans today consider the ordinary necessities and comforts of everyday life; there will just be fewer of them each year. Outside that narrowing circle, the number of economic nonpersons will grow steadily, one shutoff notice at a time.
As I’ve pointed out in previous posts, the line of fracture between the senile elite and what Arnold Toynbee called the internal proletariat—the people who live within a failing civilization’s borders but receive essentially none of its benefits—eventually opens into a chasm that swallows what’s left of the civilization. Sometimes the tectonic processes that pull the chasm open are hard to miss, but there are times when they’re a good deal more difficult to sense in action, and this is one of those latter times. Listen to the whisper of the shutoff valve, and you’ll hear tens of thousands of Americans being cut off from basic services the rest of us, for the time being, still take for granted.

The Death of the Internet: A Pre-Mortem

Wed, 2015-04-29 17:25
The mythic role assigned to progress in today’s popular culture has any number of odd effects, but one of the strangest is the blindness to the downside that clamps down on the collective imagination of our time once people become convinced that something or other is the wave of the future. It doesn’t matter in the least how many or how obvious the warning signs are, or how many times the same tawdry drama has been enacted. Once some shiny new gimmick gets accepted as the next glorious step in the invincible march of progress, most people lose the ability to imagine that the wave of the future might just do what waves generally do: that is to say, crest, break, and flow back out to sea, leaving debris scattered on the beach in its wake.
It so happens that I grew up in the middle of just such a temporary wave of the future, in the south Seattle suburbs in the 1960s, where every third breadwinner worked for Boeing. The wave in question was the supersonic transport, SST for short: a jetliner that would fly faster than sound, cutting hours off long flights. The inevitability of the SST was an article of faith locally, and not just because Boeing was building one; an Anglo-French consortium was in the lead with the Concorde, and the Soviets were working on the Tu-144, but the Boeing 2707 was expected to be the biggest and baddest of them all, a 300-seat swing-wing plane that was going to make commercial supersonic flight an everyday reality.
Long before the 2707 had even the most ghostly sort of reality, you could buy model kits of the plane, complete with Pan Am decals, at every hobby store in the greater Seattle area. For that matter, take Interstate 5 south from downtown Seattle past the sprawling Boeing plant just outside of town, and you’d see the image of the 2707 on the wall of one of the huge assembly buildings, a big delta-winged shape in white and gold winging its way through the imagined air toward the gleaming future in which so many people believed back then.
There was, as it happened, a small problem with the 2707, a problem it shared with all the other SST projects; it made no economic sense at all. It was, to be precise, what an earlier post here called  a subsidy dumpster: that is, a project that was technically feasible but economically impractical, and existed mostly as a way to pump government subsidies into Boeing’s coffers. Come 1971, the well ran dry: faced with gloomy numbers from the economists, worried calculations from environmental scientists, and a public not exactly enthusiastic about dozens of sonic booms a day rattling plates and cracking windows around major airports, Congress cut the project’s funding.
That happened right when the US economy generally, and the notoriously cyclical airplane industry in particular, were hitting downturns. Boeing was Seattle’s biggest employer in those days, and when it laid off employees en masse, the result was a local depression of legendary severity. You heard a lot of people in those days insisting that the US had missed out on the next aviation boom, and Congress would have to hang its head in shame once Concordes and Tu-144s were hauling passengers all over the globe. Of course that’s not what happened; the Tu-144 flew a handful of commercial flights and then was grounded for safety reasons, and the Concorde lingered on, a technical triumph but an economic white elephant, until the last plane retired from service in 2003.
All this has been on my mind of late as I’ve considered the future of the internet. The comparison may seem far-fetched, but then that’s what supporters of the SST would have said if anyone had compared the Boeing 2707 to, say, the zeppelin, another wave of the future that turned out to make too little economic sense to matter. Granted, the internet isn’t a subsidy dumpster, and it’s also much more complex than the SST; if anything, it might be compared to the entire system of commercial air travel, which we still have with us for the moment. Nonetheless, a strong case can be made that the internet, like the SST, doesn’t actually make economic sense; it’s being propped up by a set of financial gimmicks with a distinct resemblance to smoke and mirrors; and when those go away—and they will—much of what makes the internet so central a part of pop culture will go away as well.
It’s probably necessary to repeat here that the reasons for this are economic, not technical. Every time I’ve discussed the hard economic realities that make the internet’s lifespan in the deindustrial age  roughly that of a snowball in Beelzebub’s back yard, I’ve gotten a flurry of responses fixating on purely  technical issues. Those issues are beside the point.  No doubt it would be possible to make something like the internet technically feasible in a society on the far side of the Long Descent, but that doesn’t matter; what matters is that the internet has to cover its operating costs, and it also has to compete with other ways of doing the things that the internet currently does.
It’s a source of wry amusement to me that so many people seem to have forgotten that the internet doesn’t actually do very much that’s new. Long before the internet, people were reading the news, publishing essays and stories, navigating through unfamiliar neighborhoods, sharing photos of kittens with their friends, ordering products from faraway stores for home delivery, looking at pictures of people with their clothes off, sending anonymous hate-filled messages to unsuspecting recipients, and doing pretty much everything else that they do on the internet today. For the moment, doing these things on the internet is cheaper and more convenient than the alternatives, and that’s what makes the internet so popular. If that changes—if the internet becomes more costly and less convenient than other options—its current popularity is unlikely to last.
Let’s start by looking at the costs. Every time I’ve mentioned the future of the internet on this blog, I’ve gotten comments and emails from readers who think that the price of their monthly internet service is a reasonable measure of the cost of the internet as a whole. For a useful corrective to this delusion, talk to people who work in data centers. You’ll hear about trucks pulling up to the loading dock every single day to offload pallet after pallet of brand new hard drives and other components, to replace those that will burn out that same day. You’ll hear about power bills that would easily cover the electricity costs of a small city. You’ll hear about many other costs as well. Data centers are not cheap to run, there are many thousands of them, and they’re only one part of the vast infrastructure we call the internet: by many measures, the most gargantuan technological project in the history of our species.
Your monthly fee for internet service covers only a small portion of what the internet costs. Where does the rest come from? That depends on which part of the net we’re discussing. The basic structure is paid for by internet service providers (ISPs), who recoup part of the costs from your monthly fee, part from the much larger fees paid by big users, and part by advertising. Content providers use some mix of advertising, pay-to-play service fees, sales of goods and services, packaging and selling your personal data to advertisers and government agencies, and new money from investors and loans to meet their costs. The ISPs routinely make a modest profit on the deal, but many of the content providers do not. Amazon may be the biggest retailer on the planet, for example, and its cash flow has soared in recent years, but its expenses have risen just as fast, and it rarely makes a profit. Many other content provider firms, including fish as big as Twitter, rack up big losses year after year.
How do they stay in business? A combination of vast amounts of investment money and ultracheap debt. That’s very common in the early decades of a new industry, though it’s been made a good deal easier by the Fed’s policy of next-to-zero interest rates. Investors who dream of buying stock in the next Microsoft provide venture capital for internet startups, banks provide lines of credit for existing firms, the stock and bond markets snap up paper of various kinds churned out by internet businesses, and all that money goes to pay the bills. It’s a reasonable gamble for the investors; they know perfectly well that a great many of the firms they’re funding will go belly up within a few years, but the few that don’t will either be bought up at inflated prices by one of the big dogs of the online world, or will figure out how to make money and then become big dogs themselves.
Notice, though, that this process has an unexpected benefit for ordinary internet users: a great many services are available for free, because venture-capital investors and lines of credit are footing the bill for the time being. Boosting the number of page views and clickthroughs is far more important for the future of an internet company these days than making a profit, and so the usual business plan is to provide plenty of free goodies to the public without worrying about the financial end of things. That’s very convenient just now for internet users, but it fosters the illusion that the internet costs nothing.
As mentioned earlier, this sort of thing is very common in the early decades of a new industry. As the industry matures, markets become saturated, startups become considerably riskier, and venture capital heads for greener pastures.  Once this happens, the companies that dominate the industry have to stay in business the old-fashioned way, by earning a profit, and that means charging as much as the market will bear, monetizing services that are currently free, and cutting service to the lowest level that customers will tolerate. That’s business as usual, and it means the end of most of the noncommercial content that gives the internet so much of its current role in popular culture.
All other things being equal, in other words, the internet can be expected to follow the usual trajectory of a maturing industry, becoming more expensive, less convenient, and more tightly focused on making a quick buck with each passing year. Governments have already begun to tax internet sales, removing one of the core “stealth subsidies” that boosted the internet at the expense of other retail sectors, and taxation of the internet will only increase as cash-starved officials contemplate the tidal waves of money sloshing back and forth online. None of these changes will kill the internet, but they’ll slap limits on the more utopian fantasies currently burbling about the web, and provide major incentives for individuals and businesses to back away from the internet and do things in the real world instead.
Then there’s the increasingly murky world of online crime, espionage, and warfare, which promises to push very hard in the same direction in the years ahead.  I think most people are starting to realize that on the internet, there’s no such thing as secure data, and the costs of conducting business online these days include a growing risk of having your credit cards stolen, your bank accounts looted, your identity borrowed for any number of dubious purposes, and the files on your computer encrypted without your knowledge, so that you can be forced to pay a ransom for their release—this latter, or so I’ve read, is the latest hot new trend in internet crime.
Online crime is one of the few fields of criminal endeavor in which raw cleverness is all you need to make out, as the saying goes, like a bandit. In the years ahead, as a result, the internet may look less like an information superhighway and more like one of those grim inner city streets where not even the muggers go alone. Trends in online espionage and warfare are harder to track, but either or both could become a serious burden on the internet as well.
Online crime, espionage, and warfare aren’t going to kill the internet, any more than the ordinary maturing of the industry will. Rather, they’ll lead to a future in which costs of being online are very often greater than the benefits, and the internet is by and large endured rather than enjoyed. They’ll also help drive the inevitable rebound away from the net. That’s one of those things that always happens and always blindsides the cheerleaders of the latest technology: a few decades into its lifespan, people start to realize that they liked the old technology better, thank you very much, and go back to it. The rebound away from the internet has already begun, and will only become more visible as time goes on, making a great many claims about the future of the internet look as absurd as those 1950s articles insisting that in the future, every restaurant would inevitably be a drive-in.
To be sure, the resurgence of live theater in the wake of the golden age of movie theaters didn’t end cinema, and the revival of bicycling in the aftermath of the automobile didn’t make cars go away. In the same way, the renewal of interest in offline practices and technologies isn’t going to make the internet go away. It’s simply going to accelerate the shift of avant-garde culture away from an increasingly bleak, bland, unsafe, and corporate- and government-controlled internet and into alternative venues. That won’t kill the internet, though once again it will put a stone marked R.I.P. atop the grave of a lot of the utopian fantasies that have clustered around today’s net culture.
All other things being equal, in fact, there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
But all other things aren’t equal.
Radio and television, like most of the other familiar technologies that define life in a modern industrial society, were born and grew to maturity in an expanding economy. The internet, by contrast, was born during the last great blowoff of the petroleum age—the last decades of the twentieth century, during which the world’s industrial nations took the oil reserves that might have cushioned the transition to sustainability, and blew them instead on one last orgy of over-the-top conspicuous consumption—and it’s coming to maturity in the early years of an age of economic contraction and ecological blowback.
The rising prices, falling service quality, and relentless monetization of a maturing industry, together with the increasing burden of online crime and the inevitable rebound away from internet culture, will thus be hitting the internet in a time when the global economy no longer has the slack it once did, and the immense costs of running the internet in anything like its present form will have to be drawn from a pool of real wealth that has many other demands on it. What’s more, quite a few of those other demands will be far more urgent than the need to provide consumers with a convenient way to send pictures of kittens to their friends. That stark reality will add to the pressure to monetize internet services, and provide incentives to those who choose to send their kitten pictures by other means.
It’s crucial to remember here, as noted above, that the internet is simply a cheaper and more convenient way of doing things that people were doing long before the first website went live, and a big part of the reason why it’s cheaper and more convenient right now is that internet users are being subsidized by the investors and venture capitalists who are funding the internet industry. That’s not the only subsidy on which the internet depends, though. Along with the rest of industrial society, it’s also subsidized by half a billion years of concentrated solar energy in the form of fossil fuels.  As those deplete, the vast inputs of energy, labor, raw materials, industrial products, and other forms of wealth that sustain the internet will become increasingly expensive to provide, and ways of distributing kitten pictures that don’t require the same inputs will prosper in the resulting competition.
There are also crucial issues of scale. Most pre-internet communications and information technologies scale down extremely well. A community of relatively modest size can have its own public library, its own small press, its own newspaper, and its own radio station running local programming, and could conceivably keep all of these functioning and useful even if the rest of humanity suddenly vanished from the map. Internet technology doesn’t have that advantage. It’s orders of magnitude more complex and expensive than a radio transmitter, not to mention the far older technologies of printing presses and card catalogs; what’s more, on the scale of a small community, the benefits of using internet technology instead of simpler equivalents wouldn’t come close to justifying the vast additional cost.
Now of course the world of the future isn’t going to consist of a single community surrounded by desolate wasteland. That’s one of the reasons why the demise of the internet won’t happen all at once. Telecommunications companies serving some of the more impoverished parts of rural America are already letting their networks in those areas degrade, since income from customers doesn’t cover the costs of maintenance.  To my mind, that’s a harbinger of the internet’s future—a future of uneven decline punctuated by local and regional breakdowns, some of which will be fixed for a while.
That said, it’s quite possible that there will still be an internet of some sort fifty years from now. It will connect government agencies, military units, defense contractors, and the handful of universities that survive the approaching implosion of the academic industry here in the US, and it may provide email and a few other services to the very rich, but it will otherwise have a lot more in common with the original DARPAnet than with the 24/7 virtual cosmos imagined by today’s more gullible netheads.
Unless you’re one of the very rich or an employee of one of the institutions just named, furthermore, you won’t have access to the internet of 2065.  You might be able to hack into it, if you have the necessary skills and are willing to risk a long stint in a labor camp, but unless you’re a criminal or a spy working for the insurgencies flaring in the South or the mountain West, there’s not much point to the stunt. If you’re like most Americans in 2065, you live in Third World conditions without regular access to electricity or running water, and you’ve got other ways to buy things, find out what’s going on in the world, find out how to get to the next town and, yes, look at pictures of people with their clothes off. What’s more, in a deindustrializing world, those other ways of doing things will be cheaper, more resilient, and more useful than reliance on the baroque intricacies of a vast computer net.
Exactly when the last vestiges of the internet will sputter to silence is a harder question to answer. Long before that happens, though, it will have lost its current role as one of the poster children of the myth of perpetual progress, and turned back into what it really was all the time: a preposterously complex way to do things most people have always done by much simpler means, which only seemed to make sense during that very brief interval of human history when fossil fuels were abundant and cheap.
***In other news, I’m pleased to announce that the third anthology of deindustrial SF stories from this blog’s “Space Bats” contest, After Oil 3: The Years of Rebirth, is now available in print and e-book formats. Those of my readers who’ve turned the pages of the two previous After Oil anthologies already know that this one has a dozen eminently readable and thought-provoking stories about the world on the far side of the Petroleum Age; the rest of you—why, you’re in for a treat. Those who are interested in contributing to the next After Oil anthology will find the details here.

A Field Guide to Negative Progress

Wed, 2015-04-22 17:23
I've commented before in these posts that writing is always partly a social activity. What Mortimer Adler used to call the Great Conversation, the dance of ideas down the corridors of the centuries, shapes every word in a writer’s toolkit; you can hardly write a page in English without drawing on a shade of meaning that Geoffrey Chaucer, say, or William Shakespeare, or Jane Austen first put into the language. That said, there’s also a more immediate sense in which any writer who interacts with his or her readers is part of a social activity, and one of the benefits came my way just after last week’s post.
That post began with a discussion of the increasingly surreal quality of America’s collective life these days, and one of my readers—tip of the archdruidical hat to Anton Mett—had a fine example to offer. He’d listened to an economic report on the media, and the talking heads were going on and on about the US economy’s current condition of, ahem, “negative growth.” Negative growth? Why yes, that’s the opposite of growth, and it’s apparently quite a common bit of jargon in economics just now.
Of course the English language, as used by the authors named earlier among many others, has no shortage of perfectly clear words for the opposite of growth. “Decline” comes to mind; so does “decrease,” and so does “contraction.” Would it have been so very hard for the talking heads in that program, or their many equivalents in our economic life generally, to draw in a deep breath and actually come right out and say “The US economy has contracted,” or “GDP has decreased,” or even “we’re currently in a state of economic decline”? Come on, economists, you can do it!
But of course they can’t. Economists in general are supposed to provide, shall we say, negative clarity when discussing certain aspects of contemporary American economic life, and talking heads in the media are even more subject to this rule than most of their peers. Among the things about which they’re supposed to be negatively clear, two are particularly relevant here; the first is that economic contraction happens, and the second is that letting too much of the national wealth end up in too few hands is a very effective way to cause economic contraction. The logic here is uncomfortably straightforward—an economy that depends on consumer expenditures only prospers if consumers have plenty of money to spend—but talking about that equation would cast an unwelcome light on the culture of mindless kleptocracy entrenched these days at the upper end of the US socioeconomic ladder. So we get to witness the mass production of negative clarity about one of the main causes of negative growth.
It’s entrancing to think of other uses for this convenient mode of putting things. I can readily see it finding a role in health care—“I’m sorry, ma’am,” the doctor says, “but your husband is negatively alive;” in sports—“Well, Joe, unless the Orioles can cut down that negative lead of theirs, they’re likely headed for a negative win;” and in the news—“The situation in Yemen is shaping up to be yet another negative triumph for US foreign policy.” For that matter, it’s time to update one of the more useful proverbs of recent years: what do you call an economist who makes a prediction? Negatively right.
Come to think of it, we might as well borrow the same turn of phrase for the subject of last week’s post, the deliberate adoption of older, simpler, more independent technologies in place of today’s newer, more complex, and more interconnected ones. I’ve been talking about that project so far under the negatively mealy-mouthed label “intentional technological regress,” but hey, why not be cool and adopt the latest fashion? For this week, at least, we’ll therefore redefine our terms a bit, and describe the same thing as “negative progress.” Since negative growth sounds like just another kind of growth, negative progress ought to pass for another kind of progress, right?
With this in mind, I’d like to talk about some of the reasons that individuals, families, organizations, and communities, as they wend their way through today’s cafeteria of technological choices, might want to consider loading up their plates with a good hearty helping of negative progress.
Let’s start by returning to one of the central points raised here in earlier posts, the relationship between progress and the production of externalities. By and large, the more recent a technology is, the more of its costs aren’t paid by the makers or the users of the technology, but are pushed off onto someone else. As I pointed out in a post two months ago, this isn’t accidental; quite the contrary, it’s hardwired into the relationship between progress and market economics, and bids fair to play a central role in the unraveling of the entire project of industrial civilization.
The same process of increasing externalities, though, has another face when seen from the point of view of the individual user of any given technology. When you externalize any cost of a technology, you become dependent on whoever or whatever picks up the cost you’re not paying. What’s more, you become dependent on the system that does the externalizing, and on whoever controls that system. Those dependencies aren’t always obvious, but they impose costs of their own, some financial and some less tangible. What’s more, unlike the externalized costs, a great many of these secondary costs land directly on the user of the technology.
It’s interesting, and may not be entirely accidental, that there’s no commonly used term for the entire structure of externalities and dependencies that stand behind any technology. Such a term is necessary here, so for the present purpose,  we’ll call the structure just named the technology’s externality system. Given that turn of phrase, we can restate the point about progress made above: by and large, the more recent a technology is, the larger the externality system on which it depends.
An example will be useful here, so let’s compare the respective externality systems of a bicycle and an automobile. Like most externality systems, these divide up more or less naturally into three categories: manufacture, maintenance, and use. Everything that goes into fabricating steel parts, for instance, all the way back to the iron ore in the mine, is an externality of manufacture; everything that goes into making lubricating oil, all the way back to drilling the oil well, is an externality of maintenance; everything that goes into building roads suitable for bikes and cars is an externality of use.
Both externality systems are complex, and include a great many things that aren’t obvious at first glance. The point I want to make here, though, is that the car’s externality system is far and away the more complex of the two. In fact, the bike’s externality system is a subset of the car’s, and this reflects the specific historical order in which the two technologies were developed. When the technologies that were needed for a bicycle’s externality system came into use, the first bicycles appeared; when all the additional technologies needed for a car’s externality system were added onto that foundation, the first cars followed. That sort of incremental addition of externality-generating technologies is far and away the most common way that technology progresses.
We can thus restate the pattern just analyzed in a way that brings out some of its less visible and more troublesome aspects: by and large, each new generation of technology imposes more dependencies on its users than the generation it replaces. Again, a comparison between bicycles and automobiles will help make that clear. If you want to ride a bike, you’ve committed yourself to dependence on all the technical, economic, and social systems that go into manufacturing, maintaining, and using the bike; you can’t own, maintain, and ride a bike without the steel mills that produce the frame, the chemical plants that produce the oil you squirt on the gears, the gravel pits that provide raw material for roads and bike paths, and so on.
On the other hand, you’re not dependent on a galaxy of other systems that provide the externality system for your neighbor who drives. You don’t depend on the immense network of pipelines, tanker trucks, and gas stations that provide him with fuel; you don’t depend on the interstate highway system or the immense infrastructure that supports it; if you did the sensible thing and bought a bike that was made by a local craftsperson, your dependence on vast multinational corporations and all of their infrastructure, from sweatshop labor in Third World countries to financial shenanigans on Wall Street, is considerably smaller than that of your driving neighbor. Every dependency you have, your neighbor also has, but not vice versa.
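For what it’s worth, the nesting described above can be pictured with a toy sketch like the one below. Every entry is an invented placeholder standing in for whole industries, not an inventory drawn from this post, and the real lists would run to thousands of items:

```python
# Toy sketch of nested externality systems: the bicycle's dependencies are,
# roughly, a subset of the automobile's. All entries are invented placeholders.

bike_deps = {
    "steel mills", "rubber and synthetics plants",
    "lubricating oil production", "gravel pits and paved surfaces",
    "machine shops",
}

car_deps = bike_deps | {
    "oil wells and refineries", "pipelines, tanker trucks, and gas stations",
    "interstate highways and their upkeep", "auto plants and global parts chains",
    "auto finance and insurance",
}

print(bike_deps <= car_deps)          # True: every bicycle dependency is also a car dependency
print(sorted(car_deps - bike_deps))   # the systems only the driver depends on
```

The subset test is the whole point: the cyclist shares the bicycle’s dependencies with the driver, while the driver carries a long list of further dependencies the cyclist escapes.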
Whether or not these dependencies matter is a complex thing. Obviously there’s a personal equation—some people like to be independent, others are fine with being just one more cog in the megamachine—but there’s also a historical factor to consider. In an age of economic expansion, the benefits of dependency very often outweigh the costs; standards of living are rising, opportunities abound, and it’s easy to offset the costs of any given dependency. In a stable economy, one that’s neither growing nor contracting, the benefits and costs of any given dependency need to be weighed carefully on a case by case basis, as one dependency may be worth accepting while another costs more than it’s worth.
On the other hand, in an age of contraction and decline—or, shall we say, negative expansion?—most dependencies are problematic, and some are lethal. In a contracting economy, as everyone scrambles to hold onto as much as possible of the lifestyles of a more prosperous age, your profit is by definition someone else’s loss, and dependency is just another weapon in the Hobbesian war of all against all. By many measures, the US economy has been contracting since before the bursting of the housing bubble in 2008; by some—in particular, the median and modal standards of living—it’s been contracting since the 1970s, and the unmistakable hissing sound as air leaks out of the fracking bubble just now should be considered fair warning that another round of contraction is on its way.
With that in mind, it’s time to talk about the downsides of dependency.
First of all, dependency is expensive. In the struggle for shares of a shrinking pie in a contracting economy, turning any available dependency into a cash cow is an obvious strategy, and one that’s already very much in play. Consider the conversion of freeways into toll roads, an increasingly popular strategy in large parts of the United States. Consider, for that matter, the soaring price of health care in the US, which hasn’t been accompanied by any noticeable increase in quality of care or treatment outcomes. In the dog-eat-dog world of economic contraction, commuters and sick people are just two of many captive populations whose dependencies make them vulnerable to exploitation. As the spiral of decline continues, it’s safe to assume that any dependency that can be exploited will be exploited, and the more dependencies you have, the more likely you are to be squeezed dry.
The same principle applies to power as well as money; thus, whoever owns the systems on which you depend, owns you. In the United States, again, laws meant to protect employees from abusive behavior on the part of employers are increasingly ignored; as the number of the permanently unemployed keeps climbing year after year, employers know that those who still have jobs are desperate to keep them, and will put up with almost anything in order to keep that paycheck coming in. The old adage about the inadvisability of trying to fight City Hall has its roots in this same phenomenon; no matter what rights you have on paper, you’re not likely to get far with them when the other side can stop picking up your garbage and then fine you for creating a public nuisance, or engage in some other equally creative use of their official prerogatives. As decline accelerates, expect to see dependencies increasingly used as levers for exerting various kinds of economic, political, and social power at your expense.
Finally, and crucially, if you’re dependent on a failing system, when the system goes down, so do you. That’s not just an issue for the future; it’s a huge if still largely unmentioned reality of life in today’s America, and in most other corners of the industrial world as well. Most of today’s permanently unemployed got that way because the job on which they depended for their livelihood got offshored or automated out of existence; much of the rising tide of poverty across the United States is a direct result of the collapse of political and social systems that once countered the free market’s innate tendency to drive the gap between rich and poor to Dickensian extremes. For that matter, how many people who never learned how to read a road map are already finding themselves in random places far from help because something went wrong with their GPS units?
It’s very popular among those who recognize the problem with being shackled to a collapsing system to insist that it’s a problem for the future, not the present. They grant that dependency is going to be a losing bet someday, but everything’s fine for now, so why not enjoy the latest technological gimmickry while it’s here? Of course that presupposes that you enjoy the latest technological gimmickry, which isn’t necessarily a safe bet, and it also ignores the first two difficulties with dependency outlined above, which are very much present and accounted for right now. We’ll let both those issues pass for the moment, though, because there’s another factor that needs to be included in the calculation.
A practical example, again, will be useful here. In my experience, it takes around five years of hard work, study, and learning from your mistakes to become a competent vegetable gardener. If you’re transitioning from buying all your vegetables at the grocery store to growing them in your backyard, in other words, you need to start gardening about five years before your last trip to the grocery store. The skill and hard work that goes into growing vegetables is one of many things that most people in the world’s industrial nations externalize, and those things don’t just pop back to you when you leave the produce section of the store for the last time. There’s a learning curve that has to be undergone.
Not that long ago, there used to be a subset of preppers who grasped the fact that a stash of cartridges and canned wieners in a locked box at their favorite deer camp cabin wasn’t going to get them through the downfall of industrial civilization, but hadn’t factored in the learning curve. Businesses targeting the prepper market thus used to sell these garden-in-a-box kits, which had seed packets for vegetables, a few tools, and a little manual on how to grow a garden. It’s a good thing that Y2K, 2012, and all those other dates when doom was supposed to arrive turned out to be wrong, because I met a fair number of people who thought that having one of those kits would save them even though they last grew a plant from seed in fourth grade. If the apocalypse had actually arrived, survivors a few years later would have gotten used to a landscape scattered with empty garden-in-a-box kits, overgrown garden patches, and the skeletal remains of preppers who starved to death because the learning curve lasted just that much longer than they did.
The same principle applies to every other set of skills that has been externalized by people in today’s industrial society, and will be coming back home to roost as economic contraction starts to cut into the viability of our externality systems. You can adopt them now, when you have time to get through the learning curve while there’s still an industrial society around to make up for the mistakes and failures that are inseparable from learning, or you can try to adopt them later, when those same inevitable mistakes and failures could very well land you in a world of hurt. You can also adopt them now, when your dependencies haven’t yet been used to empty your wallet and control your behavior, or you can try to adopt them later, when a much larger fraction of the resources and autonomy you might have used for the purpose will have been extracted from you by way of those same dependencies.
This is a point I’ve made in previous posts here, but it applies with particular force to negative progress—that is, to the deliberate adoption of older, simpler, more independent technologies in place of the latest, dependency-laden offerings from the corporate machine. As decline—or, shall we say, negative growth—becomes an inescapable fact of life in postprogress America, decreasing your dependence on sprawling externality systems is going to be an essential tactic.
Those who become early adopters of the retro future, to use an edgy term from last week’s post, will have at least two, and potentially three, significant advantages. The first, as already noted, is that they’ll be much further along the learning curve by the time rising costs, increasing instabilities, and cascading systems failures either put the complex technosystems out of reach or push the relationship between costs and benefits well over into losing-proposition territory. The second is that as more people catch onto the advantages of older, simpler, more sustainable technologies, surviving examples will become harder to find and more expensive to buy; in this case as in many others, collapsing first ahead of the rush is, among other things, the more affordable option.
The third advantage? Depending on exactly which old technologies you happen to adopt, and whether or not you have any talent for basement-workshop manufacture and the like, you may find yourself on the way to a viable new career as most other people will be losing their jobs—and their shirts. As the global economy comes unraveled and people in the United States lose their current access to shoddy imports from Third World sweatshops, there will be a demand for a wide range of tools and simple technologies that still make sense in a deindustrializing world. Those who already know how to use such technologies will be prepared to teach others how to use them; those who know how to repair, recondition, or manufacture those technologies will be prepared to barter, or to use whatever form of currency happens to replace today’s mostly hallucinatory forms of money, to good advantage.
My guess, for what it’s worth, is that salvage trades will be among the few growth industries in the 21st century, and the crafts involved in turning scrap metal and antique machinery into tools and machines that people need for their homes and workplaces will be an important part of that economic sector. To understand how that will work, though, it’s probably going to be necessary to get a clearer sense of the way that today’s complex technostructures are likely to come apart. Next week, with that in mind, we’ll spend some time thinking about the unthinkable—the impending death of the internet.

The Retro Future

Wed, 2015-04-15 18:16
Is it just me, or has the United States taken yet another great leap forward into the surreal over the last few days? Glancing through the news, I find another round of articles babbling about how fracking has guaranteed America a gaudy future as a petroleum and natural gas exporter. Somehow none of these articles get around to mentioning that the United States is a major net importer of both commodities, that most of the big-name firms in the fracking industry have been losing money at a rate of billions a year since the boom began, and that the pileup of bad loans to fracking firms is pushing the US banking industry into a significant credit crunch, but that’s just par for the course nowadays.
Then there’s the current tempest in the media’s teapot, Hillary Clinton’s presidential run. I’ve come to think of Clinton as the Khloe Kardashian of American politics, since she owed her original fame to the mere fact that she’s related to someone else who once caught the public eye. Since then she’s cycled through various roles because, basically, that’s what Famous People do, and the US presidency is just the next reality-TV gig on her bucket list. I grant that there’s a certain wry amusement to be gained from watching this child of privilege, with the help of her multimillionaire friends, posturing as a champion of the downtrodden, but I trust that none of my readers are under the illusion that this rhetoric will amount to anything more than all that chatter about hope and change eight years ago.
Let us please be real: whoever mumbles the oath of office up there on the podium in 2017, whether it’s Clinton or the interchangeably Bozoesque figures currently piling one by one out of the GOP’s clown car to contend with her, we can count on more of the same: more futile wars, more giveaways to the rich at everyone else’s expense, more erosion of civil liberties, more of all the other things Obama’s cheerleaders insisted back in 2008 he would stop as soon as he got into office.  As Arnold Toynbee pointed out a good many years ago, one of the hallmarks of a nation in decline is that the dominant elite sinks into senility, becoming so heavily invested in failed policies and so insulated from the results of its own actions that nothing short of total disaster will break its deathgrip on the body politic.
While we wait for the disaster in question, though, those of us who aren’t part of the dominant elite and aren’t bamboozled by the spectacle du jour might reasonably consider what we might do about it all. By that, of course, I don’t mean that it’s still possible to save industrial civilization in general, and the United States in particular, from the consequences of their history. That possibility went whistling down the wind a long time ago. Back in 2005, the Hirsch Report showed that any attempt to deal with the impending collision with the hard ecological limits of a finite planet had to get under way at least twenty years before the peak of global conventional petroleum production, if there was to be any chance of avoiding massive disruptions. As it happens, 2005 also marked the peak of conventional petroleum production worldwide, which may give you some sense of the scale of the current mess.
Consider, though, what happened in the wake of that announcement. Instead of dealing with the hard realities of our predicament, the industrial world panicked and ran the other way, with the United States well in the lead. Strident claims that ethanol—er, solar—um, biodiesel—okay, wind—well, fracking, then—would provide a cornucopia of cheap energy to replace the world’s rapidly depleting reserves of oil, coal, and natural gas took the place of a serious energy policy, while conservation, the one thing that might have made a difference, was as welcome as garlic aioli at a convention of vampires.
That stunningly self-defeating response had a straightforward cause, which was that everyone except a few of us on the fringes treated the whole matter as though the issue was how the privileged classes of the industrial world could maintain their current lifestyles on some other resource base.  Since that question has no meaningful answer, questions that could have been answered—for example, how do we get through the impending mess with at least some of the achievements of the last three centuries intact?—never got asked at all. At this point, as a result, ten more years have been wasted trying to come up with answers to the wrong question, and most of the  doors that were still open in 2005 have been slammed shut by events since that time.
Fortunately, there are still a few possibilities for constructive action open even this late in the game. More fortunate still, the ones that will likely matter most don’t require Hillary Clinton, or any other member of America’s serenely clueless ruling elite, to do something useful for a change. They depend, rather, on personal action, beginning with individuals, families, and local communities and spiraling outward from there to shape the future on wider and wider scales.
I’ve talked about two of these possibilities at some length in posts here. The first can be summed up simply enough in a cheery sentence:  “Collapse now and avoid the rush!”  In an age of economic contraction—and behind the current facade of hallucinatory paper wealth, we’re already in such an age—nothing is quite so deadly as the attempt to prop up extravagant lifestyles that the real economy of goods and services will no longer support. Those who thrive in such times are those who downshift ahead of the economy, take the resources that would otherwise be wasted on attempts to sustain the unsustainable, and apply them to the costs of transition to less absurd ways of living. The acronym L.E.S.S.—“Less Energy, Stuff, and Stimulation”—provides a good first approximation of the direction in which such efforts at controlled collapse might usefully move.
The point of this project isn’t limited to its advantages on the personal scale, though these are fairly substantial. It’s been demonstrated over and over again that personal example is far more effective than verbal rhetoric at laying the groundwork for collective change. A great deal of what keeps so many people pinned in the increasingly unsatisfying and unproductive lifestyles sold to them by the media is simply that they can’t imagine a better alternative. Those people who collapse ahead of the rush and demonstrate that it’s entirely possible to have a humane and decent life on a small fraction of the usual American resource footprint are already functioning as early adopters; with every month that passes, I hear from more people—especially young people in their teens and twenties—who are joining them, and helping to build a bridgehead to a world on the far side of the impending crisis.
The second possibility is considerably more complex, and resists summing up so neatly. In a series of posts here  in 2010 and 2011, and then in my book Green Wizardry, I sketched out the toolkit of concepts and approaches that were central to the appropriate technology movement back in the 1970s, where I had my original education in the subjects central to this blog. I argued then, and still believe now, that by whatever combination of genius and sheer dumb luck, the pioneers of that movement managed to stumble across a set of approaches to the work of sustainability that are better suited to the needs of our time than anything that’s been proposed since then.
Among the most important features of what I’ve called the “green wizardry” of appropriate tech is the fact that those who want to put it to work don’t have to wait for the Hillary Clintons of the world to lift a finger. Millions of dollars in government grants and investment funds aren’t necessary, or even particularly useful. From its roots in the Sixties counterculture, the appropriate tech scene inherited a focus on do-it-yourself projects that could be done with hand tools, hard work, and not much money. In an age of economic contraction, that makes even more sense than it did back in the day, and the ability to keep yourself and others warm, dry, fed, and provided with many of the other needs of life without potentially lethal dependencies on today’s baroque technostructures has much to recommend it.
Nor, it has to be said, is appropriate tech limited to those who can afford a farm in the country; many of the most ingenious and useful appropriate tech projects were developed by and for people living in ordinary homes and apartments, with a small backyard or no soil at all available for gardening. The most important feature of appropriate tech, though, is that the core elements of its toolkit—intensive organic gardening and small-scale animal husbandry, homescale solar thermal technologies, energy conservation, and the like—are all things that will still make sense long after the current age of fossil fuel extraction has gone the way of the dinosaurs. Getting these techniques into as many hands as possible now is thus not just a matter of cushioning the impacts of the impending era of crisis; it’s also a way to start building the sustainable world of the future right now.
Those two strategies, collapsing ahead of the rush and exploring the green wizardry of appropriate technology, have been core themes of this blog for quite a while now. There’s a third project, though, that I’ve been exploring here in a more abstract context, and it’s time to talk about how it can be applied to some of the most critical needs of our time.
In the early days of this blog, I pointed out that technological progress has a feature that’s not always grasped by its critics, much less by those who’ve turned faith in progress into the established religion of our time. Very few new technologies actually meet human needs that weren’t already being met, and so the arrival of a new technology generally leads to the abandonment of an older technology that did the same thing. The difficulty here is that new technologies nowadays are inevitably more dependent on global technostructures, and the increasingly brittle and destructive economic systems that support them, than the technologies they replace. New technologies look more efficient than old ones because more of the work is being done somewhere else, and can therefore be ignored—for now.
This is the basis for what I’ve called the externality trap. As technologies get more complex, that complexity allows more of their costs to be externalized—that is to say, pushed onto someone other than the makers or users of the technology. The pressures of a market economy guarantee that those economic actors who externalize more of their costs will prosper at the expense of those who externalize less. The costs thus externalized, though, don’t go away; they get passed from hand to hand like hot potatoes and finally pile up in the whole systems—the economy, the society, the biosphere itself—that have no voice in economic decisions, but are essential to the prosperity and survival of every economic actor, and sooner or later those whole systems will break down under the burden.  Unlimited technological progress in a market economy thus guarantees the economic, social, and/or environmental destruction of the society that fosters it.
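To see why that dynamic tightens rather than correcting itself, consider a minimal toy model. The two-producer setup and every figure in it are invented for illustration; nothing here comes from the post itself:

```python
# Toy model of the externality trap. Two producers have the same true cost per
# unit; one of them pushes more of that cost onto the wider system. All figures
# are invented for illustration only.

TRUE_COST_PER_UNIT = 10.0

def sale_price(externalized_fraction, margin=1.0):
    """Price charged: the costs the producer actually pays, plus a fixed margin."""
    internal_cost = TRUE_COST_PER_UNIT * (1.0 - externalized_fraction)
    return internal_cost + margin

careful = sale_price(externalized_fraction=0.1)   # pays 90% of its real costs
ruthless = sale_price(externalized_fraction=0.6)  # dumps 60% of them on the commons

print(f"careful producer's price:  {careful:.2f}")   # 10.00
print(f"ruthless producer's price: {ruthless:.2f}")  #  5.00

# Buyers go where the price is lower, so the ruthless producer wins market
# share; the costs it skipped don't vanish. If it sells 1,000 units a year:
units_sold = 1000
unpaid_costs = units_sold * TRUE_COST_PER_UNIT * 0.6
print(f"costs dumped on the whole system each year: {unpaid_costs:.0f}")  # 6000
```

Under these made-up numbers the market rewards the producer that externalizes more, and the unpaid costs pile up in systems that had no say in the transaction, which is exactly the mechanism sketched above.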
The externality trap isn’t just a theoretical possibility. It’s an everyday reality, especially but not only in the United States and other industrial societies. There are plenty of forces driving the rising spiral of economic, social, and environmental disruption that’s shaking the industrial world right down to its foundations, but among the most important is precisely the unacknowledged impact of externalized costs on the whole systems that support the industrial economy. It’s fashionable these days to insist that increasing technological complexity and integration will somehow tame that rising spiral of crisis, but the externality trap suggests that exactly the opposite is the case—that the more complex and integrated technologies become, the more externalities they will generate. It’s precisely because technological complexity makes it easy to ignore externalized costs that progress becomes its own nemesis.
Yes, I know, suggesting that progress isn’t infallibly beneficent is heresy, and suggesting that progress will necessarily terminate itself with extreme prejudice is heresy twice over. I can’t help that; it so happens that in most declining civilizations, ours included, the things that most need to be said are the things that, by and large, nobody wants to hear. That being the case, I might as well make it three for three and point out that the externality trap is a problem rather than a predicament. The difference, as longtime readers know, is that problems can be solved, while predicaments can only be faced. We don’t have to keep loading an ever-increasing burden of externalized costs on the whole systems that support us—which is to say, we don’t have to keep increasing the complexity and integration of the technologies that we use in our daily lives. We can stop adding to the burden; we can even go the other way.
Now of course suggesting that, even thinking it, is heresy on the grand scale. I’m reminded of a bit of technofluff in the Canadian media a week or so back that claimed to present a radically pessimistic view of the next ten years. Of course it had as much in common with actual pessimism as lite beer has with a pint of good brown ale; the worst thing the author, one Douglas Coupland, is apparently able to imagine is that industrial society will keep on doing what it’s doing now—though the fact that more of what’s happening now apparently counts as radical pessimism these days is an interesting point, and one that deserves further discussion.
The detail of this particular Dystopia Lite that deserves attention here, though, is Coupland’s dogmatic insistence that “you can never go backward to a lessened state of connectedness.” That’s a common bit of rhetoric out of the mouths of tech geeks these days, to be sure, but it isn’t even remotely true. I know quite a few people who used to be active on social media and have dropped the habit. I know others who used to have allegedly smart phones and went back to ordinary cell phones, or even to a plain land line, because they found that the costs of excess connectedness outweighed the benefits. Technological downshifting is already a rising trend, and there are very good reasons for that fact.
Most people find out at some point in adolescence that there really is such a thing as drinking too much beer. I think a lot of people are slowly realizing that the same thing is true of connectedness, and of the other prominent features of today’s fashionable technologies. One of the data points that gives me confidence in that analysis is the way that people like Coupland angrily dismiss the possibility. Part of his display of soi-disant pessimism is the insistence that within a decade, people who don’t adopt the latest technologies will be dismissed as passive-aggressive control freaks. Now of course that label could be turned the other way just as easily, but the point I want to make here is that nobody gets that bent out of shape about behaviors that are mere theoretical possibilities. Clearly, Coupland and his geek friends are already contending with people who aren’t interested in conforming to the technosphere.
It’s not just geek technologies that are coming in for that kind of rejection, either. These days, in the town where I live, teenagers whose older siblings used to go hotdogging around in cars ten years ago are doing the same thing on bicycles today. Granted, I live in a down-at-the-heels old mill town in the north central Appalachians, but there’s more to it than that. For a lot of these kids, the costs of owning a car outweigh the benefits so drastically that cars aren’t cool any more. One consequence of that shift in cultural fashion is that these same kids aren’t contributing anything like so much to the buildup of carbon dioxide in the atmosphere, or to the other externalized costs generated by car ownership.
I’ve written here already about deliberate technological regression as a matter of public policy. Over the last few months, though, it’s become increasingly clear to me that deliberate technological regression as a matter of personal choice is also worth pursuing. Partly this is because the deathgrip of failed policies on the political and economic order of the industrial world, as mentioned earlier, is tight enough that any significant change these days has to start down here at the grassroots level, with individuals, families, and communities, if it’s going to get anywhere at all; partly, it’s because technological regression, like anything else that flies in the face of the media stereotypes of our time, needs the support of personal example in order to get a foothold; partly, it’s because older technologies, being less vulnerable to the impacts of whole-system disruptions, will still be there meeting human needs when the grid goes down, the economy freezes up, or something really does break the internet, and many of them will still be viable when the fossil fuel age is a matter for the history books.
Still, there’s another aspect, and it’s one that the essay by Douglas Coupland mentioned above managed to hit squarely: the high-tech utopia ballyhooed by the first generation or so of internet junkies has turned out in practice to be a good deal less idyllic, and in fact a good deal more dystopian, than its promoters claimed. All the wonderful things we were supposedly going to be able to do turned out in practice to consist of staring at little pictures on glass screens and pushing buttons, and these are not exactly the most interesting activities in the world, you know. The people who are dropping out of social media and ditching their allegedly smart phones for a less connected lifestyle have noticed this.
What’s more, a great many more people—the kids hotdogging on bikes here in Cumberland are among them—are weighing  the costs and benefits of complex technologies with cold eyes, and deciding that an older, simpler technology less dependent on global technosystems is not just more practical, but also, and importantly, more fun. True believers in the transhumanist cyberfuture will doubtless object to that last point, but the deathgrip of failed ideas on societies in decline isn’t limited to the senile elites mentioned toward the beginning of this post; it can also afflict the fashionable intellectuals of the day, and make them proclaim the imminent arrival of the future’s rising waters when the tide’s already turned and is flowing back out to sea.
I’d like to suggest, in fact, that it’s entirely possible that we could be heading toward a future in which people will roll their eyes when they think of Twitter, texting, 24/7 connectivity, and the rest of today’s overblown technofetishism—like, dude, all that stuff is so twenty-teens! Meanwhile, those of us who adopt the technologies and habits of earlier eras, whether that adoption is motivated by mere boredom with little glass screens or by some more serious set of motives, may actually be on the cutting edge: the early adopters of the Retro Future. We’ll talk about that more in the weeks ahead.

The Burden of Denial

Wed, 2015-04-08 16:29
It occurred to me the other day that quite a few of the odder features of contemporary American culture make perfect sense if you assume that everybody knows exactly what’s wrong and what’s coming as our society rushes, pedal to the metal, toward its face-first collision with the brick wall of the future. It’s not that they don’t get it; they get it all too clearly, and they just wish that those of us on the fringes would quit reminding them of the imminent impact, so they can spend whatever time they’ve got left in as close to a state of blissful indifference as they can possibly manage.  
I grant that this realization probably had a lot to do with the context in which it came to me. I was sitting in a restaurant, as it happens, with a vanload of fellow Freemasons. We’d carpooled down to Baltimore, some of us to receive one of the higher degrees of Masonry and the rest to help with the ritual work, and we stopped for dinner on the way back home. I’ll spare you the name of the place we went; it was one of those currently fashionable beer-and-burger joints where the waitresses have all been outfitted with skirts almost long enough to cover their underwear, bare midriffs, and the sort of push-up bras that made them look uncomfortably like inflatable dolls—an impression that their too obviously scripted jiggle-and-smile routines did nothing to dispel.
Still, that wasn’t the thing that made the restaurant memorable. It was the fact that every wall in the place had television screens on it. By this I don’t mean that there was one screen per wall; I mean that they were lined up side by side right next to each other, covering the upper part of every single wall in the place, so that you couldn’t raise your eyes above head level without looking at one. They were all over the interior partitions of the place, too. There must have been forty of them in one not too large restaurant, each one blaring something different into the thick air, while loud syrupy music spattered down on us from speakers on the ceiling and the waitresses smiled mirthlessly and went through their routines. My burger and fries were tolerably good, and two tall glasses of Guinness will do much to ameliorate even so charmless a situation; still, I was glad to get back on the road.
The thing I’d point out is that all this is quite recent. Not that many years ago, it was tolerably rare to see a TV screen in an American restaurant, and even those bars that had a television on the premises for the sake of football season generally had the grace to leave the thing off the rest of the time. Within the last decade, I’ve watched televisions sprout in restaurants and pubs I used to enjoy, for all the world like buboes on the body of a plague victim: first one screen, then several, then one on each wall, then metastasizing across the remaining space. Meanwhile, along the same lines, people who used to go to coffee shops and the like to read the papers, talk with other patrons, or do anything else you care to name are now sitting in the same coffee shops in total silence, hunched over their allegedly smart phones like so many scowling gargoyles on the walls of a medieval cathedral.
Yes, there were people in the restaurant crouched in the gargoyle pose over their allegedly smart phones, too, and that probably also had something to do with my realization that evening.  It so happens that the evening before my Baltimore trip, I’d recorded a podcast interview with Chris Martenson on his Peak Prosperity show, and he’d described to me a curious response he’d been fielding from people who attended his talks on the end of the industrial age and the unwelcome consequences thereof. He called it “the iPhone moment”—the point at which any number of people in the audience pulled that particular technological toy out of their jacket pockets and waved it at him, insisting that its mere existence somehow disproved everything he was saying.
You’ve got to admit, as modern superstitions go, this one is pretty spectacular. Let’s take a moment to look at it rationally. Do iPhones produce energy? Nope. Will they refill our rapidly depleting oil and gas wells, restock the ravaged oceans with fish, or restore the vanishing topsoil from the world’s fields? Of course not. Will they suck carbon dioxide from the sky, get rid of the vast mats of floating plastic that clog the seas, or do something about the steadily increasing stockpiles of nuclear waste that are going to sicken and kill people for the next quarter of a million years unless the waste gets put someplace safe—if there is anywhere safe to put it at all? Not a chance. As a response to any of the predicaments that are driving the crisis of our age, iPhones are at best irrelevant. Since they consume energy and resources, and the sprawling technosystems that make them function consume energy and resources at a rate orders of magnitude greater, they’re part of the problem, not any sort of a solution.
Now of course the people waving their iPhones at Chris Martenson aren’t thinking about any of these things. A good case could be made that they’re not actually thinking at all. Their reasoning, if you want to call it that, seems to be that the existence of iPhones proves that progress is still happening, and this in turn somehow proves that progress will inevitably bail us out from the impacts of every one of the predicaments we face. To call this magical thinking is an insult to honest sorcerers; rather, it’s another example of the arbitrary linkage of verbal noises to emotional reactions that all too often passes for thinking in today’s America. Readers of classic science fiction may find all this weirdly reminiscent of a scene from some edgily updated version of H.G. Wells’ The Island of Doctor Moreau: “Not to doubt Progress: that is the Law. Are we not Men?”
Seen from a certain perspective, though, there’s a definite if unmentionable logic to “the iPhone moment,” and it has much in common with the metastatic spread of television screens across pubs and restaurants in recent years. These allegedly smart phones don’t do anything to fix the rising spiral of problems besetting industrial civilization, but they make it easier for people to distract themselves from those problems for a little while longer. That, I’d like to suggest, is also what’s driving the metastasis of television screens in the places that people used to go to enjoy a meal, a beer, or a cup of coffee and each other’s company. These days, the latter’s too risky; somebody might mention a friend who lost his job and can’t get another one, a spouse who gets sicker with each overpriced prescription the medical industry pushes on her, a kid who didn’t come back from Afghanistan, or the like, and then it’s right back to the reality that everyone’s trying to avoid. It’s much easier to sit there in silence staring at little colored pictures on a glass screen, from which all such troubles have been excluded.
Of course that habit has its own downsides. To begin with, those who are busy staring at the screens have to know, on some level, that sooner or later it’s going to be their turn to lose their jobs, or have their health permanently wrecked by the side effects their doctors didn’t get around to telling them about, or have their kids fail to come back from whatever America’s war du jour happens to be just then, or the like. That’s why so many people these days put so much effort into insisting as loudly as possible that the poor and vulnerable are to blame for their plight. The people who say this know perfectly well that it’s not true, but repeating such claims over and over again is the only defense they’ve got against the bitter awareness that their jobs, their health, and their lives or those of the people they care about could all too easily be next on the chopping block.
What makes this all the more difficult for most Americans to face is that none of these events are happening in a vacuum.  They’re part of a broader process, the decline and fall of modern industrial society in general and the United States of America in particular. Outside the narrowing circles of the well-to-do, standards of living for most Americans have been declining since the 1970s, along with standards of education, public health, and most of the other things that make for a prosperous and stable society. Today, a nation that once put human bootprints on the Moon can’t afford to maintain its roads and bridges or keep its cities from falling into ruin. Hiding from that reality in an imaginary world projected onto glass screens may be comforting in the short term; the mere fact that realities don’t go away just because they’re ignored does nothing to make this choice any less tempting.
What’s more, the world into which that broader process of decline is bringing us is not one in which staring at little colored pictures on a glass screen will count for much. Quite the contrary, it promises to be a world in which raw survival, among other things, will depend on having achieved at least a basic mastery of one or more of a very different range of skills. There’s no particular mystery about those latter skills; they were, in point of fact, the standard set of basic human survival skills for thousands of years before those glass screens were invented, and they’ll still be in common use when the last of the glass screens has weathered away into sand; but they have to be learned and practiced before they’re needed, and there may not be all that much time left to learn and practice them before hard necessity comes knocking at the door.
I think a great many people who claim that everything’s fine are perfectly aware of all this. They know what the score is; it’s doing something about it that’s the difficulty, because taking meaningful action at this very late stage of the game runs headlong into at least two massive obstacles. One of them is practical in nature, the other psychological, and human nature being what it is, the psychological dimension is far and away the more difficult of the two.
Let’s deal with the practicalities first. The non-negotiable foundation of any meaningful response to the crisis of our time, as I’ve pointed out more than once here, can be summed up conveniently with the acronym L.E.S.S.—that is, Less Energy, Stuff, and Stimulation. We are all going to have much less of these things at our disposal in the future.  Using less of them now frees up time, money, and other resources that can be used to get ready for the inevitable transformations. It also makes for decreased dependence on systems and resources that in many cases are already beginning to fail, and in any case will not be there indefinitely in a future of hard limits and inevitable scarcities.
On the other hand, using L.E.S.S. flies in the face of two powerful forces in contemporary culture. The first is the ongoing barrage of advertising meant to convince people that they can’t possibly be happy without the latest time-, energy-, and resource-wasting trinket that corporate interests want to push on them. The second is the stark shivering terror that seizes most Americans at the thought that anybody might think that they’re poorer than they actually are. Americans like to think of themselves as proud individualists, but like so many elements of the American self-image, that’s an absurd fiction; these days, as a rule, Americans are meek conformists who shudder with horror at the thought that they might be caught straying in the least particular from whatever other people expect of them.
That’s what lies behind the horrified response that comes up the moment someone suggests that using L.E.S.S. might be a meaningful part of our response to the crises of our age. When people go around insisting that not buying into the latest overhyped and overpriced lump of technogarbage is tantamount to going back to the caves—and yes, I field such claims quite regularly—you can tell that what’s going on in their minds has nothing to do with the realities of the situation and everything to do with stark unreasoning fear. Point out that a mere thirty years ago, people got along just fine without email and the internet, and you’re likely to get an even more frantic and abusive reaction, precisely because your listener knows you’re right and can’t deal with the implications.
This is where we get into the psychological dimension. What James Howard Kunstler has usefully termed the psychology of previous investment is a massive cultural force in today’s America. The predicaments we face today are in very large part the product of a long series of really bad decisions that were made over the last four decades or so. Most Americans, even those who had little to do with making those decisions, enthusiastically applauded them, and treated those who didn’t with no small amount of abuse and contempt. Admitting just how misguided those decisions turned out to be thus requires a willingness to eat crow that isn’t exactly common among Americans these days. Thus there’s a strong temptation to double down on the bad decisions, wave those iPhones in the air, and put a few more television screens on the walls to keep the cognitive dissonance at bay for a little while longer.
That temptation isn’t an abstract thing. It rises out of the raw emotional anguish woven throughout America’s attempt to avoid looking at the future it’s made for itself. The intensity of that anguish can be measured most precisely, I think, in one small but telling point: the number of people whose final response to the lengthening shadow of the future is, “I hope I’ll be dead before it happens.”
Think about those words for a moment. It used to be absolutely standard, and not only in America, for people of every social class below the very rich to work hard, save money, and do without so that their children could have a better life than they had. That parents could say to their own children, “I got mine, Jack; too bad your lives are going to suck,” belonged in the pages of lurid dime novels, not in everyday life. Yet that’s exactly what the words “I hope I’ll be dead before it happens” imply.  The destiny that’s overtaking the industrial world isn’t something imposed from outside; it’s not an act of God or nature or callous fate; rather, it’s unfolding with mathematical exactness from the behavior of those who benefit from the existing order of things.  It could be ameliorated significantly if those same beneficiaries were to let go of the absurd extravagance that characterizes what passes for a normal life in the modern industrial world these days—it’s just that the act of letting go involves an emotional price that few people are willing to pay.
Thus I don’t think that anyone says “I hope I’ll be dead before it happens” lightly. I don’t think the people who are consigning their own children and grandchildren to a ghastly future, and placing their last scrap of hope on the prospect that they themselves won’t live to see that future arrive, are making that choice out of heartlessness or malice. The frantic concentration on glass screens, the bizarre attempts to banish unwelcome realities by waving iPhones in their faces, and the other weird behavior patterns that surround American society’s nonresponse to its impending future, are signs of the enormous strain that so many Americans these days are under as they try to keep pretending that nothing is wrong in the teeth of the facts.
Denying a reality that’s staring you in the face is an immensely stressful process, and the stress gets worse as the number of things that have to be excluded from awareness mounts up. These days, that list is getting increasingly long. Look away from the pictures on the glass screens, and the United States is visibly a nation in rapid decline: its cities collapsing, its infrastructure succumbing to decades of malign neglect, its politics mired in corruption and permanent gridlock, its society frayed to breaking, and the natural systems that support its existence passing one tipping point after another and lurching through chaotic transitions.
Oklahoma has passed California as the most seismically active state in the Union as countless gallons of fracking fluid pumped into deep disposal wells remind us that nothing ever really “goes away.” It’s no wonder that so many shrill voices these days are insisting that nothing is wrong, or that it’s all the fault of some scapegoat or other, or that Jesus or the Space Brothers or somebody will bail us out any day now, or that we’re all going to be wiped out shortly by some colorful Hollywood cataclysm that, please note, is never our fault.
There is, of course, another option.
Over the years since this blog first began to attract an audience, I’ve spoken to quite a few people who broke themselves out of that trap, or were popped out of it willy-nilly by some moment of experience just that little bit too forceful to yield to the exclusionary pressure; many of them have talked about how the initial burst of terror—no, no, you can’t say that, you can’t think that!—gave way to an immense feeling of release and freedom, as the burden of keeping up the pretense dropped away and left them able to face the world in front of them at last.
I suspect, for what it’s worth, that a great many more people are going to be passing through that transformative experience in the years immediately ahead. A majority? Almost certainly not; to judge by historical precedents, the worse things get, the more effort will go into the pretense that nothing is wrong at all, and the majority will cling like grim death to that pretense until it drags them under. That said, a substantial minority might make a different choice: to let go of the burden of denial soon enough to matter, to let themselves plunge through those moments of terror and freedom, and to haul themselves up, shaken but alive, onto the unfamiliar shores of the future.
When they get there, there will be plenty of work for them to do. I’ve discussed some of the options in previous posts on this blog, but there’s at least one that hasn’t gotten a detailed examination yet, and it’s one that I’ve come to think may be of crucial importance in the decades ahead. We’ll talk about that next week.

Atlantis Won't Sink, Experts Agree

Wed, 2015-04-01 17:32
If you’re like most Atlanteans these days, you’ve heard all sorts of unnerving claims about the future of our continent. Some people are even saying that recent earth tremors are harbingers of a cataclysm that will plunge Atlantis to the bottom of the sea. Those old prophecies from the sacred scrolls of the Sun Temple have had the dust blown off them again, adding to the stew of rumors.
So is there anything to it? Should you be worried about the future of Atlantis?
Not according to the experts. I visited some of the most widely respected hierarchs here in the City of the Golden Gates yesterday to ask them about the rumors, and they assured me that there’s no reason to take the latest round of alarmist claims at all seriously.
***
My first stop was the temple complex of black orichalcum just outside the Palace of the Ten Kings, where Nacil Buper, Grand Priestess of the Temple of Night, took time out of her busy schedule to meet with me. I asked her what she thought about the rumors of imminent catastrophe. “Complete and utter nonsense,” she replied briskly. “There are always people who want to insist that the end is nigh, and they can always find something to use to justify that sort of thing. Remember a few years ago, when everyone was running around insisting that the end of the Forty-First Grand Cycle of Time was going to bring the destruction of the world? This is more of the same silliness.”
Just at that moment, the floor shook beneath us, and I asked her about the earth tremors, pointing out that those seem to be more frequent than they were just a few years back.
“Atlantis has always had earthquakes,” the Grand Priestess reminded me, gesturing with her scepter of human bone.  “There are natural cycles affecting their frequency, and there’s no proof that they’re more frequent because of anything human beings are doing. In fact, I’m far from convinced that they’re any more frequent than they used to be. There are serious questions about whether the priests of the Sun Temple have been fiddling with their data, you know.”
“And the claim from those old prophecies that offering human sacrifices to Mu-Elortep, Lord of Evil, might have something to do with it?” I asked. 
“That’s the most outrageous kind of nonsense,” the Grand Priestess replied. “Atlanteans have been worshipping the Lord of Evil for more than a century and a half. It’s one of the foundations of our society and our way of life, and we should be increasing the number of offerings to Mu-Elortep as rapidly as we can, not listening to crazies from the fringe who insist that there’s something wrong with slaughtering people for the greater glory of the Lord of Evil. We can’t do without Mu-Elortep, not if we’re going to restore Atlantis to full prosperity and its rightful place in the world order, and if that means sacrifices have to be made—and it does—then sacrifices need to be made.”
She leaned forward confidentially, and her necklace of infant’s skulls rattled. “You know as well as I do that all this is just another attempt by the Priests of the Sun to dodge their responsibility for their own bad policies. Nobody would care in the least about all these crazy rumors of imminent doom if the Sun Priest Erogla hadn’t made such a fuss about the old prophecies in the scrolls of the Sun Temple a few years back. The Sun Temple’s the real problem we face. Fortunately, though, we of the Temple of Night have a majority in the Council of the Ten Kings now. We’re working on legislation right now to eradicate poverty in Atlantis by offering up the poor to Mu-Elortep in one grand bonfire. Once that’s done, I’m convinced, Atlantis will be on the road to a full recovery.”
***
After my conversation with the Grand Priestess, I went uphill to the foot of the Sacred Mountain, where the Sun Temple rises above the golden-roofed palaces of the Patricians of Atlantis. I had made an appointment to see Tarc Omed, the Hierophant of the Priests of the Sun; he met me in his private chamber, and had his servants pour us purple wine from Valusia as we talked.
“I know the kind of thing you must have heard from the Temple of Night,” the Hierophant said wearily. “It’s all our fault the economy’s in trouble. Everything’s our fault. That’s how they avoid responsibility for the consequences of the policies they’ve been pursuing for decades now.”
I asked him what he thought of Nacil Buper’s claim that offering up the poor as human sacrifices would solve all the problems Atlantis faces these days.
“Look,” he said, “everybody knows that we’ve got to wean ourselves off making human sacrifices to the Lord of Evil one of these days. There’s no way we can keep that up indefinitely, and it’s already causing measurable problems. That’s why we’re proposing increased funding for more sustainable forms of worship directed toward other deities, so we can move step by step to a society that doesn’t have to engage in human sacrifice or deal with Mu-Elortep at all.”
And the ground tremors? Do they have anything to do with the sacrifices?
“That’s a good question. It’s hard to say whether any particular burst of tremors is being caused by the prophesied curse, you know, but that’s no reason for complacency.”
A tremor shook the room, and we both steadied our golden goblets of wine on the table. “Doesn’t that lend support to the rumors that Atlantis might sink soon?” I asked.
Tarc Omed looked weary again, and leaned back in his great chair of gold and ivory. “We have to be realistic,” he said. “Right now, Atlantean society depends on human sacrifice, and transitioning away from that isn’t something we can do overnight. We need to get those more sustainable forms of worship up and running first, and that can’t be done without negotiated compromises and the support of as many stakeholders as possible. Alarmism doesn’t further that.”
I thought of one of the things Nacil Buper had said. “But aren’t the prophecies of doom we’re discussing right there in the sacred scrolls of the Sun Temple?”
“We don’t consider that relevant just now,” the Hierophant told me firmly. “What matters right at the moment is to build a coalition strong enough to take back a majority in the Council of the Ten Kings, stop the Temple of Night’s crazy plan to sacrifice all of the poor to Mu-Elortep, and make sure that human sacrifices are conducted in as painless and sanitary a fashion as possible and increased only at the rate that’s really necessary, while we work toward phasing out human sacrifice altogether. Of course we can’t continue on our current path, but I have faith that Atlanteans can and will work together to stop any sort of worst-case scenario from happening.”
***
From the Temple of the Sun I walked out of the patrician district, into one of the working class neighborhoods overlooking the Old Harbor. The ground shook beneath my feet a couple of times as I went. People working in the taverns and shops looked up at the Sacred Mountain each time, and then went back to their labor. It made me feel good to know that their confidence was shared by both the hierarchs I’d just interviewed.
I decided to do some person-in-the-street interviews for the sake of local color, and stepped into one of the taverns. Introducing myself to the patrons as a reporter, I asked what they thought about the rumors of disaster and the ongoing earth tremors.
“Oh, I’m sure the Priests of the Sun will think of something,” one patron said. I wrote that down on my wax tablet.
“Yeah,” agreed another. “How long have these prophecies been around? And Atlantis is still above water, isn’t it? I’m not worried.”
“I used to believe that stuff back in the day,” said a third patron. “You know, you buy into all kinds of silly things when you’re young and gullible, then grow out of it once it’s time to settle down and deal with the real world.  I sure did.”
That got nods and murmurs of approval all around. “I honestly think a lot of the people who are spreading these rumors actually want Atlantis to sink,” the third patron went on. “All this obsessing about those old prophecies and how awful human sacrifice is—I mean, can we get real, please?”
“You can say that again,” said the second patron. “I bet they do want Atlantis to sink. I bet they’re actually Lemurian sympathizers.”
The third patron turned to look at him.  “You know, that would make a lot of sense—”
Just then another tremor, a really strong one, shook the tavern. The whole room went dead silent for a moment. As the tremor died down, everybody started talking loudly all at once. I said my goodbyes and headed for the door.
As I stopped outside to put my wax tablet into the scribe’s case on my belt, one of the other patrons—a woman who hadn’t said anything—came through the door after me. “If you’re looking for a different point of view,” she told me, “you ought to go down to the Sea Temple. They’ll give you an earful.”
I thanked her, and started downhill toward the Old Harbor.
***
I’d never been to the Sea Temple before; I don’t think most Atlanteans ever go there, though it’s been right there next to the Old Harbor since time out of mind. When I got there, the big doors facing the harbor were wide open, but the place seemed empty; the only sounds were the flapping of the big blue banners above the temple and the cries of sea birds up overhead.
As another tremor rattled the city, I walked in through the open doors. I didn’t see anyone at first, but after a few moments a woman in the blue robes of a Sea Priestess came out of the sanctuary further inside and hurried toward me. She had a basket of scrolls in her arms.
I introduced myself, explained that I was a journalist, and asked if she minded answering some questions.
“Not if you don’t mind walking with me to the harbor,” she said. “I’m in a bit of a hurry.”
“Sure,” I told her. “So what do you think about all these scary rumors? Do you really think Atlantis could end up underwater?”
We left the temple and started across the plaza outside, toward the harbor. “Have you read the prophecies of Emor Fobulc?” she asked me.
“Can’t say I have.”
“They predicted everything that’s happened: the rise of the cult of Mu-Elortep, the sacrifices, the earth tremors, and now the Sign.”
“The what?”
“When’s the last time you looked at the top of the Sacred Mountain?”
I stopped and looked right then. There was a plume of smoke rising from the great rounded peak. After a moment, I hurried to catch up to her.
“That’s the Sign,” she told me. “It means that the fires of Under-Earth have awakened and Atlantis will soon be destroyed.”
“Seriously?”
“Seriously.”
I thought about it for a moment as we walked, and the ground shook beneath our feet. “There could be plenty of other explanations for that smoke, you know.”
The priestess looked at me for a long moment. “No doubt,” she said dryly. 
By then we were near the edge of the quay, and half a dozen people came hurrying down the gangplank from a ship that was tied up there, an old-fashioned sailing vessel with a single mast and the prow carved to look like a swan. One of them, a younger priestess, bowed, took the basket of scrolls, and hurried back on board the ship. Another, who was dressed like a mariner, bowed too, and said to the priestess I’d spoken with, “Is there anything else, Great Lady?”
“Nothing,” she said. “We should go.” She turned to me. “You may come with us if you wish.”
“I need to have this story back to the pressroom before things shut down this afternoon,” I told her. “Are you going to be coming back within two hours or so?”
I got another of her long silent looks. “No,” she said. “We’ll be much longer than that.”
“Sorry, then—I hate to turn down a cruise, but work is work.”
She didn’t have anything to say to that, and the others more or less bundled her up the gangplank onto the ship. A couple of sailors untied the cables holding the ship against the quay and then climbed on board before it drifted away. A few minutes later the ship was pulling out into the Old Harbor; I could hear the oarsmen belowdecks singing one of their chanteys while the sailors climbed aloft and got the sail unfurled and set to the breeze.
After a few more minutes, I turned and started back up the hill toward the middle of town. As I climbed the slope, I could see more and more of the City of the Golden Gates around me in the afternoon sun: the Palace of the Ten Kings with the Temple of Night beside it, the Sun Temple and the golden roofs of the patricians’ palaces higher up the slope. The ground was shaking pretty much nonstop, but I barely noticed it, I’d gotten so used to the tremors.
The view got better as I climbed. Below, the Old Harbor spread out to one side and the New Harbor to the other. Next to the New Harbor was the charnel ground of Elah-Slio, where smoke was rising from the altars and long lines of victims were being driven forward with whips to be offered up as sacrifices to Mu-Elortep; off the other way, beyond the Old Harbor, I spotted twenty or so sails in the middle distance, heading away from Atlantis, and the ship with the priestess on it hurrying to join them.
That’s when it occurred to me that the Sea Priestess couldn’t have been serious when she said that Atlantis would soon be destroyed. Surely, if the prophecies were true, the Sea Priestesses would have had more important things to do than go on some kind of long vacation cruise. I laughed at how gullible I’d been there for a moment, and kept climbing the hill into the sunlight.
Above the Sacred Mountain, the cloud of smoke had gotten much bigger, and it looked as though some kind of red glow was reflecting off the bottom of it. I wondered what that meant, but figured I’d find out from the news soon enough. It certainly made me feel good to know that there was no reason whatever to worry about the far-fetched notion that Atlantis might end up at the bottom of the sea.

(Note: due to a date-linked transtemporal anomaly, this week’s planned Archdruid Report post got switched with a passage from the Swenyliad, an Atlantean chronicle dating from 9613 BCE. We apologize for any inconvenience.)

Planet of the Space Bats

Wed, 2015-03-25 17:16
As my regular readers know, I’ve been talking for quite a while now here about the speculative bubble that’s built up around the fracking phenomenon, and the catastrophic bust that’s guaranteed to follow so vast and delusional a boom. Over the last six months or so, I’ve noted the arrival of one warning sign after another of the impending crash. As the saying has it, though, it’s not over ‘til the fat lady sings, so I’ve been listening for the first notes of the metaphorical aria that, in the best Wagnerian style, will rise above the orchestral score as the fracking industry’s surrogate Valhalla finally bursts into flames and goes crashing down into the Rhine.
 
I think I just heard those first high notes, though, in an improbable place: the email inbox of the Ancient Order of Druids in America (AODA), the Druid order I head.
I have no idea how many of my readers know the first thing about my unpaid day job as chief executive—the official title is Grand Archdruid—of one of the two dozen or so Druid orders in the western world. Most of what goes into that job, and the admittedly eccentric minority religious tradition behind it, has no relevance to the present subject. Still, I think most people know that Druids revere the natural world, and take ecology seriously even when that requires scrapping some of the absurd extravagances that pass for a normal lifestyle these days. Thus a Druid order is arguably the last place that would come to mind if you wanted to sell stock in a fracking company.
Nonetheless, that’s what happened. The bemused AODA office staff the other day fielded a solicitation from a stock firm trying to get Druids to invest their assets in the fracking industry.
Does that sound like a desperation move to you, dear reader? It certainly does to me—and there’s good reason to think that it probably sounds that way to the people who are trying to sell shares in fracking firms to one final round of clueless chumps, too. A recent piece in the Wall Street Journal (available outside the paywall here) noted that American banks have suddenly found themselves stuck with tens of millions of dollars’ worth of loans to fracking firms which they hoped to package up and sell to investors—but suddenly nobody’s buying. Bankruptcies and mass layoffs are becoming an everyday occurrence in the fracking industry, and the price of oil continues to lurch down as producers maximize production for the sake of immediate cash flow.
Why, though, isn’t the drop in the price of oil being met by an upsurge in consumption that drives the price back up, as the accepted rules of economics would predict? That’s the cream of the jest. Here in America, and to a lesser extent elsewhere in the industrial world, four decades of enthusiastically bipartisan policies that benefited the rich at everyone else’s expense managed to prove Henry Ford’s famous argument: if you don’t pay your own employees enough that they can afford to buy your products, sooner or later, you’re going to go broke.
By driving down wages and forcing an ever larger fraction of the US population into permanent unemployment and poverty, the movers and shakers of America’s political class have managed to trigger a classic crisis of overproduction, in which goods go begging for buyers because too few people can afford to buy them at any price that will pay for their production. It’s not just oil that’s affected, either: scores of other commodities are plunging in price as the global economy tips over into depression. There’s a specter haunting the industrial world; it’s the ghost of Karl Marx, laughing with mordant glee as the soi-disant masters of the universe, having crushed his misbegotten Soviet stepchildren, go all out to make his prophecy of capitalism’s self-immolation look remarkably prescient.
The soaring price of crude oil in the wake of the 2005 global peak of conventional oil production should have served notice to the industrial world that, to adapt the title of Richard Heinberg’s excellent 2003 summary of the situation, the party was over: the long era in which energy supplies had increased year over year was giving way to an unwelcome new reality in which decreasing energy supplies and increasing environmental blowback were the defining themes. As my readers doubtless noticed, though, the only people willing to grasp that were out here on the fringes where archdruids lurk. Closer to the mainstream of our collective thinking, most people scrunched shut their eyes, plugged their ears with their fingers, and shouted “La, la, la, I can’t hear you” at the top of their lungs, in a desperate attempt to keep reality from getting a word in edgewise.
For the last five years or so, any attempt to talk about the impending twilight of the age of oil thus ran headfirst into a flurry of pro-fracking propaganda. Fatuous twaddle about America’s inevitable future as the world’s new energy superpower took the place of serious discussions of the predicament into which we’ve backed ourselves—and not for the first time, either. That’s what makes the attempt to get Druids to invest their life savings in fracking so funny, in a bleak sort of way: it’s an attempt to do for the fracking boom what the fracking boom attempted to do for industrial civilization as a whole—to pretend, in the teeth of the facts, that the unsustainable can be sustained for just a little while longer.
A few months back, I decided to celebrate this sort of thinking by way of the grand old Druid custom of satire. The Great Squirrel Case Challenge of 2015 solicited mock proposals for solving the world’s energy problems that were even nuttier than the ones in the mainstream media. That was no small challenge—a detail some of my readers pointed up by forwarding any number of clueless stories from the mainstream media loudly praising energy boondoggles of one kind or another.
I’m delighted to say, though, that the response was even better than I’d hoped for. The contest fielded more than thirty entries, ranging from the merely very good to the sidesplittingly funny. There were two winners, one chosen by the members of the Green Wizards forum, one chosen by me; in both cases, it was no easy choice, and if I had enough author’s copies of my new book After Progress, I’d probably just up and give prizes to all the entries, they were that good. Still, it’s my honor to announce the winners:
My choice for best squirrel case—drumroll, please—goes to Steve Morgan, for his fine gosh-wow sales prospectus for, ahem, Shares of Hydrocarbons Imported from Titan. The Green Wizards forum choice—drumroll again—goes to Jason Heppenstall for his hilarious parody of a sycophantic media story, King Solomon’s Miners. Please join me in congratulating them. (Steve and Jason, drop me a comment with your mailing addresses, marked not for posting, and I’ll get your prizes on the way.)
Their hard-won triumph probably won’t last long. In the months and years ahead, I expect to see claims even more ludicrous being taken oh-so-seriously by the mainstream media, because the alternative is to face up to just how badly we’ve bungled the opportunities of the last four decades or so and just how rough a road we have ahead of us as a result. What gave the fracking bubble whatever plausibility it ever had, after all, was the way it fed on one of the faith-based credos at the heart of contemporary popular culture: the insistence, as pervasive as it is irrational, that the universe is somehow obligated to hand us abundant new energy sources to replace the ones we’ve already used so profligately. Lacking that blind faith, it would have been obvious to everyone—as it was to those of us in the peak oil community—that the fracking industry was scraping the bottom of the barrel and pretending that this proved the barrel was full.
Read the morning news with eyes freed from the deathgrip of the conventional wisdom and it’s brutally obvious that that’s what happened, and that the decline and fall of our civilization is well under way. Here in the US, a quarter of the country is in the fourth year of record drought, with snowpack on California’s Sierra Nevada mountains about 9% of normal; the Gulf Stream is slowing to a crawl due to the rapid melting of the Greenland ice sheets; permanent joblessness and grinding poverty have become pervasive in this country; the national infrastructure is coming apart after decades of malign neglect—well, I could go on; if you want to know what life is like in a falling civilization, go look out the window.
In the mainstream media, on the occasions when such things are mentioned at all, they’re treated as disconnected factoids irrelevant to the big picture. Most people haven’t yet grasped that these things are the big picture—that while we’re daydreaming about an assortment of shiny futures that look more or less like the present with more toys, climate change, resource depletion, collapsing infrastructure, economic contraction, and the implosion of political and cultural institutions are creating the future we’re going to inhabit. Too many of us suffer from a weird inability to imagine a future that isn’t simply a continuation of the present, even when such a future stands knocking at our own front doors.
So vast a failure of imagination can’t be overcome by the simple expedient of pointing out the ways that it’s already failed to explain the world in which we live. That said, there are other ways to break the grip of the conventional wisdom, and I’m pleased to say that one of those other ways seems to be making modest but definite headway just now.
Longtime readers here will remember that in 2011, this blog launched a contest for short stories about the kind of future we can actually expect—a future in which no deus ex machina saves industrial civilization from the exhaustion of its resource base, the deterioration of the natural systems that support it, and the normal process of decline and fall. That contest resulted in an anthology, After Oil: SF Stories of a Post-Petroleum Future, which found a surprisingly large audience. On the strength of its success, I ran a second contest in 2014, which resulted in two more volumes—After Oil 2: The Years of Crisis, which is now available, and After Oil 3: The Years of Rebirth, which is in preparation. Demand for the original volume has remained steady, and the second is selling well; after a conversation with the publisher, I’m pleased to announce that we’re going to do it again, with a slight twist.
The basic rules are mostly the same as before:
Stories should be between 2500 and 7500 words in length;
They should be entirely the work of their author or authors, and should not borrow characters or setting from someone else’s work;
They should be in English, with correct spelling, grammar and punctuation;
They should be stories—narratives with a plot and characters—and not simply a guided tour of some corner of the future as the author imagines it;
They should be set in our future, not in an alternate history or on some other planet;
They should be works of realistic fiction or science fiction, not magical or supernatural fantasy—that is, the setting and story should follow the laws of nature as those are presently understood;
They should take place in settings subject to thermodynamic, ecological, and economic limits to growth; and as before,
They must not rely on “alien space bats”—that is, dei ex machina inserted to allow humanity to dodge the consequences of the limits to growth. (Aspiring authors might want to read the whole “Alien Space Bats” post for a more detailed explanation of what I mean here; reading the stories from one or both of the published After Oil volumes might also be a good plan.)
This time, though, I’m adding an additional rule:
Stories submitted for this contest must be set at least one thousand years in the future—that is, after March 25, 3015 in our calendar.
That’s partly a reflection of a common pattern in entries for the two previous contests, and partly something deeper. The common pattern? A great many authors submitted stories that were set during or immediately after the collapse of industrial civilization; there’s certainly room for those, enough so that the entire second volume is basically devoted to them, but tales of surviving decline and fall are only a small fraction of the galaxy of potential stories that would fit within the rules listed above.  I’d like to encourage entrants to consider telling something different, at least this time.
The deeper dimension? That’s a reflection of the blindness of the imagination discussed earlier in this post, the inability of so many people to think of a future that isn’t simply a prolongation of the present. Stories set in the immediate aftermath of our civilization don’t necessarily challenge that, and I think it’s high time to start talking about futures that are genuinely other—neither utopia nor oblivion, but different, radically different, from the linear extrapolations from the present that fill so many people’s imaginations these days, and have an embarrassingly large role even in science fiction.
You have to read SF from more than a few decades back to grasp just how tight the grip of a single linear vision of the future has become on what used to be a much more freewheeling literature of ideas. In book after book, and even more in film after film, technologies that are obviously derived from ours, ideologies that are indistinguishable from ours, political and economic arrangements that could pass for ours, and attitudes and ideas that belong to this or that side of today’s cultural struggles get projected onto the future as though they’re the only imaginable options. This takes place even when there’s very good reason to think that the linear continuation of current trends isn’t an option at all—for example, the endlessly regurgitated, done-to-death trope of interstellar travel.
Let us please be real:  we aren’t going to the stars—not in our lifetimes, not in the lifetime of industrial civilization, not in the lifetime of our species. There are equally  good thermodynamic and economic reasons to believe that many of the other standard tropes of contemporary science fiction are just as unreachable—that, for example, limitless energy from gimmicks of the dilithium-crystal variety, artificial intelligences capable of human or superhuman thought, and the like belong to fantasy, not to the kind of science fiction that has any likelihood of becoming science fact. Any of my readers who want to insist that human beings can create anything they can imagine, by the way, are welcome to claim that, just as soon as they provide me with a working perpetual motion machine.
It’s surprisingly common to see people insist that the absence of the particular set of doodads common to today’s science fiction would condemn our descendants to a future of endless boredom. This attitude shows a bizarre stunting of the imagination—not least because stories about interstellar travel normally end up landing the protagonists in a world closely modeled on some past or present corner of the Earth. If our genus lasts as long as the average genus of vertebrate megafauna, we’ve got maybe ten million years ahead of us, or roughly two thousand times as long as all of recorded human history to date: more than enough time for human beings to come up with a dazzling assortment of creative, unexpected, radically different societies, technologies, and ways of facing the universe and themselves.
That’s what I’d like to see in submissions to this year’s Space Bats challenge—yes, it’ll be an annual thing from here on out, as long as the market for such stories remains lively. A thousand years from now, industrial civilization will be as far in the past as the Roman Empire was at the time of the Renaissance, and new human societies will have arisen to pass their own judgment on the relics of our age. Ten thousand years from now, or ten million? Those are also options. Fling yourself into the far future, far enough that today’s crises are matters for the history books, or tales out of ancient myth, or forgotten as completely as the crises and achievements of the Neanderthal people are today, and tell a story about human beings (or, potentially, post-human beings) confronting the challenges of their own time in their own way. Do it with verve and a good readable style, and your story may be one of the ones chosen to appear in the pages of After Oil 4: The Future’s Distant Shores.
The mechanics are pretty much the same as before. Write your story and post it to the internet—if you don’t have a blog, you can get one for free from Blogspot or Wordpress. Post a link to it in the comments to The Archdruid Report. You can write more than one story, but please let me know which one you want entered in the competition—there will be only one entry accepted per author this time. Stories must be written and posted online, and a link posted to this blog, by August 30, 2015 to be eligible for inclusion in the anthology.

The View From Outside

Wed, 2015-03-18 17:25
Recently I’ve been reacquainting myself with the stories of Clark Ashton Smith. Though he’s largely forgotten today, Smith was one of the leading lights of Weird Tales magazine during its 1930s golden age, ranking with H.P. Lovecraft and Robert Howard as a craftsman of fantasy fiction. Like Lovecraft, Howard, and most of the other authors in the Weird Tales stable, Smith was an outsider; he spent his life in a small town in rural California; he was roundly ignored by the literary scene of his day, and returned the favor with gusto. With the twilight of the pulps, Smith’s work was consigned to the dustbin of literary history. It was revived briefly during the fantasy boom of the 1970s, only to sink from sight again when the fantasy genre drowned in a swamp of faux-medieval clichés thereafter.
There’s no shortage of reasons to give Smith another look today, starting with his mastery of image and atmosphere and the wry humor that shaped the best of his mature work. Still, that’s a theme for another time, and possibly another forum. The theme that’s relevant to this blog is woven into one of  Smith’s classic stories, The Dark Age. First published in 1938, it’s among the earliest science fiction stories I know of that revolves around an organized attempt to preserve modern science through a future age of barbarism.
The story’s worth reading in its own right, so I won’t hand out spoilers here. Still, I don’t think it will give away anything crucial to mention that one of the mainsprings of the story is the inability of the story’s scientists to find or make common ground with the neo-barbarian hill tribes around them. That aspect of the story has been much on my mind of late. Despite the rockets and rayguns that provide so much of its local color, science fiction is always about the present, which it displays in an unfamiliar light by showing a view from outside, from the distant perspective of an imaginary future.
That’s certainly true of Smith’s tale, which drew much of its force at the time of its composition from the widening chasm between the sciences and the rest of human culture that C.P. Snow discussed two decades later in his famous work “The Two Cultures.” That chasm has opened up a good deal further since Smith’s time, and its impact on the future deserves discussion here, not least because it’s starting to come into sight even through the myopic lenses of today’s popular culture.
I’m thinking here, for example, of a recent blog post by Scott Adams, the creator of the “Dilbert” comic strip. There’s a certain poetic justice in seeing popular culture’s acknowledged expert on organizational failure skewer one of contemporary science’s more embarrassing habits, but there’s more to the spectacle than a Dilbertesque joke. As Adams points out, there’s an extreme mismatch between the way that science works and the way that scientists expect their claims to be received by the general public. Within the community of researchers, the conclusions of the moment are, at least in theory, open to constant challenge—but only from within the scientific community.
The general public is not invited to take part in those challenges. Quite the contrary, it’s supposed to treat the latest authoritative pronouncement as truth pure and simple, even when that contradicts the authoritative pronouncements of six months before. Now of course there are reasons why scientists might not want to field a constant stream of suggestions and challenges from people who don’t have training in relevant disciplines, but the fact remains that expecting people to blindly accept whatever scientists say about nutrition, when scientific opinion on that subject has been whirling around like a weathercock for decades now, is not a strategy with a long shelf life. Sooner or later people start asking why they should take the latest authoritative pronouncement seriously, when so many others landed in the trash can of discarded opinions a few years further on.
There’s another, darker reason why such questions are increasingly common just now. I’m thinking here of the recent revelation that the British scientists tasked by the government with making dietary recommendations have been taking payola of various kinds from the sugar industry.  That’s hardly a new thing these days. Especially but not only in those branches of science concerned with medicine, pharmacology, and nutrition, the prostitution of the scientific process by business interests has become an open scandal. When a scientist gets behind a podium and makes a statement about the safety or efficacy of a drug, a medical treatment, or what have you, the first question asked by an ever-increasing number of people outside the scientific community these days is “Who’s paying him?”
It would be bad enough if that question was being asked because of scurrilous rumors or hostile propaganda. Unfortunately, it’s being asked because there’s nothing particularly unusual about the behavior of the British scientists mentioned above. These days, in any field where science comes into contact with serious money, scientific studies are increasingly just another dimension of marketing. From influential researchers being paid to put their names on dubious studies to give them unearned credibility to the systematic concealment of “outlying” data that doesn’t support the claims made for this or that lucrative product, the corruption of science is an ongoing reality, and one that existing safeguards within the scientific community are not effectively countering.
Scientists have by and large treated the collapse in scientific ethics as an internal matter. That’s a lethal mistake, because the view that matters here is the view from outside. What looks to insiders like a manageable problem that will sort itself out in time, looks from outside the laboratory and the faculty lounge like institutionalized corruption on the part of a self-proclaimed elite whose members cover for each other and are accountable to no one. It doesn’t matter, by the way, how inaccurate that view is in specific cases, how many honest men and women are laboring at lab benches, or how overwhelming the pressure to monetize research that’s brought to bear on scientists by university administrations and corporate sponsors: none of that finds its way into the view from outside, and in the long run, the view from outside is the one that counts.
The corruption of science by self-interest is an old story, and unfortunately it’s most intense in those fields where science impacts the lives of nonscientists most directly:  yes, those would be medicine, pharmacology, and nutrition. I mentioned in an earlier blog post here a friend whose lifelong asthma, which landed her in the hospital repeatedly and nearly killed her twice, was cured at once by removing a common allergen from her diet. Mentioning this to her physician led to the discovery that he’d known about the allergy issue all along, but as he explained, “We prefer to medicate for that.” Understandably so, as a patient who’s cured of an ailment is a good deal less lucrative for the doctor than one who has to keep on receiving regular treatments and prescriptions—but as a result of that interaction among others, the friend in question has lost most of what respect she once had for mainstream medicine, and is now learning herbalism to meet her health care needs.
It’s an increasingly common story these days, and I could add plenty of other accounts here. The point I want to make, though, is that it’s painfully obvious that the physician who preferred to medicate never thought about the view from outside. I have no way of knowing what combination of external pressures and personal failings led him to conceal a less costly cure from my friend, and keep her on expensive and ineffective drugs with a gallery of noxious side effects instead, but from outside the walls of the office, it certainly looked like a callous betrayal of whatever ethics the medical profession might still have left—and again, the view from outside is the one that counts.
It counts because institutional science only has the authority and prestige it possesses today because enough of those outside the scientific community accept its claim to speak the truth about nature. Not that many years ago, all things considered, scientists didn’t have the authority or the prestige, and no law of nature or of society guarantees that they’ll keep either one indefinitely. Every doctor who would rather medicate than cure, every researcher who treats conflicts of interest as just another detail of business as usual, every scientist who insists in angry tones that nobody without a Ph.D. in this or that discipline is entitled to ask why this week’s pronouncement should be taken any more seriously than the one it just disproved—and let’s not even talk about the increasing, and increasingly public, problem of overt scientific fraud in the pharmaceutical field among others—is hastening the day when modern science is taken no more seriously by the general public than, say, academic philosophy is today.
That day may not be all that far away. That’s the message that should be read, and is far too rarely read, in the accelerating emergence of countercultures that reject the authority of science in one field or another. As a recent and thoughtful essay in Slate pointed out, that crisis of authority is what gives credibility to such movements as climate denialism and “anti-vaxxer” activism (the growing number of parents who refuse to have their children vaccinated). A good many people these days, when the official voices of the scientific community say this or that, respond by asking “Why should we believe you?”—and too many of them don’t get a straightforward answer that addresses their concerns.
A bit of personal experience from a different field may be relevant here. Back in the late 1980s and early 1990s, when I lived in Seattle, I put a fair amount of time into collecting local folklore concerning ghosts and other paranormal phenomena. I wasn’t doing this out of any particular belief, or for that matter any particular unbelief; I was seeking a sense of the mythic terrain of the Puget Sound region, the landscapes of belief and imagination that emerged from the experiences of people on the land, with an eye toward the fiction-writing career I then hoped to launch. While I was doing this research, when something paranormal was reported anywhere in the region, I generally got to hear about it fairly quickly, and in the process I got to watch a remarkable sequence of events that repeated itself like a broken record in more cases than I can count.
Whether the phenomenon that was witnessed was an unusual light in the sky, a seven-foot-tall hairy biped in the woods, a visit from a relative who happened to be dead at the time, or what have you, two things followed promptly once the witness went public. The first was the arrival of a self-proclaimed skeptic, usually a member of CSICOP (the Committee for Scientific Investigation of Claims of the Paranormal), who treated the witness with scorn and condescension, made dogmatic claims about what must have happened, and responded to any disagreement with bullying and verbal abuse. The other thing that followed was the arrival of an investigator from one of the local paranormal-research organizations, who was invariably friendly and supportive, listened closely to the account of the witness, and took the incident seriously. I’ll let you guess which of the proposed explanations the witness usually ended up embracing, not to mention which organization he or she often joined.
The same process on a larger and far more dangerous scale is shaping attitudes toward science across a wide and growing sector of American society. Notice that unlike climate denialism, the anti-vaxxer movement isn’t powered by billions of dollars of industry money, yet it’s getting increasing traction anyway. The reason is as simple as it is painful: parents are asking physicians and scientists, “How do I know this substance you want to put into my child is safe?”—and the answers they’re getting are not providing them with the reassurance they need.
It’s probably necessary here to point out that I’m no fan of the anti-vaxxer movement. Since epidemic diseases are likely to play a massive role in the future ahead of us, I’ve looked into anti-vaxxer arguments with some care, and they don’t convince me at all. It’s clear from the evidence that vaccines provide protection against dangerous diseases far more often than not; while some children are harmed by the side effects of vaccination, that’s true of every medical procedure, and the toll from side effects is orders of magnitude smaller than the annual burden of deaths from those same diseases in the pre-vaccination era.
Nor does the anti-vaxxer claim that vaccines cause autism hold water. (I have Asperger’s syndrome, so the subject’s of some personal interest to me.) The epidemiology of autism spectrum disorders simply doesn’t support that claim; to my educated-layperson’s eyes, at least, it matches that of an autoimmune disease instead, complete with the rapid increase in prevalence in recent years. The hypothesis I’d be investigating now, if I’d gone into biomedical science rather than the history of ideas, is that autism spectrum disorders are sequelae of an autoimmune disease that strikes in infancy or early childhood and causes damage to any of a variety of regions in the central nervous system—thus the baffling diversity of neurological deficits found in those of us on the autism spectrum.
Whether that’s true or not will have to be left to trained researchers. The point that I want to make here is that I don’t share the beliefs that drive the anti-vaxxer movement. Similarly, I’m sufficiently familiar with the laws of thermodynamics and the chemistry of the atmosphere to know that when the climate denialists insist that dumping billions of tons of carbon dioxide into the atmosphere can’t change its capacity to retain heat, they’re smoking their shorts. I’ve retained enough of a childhood interest in paleontology, and studied enough biology and genetics since then, to be able to follow the debates between evolutionary biology and so-called “creation science,” and I’m solidly on Darwin’s side of the bleachers. I could go on; I have my doubts about a few corners of contemporary scientific theory, but then so do plenty of scientists.
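For readers who want to see the arithmetic behind that claim about carbon dioxide, a standard approximation from the climatology literature (a gloss added here, not part of the original argument) gives the additional radiative forcing when the atmospheric carbon dioxide concentration rises from $C_0$ to $C$:

$$\Delta F \approx 5.35\,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W/m^2}$$

Doubling the concentration thus traps roughly 3.7 extra watts per square meter of the planet’s surface, around the clock, which is anything but a negligible change in the atmosphere’s capacity to retain heat.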
That is to say, I don’t agree with the anti-vaxxers, the climate denialists, the creationists, or their equivalents, but I think I understand why they’ve rejected the authority of science, and it’s not because they’re ignorant cretins, much as the proponents and propagandists of science would like to claim that. It’s because they’ve seen far too much of the view from outside. Parents who encounter a medical industry that would rather medicate than heal are more likely to listen to anti-vaxxers; Americans who watch climate change activists demand that the rest of the world cut its carbon footprint, while the activists themselves get to keep their cozy middle-class lifestyles, are more likely to believe that global warming is a politically motivated hoax; Christians who see atheists using evolution as a stalking horse for their ideology are more likely to turn to creation science—and all three, and others, are not going to listen to scientists who insist that they’re wrong, until and unless the scientists stop and take a good hard look at how they and their proclamations look when viewed from outside.
I’m far from sure that anybody in the scientific community is willing to take that hard look. It’s possible; these days, even committed atheists are starting to notice that whenever Richard Dawkins opens his mouth, twenty people who were considering atheism decide to give God a second chance. The arrogant bullying that used to be standard practice among the self-proclaimed skeptics and “angry atheists” has taken on a sullen and defensive tone recently, as though it’s started to sink in that yelling abuse at people who disagree with you might not be the best way to win their hearts and minds. Still, for that same act of reflection to get any traction in the scientific community, a great many people in that community are going to have to rethink the way they deal with the public, especially when science, technology, and medicine cause harm. That, in turn, is only going to happen if enough of today’s scientists remember the importance of the view from outside.
In the light of the other issues I’ve tried to discuss over the years in this blog, that view has another dimension, and it’s a considerably harsher one. Among the outsiders whose opinion of contemporary science matters most are some that haven’t been born yet: our descendants, who will inhabit a world shaped by science and the technologies that have resulted from scientific research. It’s still popular to insist that their world will be a Star Trek fantasy of limitlessness splashed across the galaxy, but I think most people are starting to realize just how unlikely that future actually is.
Instead, the most likely futures for our descendants are those in which the burdens left behind by today’s science and technology are much more significant than the benefits.  Those most likely futures will be battered by unstable climate and rising oceans due to anthropogenic climate change, stripped of most of the world's topsoil, natural resources, and ecosystems, strewn with the radioactive and chemical trash that our era produced in such abundance and couldn’t be bothered to store safely—and most of today’s advanced technologies will have long since rusted into uselessness, because the cheap abundant energy and other nonrenewable resources that were needed to keep them running all got used up in our time.
People living in such a future aren’t likely to remember that a modest number of scientists signed petitions and wrote position papers protesting some of these things. They’re even less likely to recall the utopian daydreams of perpetual progress and limitless abundance that encouraged so many other people in the scientific community to tell themselves that these things didn’t really matter—and if by chance they do remember those daydreams, their reaction to them won’t be pretty. That science today, like every other human institution in every age, combines high ideals and petty motives in the usual proportions will not matter to them in the least.
Unless something changes sharply very soon, their view from outside may well see modern science—all of it, from the first gray dawn of the scientific revolution straight through to the flamelit midnight when the last laboratory was sacked and burned by a furious mob—as a wicked dabbling in accursed powers that eventually brought down just retribution upon a corrupt and arrogant age. So long as the proponents and propagandists of science ignore the view from outside, and blind themselves to the ways that their own defense of science is feeding the forces that are rising against it, the bleak conclusion of the Clark Ashton Smith story cited at the beginning of this post may yet turn out to be far more prophetic than the comfortable fantasies of perpetual scientific advancement cherished by so many people today.
********
On a less bleak but not wholly unrelated subject, I’m pleased to announce that my forthcoming book After Progress is rolling off the printing press as I write this. There were a few production delays, and so it’ll be next month before orders from the publisher start being shipped; the upside to this is that the book can still be purchased for 20% off the cover price. I’m pretty sure that this book will offend people straight across the spectrum of acceptable opinion in today’s industrial society, so get your copy now, pop some popcorn, and get ready to enjoy the show.