AODA Blog

The Hate that Dare Not Speak its Name

Wed, 2017-01-18 15:12
As the United States stumbles toward the last act of its electoral process two days from now, and the new administration prepares to take over the reins of power from its feckless predecessor, the obligatory caterwauling of the losing side has taken on an unfamiliar shrillness. Granted, the behavior of both sides in the last few decades of American elections can be neatly summed up in the words “sore loser”; the Republicans in 1992 and 2008 behaved not one whit better than the Democrats in 1980 and 2000.  I think it’s fair, though, to say that the current example has plunged well past the low-water mark set by those dismal occasions. The question I’d like to discuss here is why that should be.
I think we can all admit that there are plenty of reasons why Americans might reasonably object to the policies and appointments of the incoming president, but the same thing has been true of every other president we’ve had since George Washington’s day. Equally, both of our major parties have long been enthusiastic practitioners of the fine art of shrieking in horror at the other side’s behavior, while blithely excusing the identical behavior on their side.  Had the election last November gone the other way, for example, we can be quite certain that all the people who are ranting about Donald Trump’s appointment of Goldman Sachs employees to various federal offices would be busy explaining how reasonable it was for Hillary Clinton to do exactly the same thing—as of course she would have.
That said, I don’t think reasonable differences of opinion on the one hand, and the ordinary hypocrisy of partisan politics on the other, explain the extraordinary stridency, the venom, and the hatred being flung at the incoming administration by its enemies. There may be many factors involved, to be sure, but I’d like to suggest that one factor in particular plays a massive role here.
To be precise, I think a lot of what we’re seeing is the product of class bigotry.
Some definitions are probably necessary here. We can define bigotry as the act of believing hateful things about all the members of a given category of people, just because they belong to that category. Thus racial bigots believe hateful things about everyone who belongs to races they don’t like, religious bigots do the same thing to every member of the religions they don’t like, and so on through the dismal chronicle of humanity’s collective nastiness.
Defining social class is a little more difficult to do in the abstract, as different societies draw up and enforce their class barriers in different ways. In the United States, though, the matter is made a good deal easier by the lack of a fully elaborated feudal system in our nation’s past, on the one hand, and on the other, the tolerably precise dependency of how much privilege you have in modern American society on how much money you make. Thus we can describe class bigotry in the United States, without too much inaccuracy, as bigotry directed against people who make either significantly more money than the bigot does, or significantly less. (Of course that’s not all there is to social class, not by a long shot, but for our present purposes, as an ostensive definition, it will do.)
Are the poor bigoted against the well-to-do? You bet. Bigotry directed up the social ladder, though, is far more than matched, in volume and nastiness, by bigotry directed down. It’s a source of repeated amusement to me that rich people in this country so often inveigh against the horrors of class warfare. Class warfare is their bread and butter. The ongoing warfare of the rich against the poor, and of the affluent middle and upper middle classes against the working class, creates and maintains the vast disparities of wealth and privilege in contemporary American society. What upsets the rich and the merely affluent about class warfare, of course, is the thought that they might someday be treated the way they treat everyone else.
Until last year, if you wanted to experience the class bigotry that’s so common among the affluent classes in today’s America, you pretty much had to be a member of those affluent classes, or at least good enough at passing to be present at the social events where their bigotry saw free play. Since Donald Trump broke out of the Republican pack early last year, though, that hindrance has gone by the boards. Those who want to observe American class bigotry at its choicest need only listen to what a great many of the public voices of the well-to-do are saying about the people whose votes and enthusiasm have sent Trump to the White House.
You see, that’s a massive part of the reason a Trump presidency is so unacceptable to so many affluent Americans:  his candidacy, unlike those of all his rivals, was primarily backed by “those people.”
It’s probably necessary to clarify just who “those people” are. During the election, and even more so afterwards, the mainstream media here in the United States have seemingly been unable to utter the words “working class” without sticking the labels “white” in front and “men” behind. The resulting rhetoric seems to be claiming that the relatively small fraction of the American voting public that’s white, male, and working class somehow managed to hand the election to Donald Trump all by themselves, despite the united efforts of everyone else.
Of course that’s not what happened. A huge majority of white working class women also voted for Trump, for example.  So, according to exit polls, did about a third of Hispanic men and about a quarter of Hispanic women; so did varying fractions of other American minority voting blocs, with African-American voters (the least likely to vote for Trump) still putting something like fourteen per cent in his column. Add it all up, and you’ll find that the majority of people who voted for Trump weren’t white working class men at all—and we don’t even need to talk about the huge number of registered voters of all races and genders who usually turn out for Democratic candidates, but stayed home in disgust this year, and thus deprived Clinton of the turnout that could have given her the victory.
Somehow, though, pundits and activists who fly to their keyboards at a moment’s notice to denounce the erasure of women and people of color in any other context are eagerly cooperating in the erasure of women and people of color in this one case. What’s more, that same erasure went on continuously all through the campaign. Those of my readers who followed the media coverage of the race last year will recall confident proclamations that women wouldn’t vote for Trump because his words and actions had given offense to feminists, that Hispanics (or people of color in general) wouldn’t vote for Trump because social-justice activists denounced his attitudes toward illegal immigrants from Mexico as racist, and so on. The media took these proclamations as simple statements of fact—and of course that was one of the reasons media pundits were blindsided by Trump’s victory.
The facts of the matter are that a great many American women don’t happen to agree with feminists, nor do all people of color agree with the social-justice activists who claim to speak in their name. For that matter, may I point out to my fellow inhabitants of Gringostan that the terms “Hispanic” and “Mexican-American” are not synonyms? Americans of Hispanic descent trace their ancestry to many different nations of origin, each of which has its own distinctive culture and history, and they don’t form a single monolithic electoral bloc. (The Cuban-American community in Florida, to cite only one of the more obvious examples, very often votes Republican, and played a significant role in giving that electoral vote-rich state to Trump.)
Behind the media-manufactured facade of white working class men as the cackling villains who gave the country to Donald Trump, in other words, lies a reality far more in keeping with the complexities of American electoral politics: a ramshackle coalition of many different voting blocs and interest groups, each with its own assortment of reasons for voting for a candidate feared and despised by the US political establishment and the mainstream media.  That coalition included a very large majority of the US working class in general, and while white working class voters of both genders were disproportionately more likely to have voted for Trump than their nonwhite equivalents, it wasn’t simply a matter of whiteness, or for that matter maleness.
It was, however, to a very great extent a matter of social class. This isn’t just because so large a fraction of working class voters generally backed Trump; it’s also because Trump saw this from the beginning, and aimed his campaign squarely at the working class vote. His signature red ball cap was part of that—can you imagine Hillary Clinton wearing so proletarian a garment without absurdity?—but, as I pointed out a year ago, so was his deliberate strategy of saying (and tweeting) things that would get the liberal punditocracy to denounce him. The tones of sneering contempt and condescension they directed at him were all too familiar to his working class audiences, who have been treated to the same tones unceasingly by their soi-disant betters for decades now.
Much of the pushback against Trump’s impending presidency, in turn, is heavily larded with that same sneering contempt and condescension—the unending claims, for example, that the only reason people could possibly have chosen to vote for Trump was because they were racist misogynistic morons, and the like. (These days, terms such as “racist” and “misogynistic,” in the mouths of the affluent, are as often as not class-based insults rather than objective descriptions of attitudes.) The question I’d like to raise at this point, though, is why the affluent don’t seem to be able to bring themselves to come right out and denounce Trump as the candidate of the filthy rabble. Why must they borrow the rhetoric of identity politics and twist it (and themselves) into pretzel shapes instead?
There, dear reader, hangs a tale.
In the aftermath of the social convulsions of the 1960s, the wealthy elite occupying the core positions of power in the United States offered a tacit bargain to a variety of movements for social change.  Those individuals and groups who were willing to give up the struggle to change the system, and settled instead for a slightly improved place within it, suddenly started to receive corporate and government funding, and carefully vetted leaders from within the movements in question were brought into elite circles as junior partners. Those individuals and groups who refused these blandishments were marginalized, generally with the help of their more compliant peers.
If you ever wondered, for example, why environmental groups such as the Sierra Club and Friends of the Earth changed so quickly from scruffy fire-breathing activists to slickly groomed and well-funded corporate enablers, well, now you know. Equally, that’s why mainstream feminist organizations by and large stopped worrying about the concerns of the majority of women and fixated instead on “breaking the glass ceiling”—that is to say, giving women who already belong to the privileged classes access to more privilege than they have already. The core demand placed on former radicals who wanted to cash in on the offer, though, was that they drop their demands for economic justice—and American society being what it is, that meant that they had to stop talking about class issues.
The interesting thing is that a good many American radicals were already willing to meet them halfway on that. The New Left of the 1960s, like the old Left of the between-the-wars era, was mostly Marxist in its theoretical underpinnings, and so was hamstrung by the mismatch between Marxist theory and one of the enduring realities of American politics. According to Marxist theory, socialist revolution is led by the radicalized intelligentsia, but it gets the muscle it needs to overthrow the capitalist system from the working classes. This is the rock on which wave after wave of Marxist activism has broken and gone streaming back out to sea, because the American working classes are serenely uninterested in taking up the world-historical role that Marxist theory assigns to them. All they want is plenty of full time jobs at a living wage.  Give them that, and revolutionary activists can bellow themselves hoarse without getting the least flicker of interest out of them.
Every so often, the affluent classes lose track of this, and try to force the working classes to put up with extensive joblessness and low pay, so that affluent Americans can pocket the proceeds. This never ends well.  After an interval, the working classes pick up whatever implement is handy—Andrew Jackson, the Grange, the Populist movement, the New Deal, Donald Trump—and beat the affluent classes about the head and shoulders with it until the latter finally get a clue. This might seem  promising for Marxist revolutionaries, but it isn’t, because the Marxist revolutionaries inevitably rush in saying, in effect, “No, no, you shouldn’t settle for plenty of full time jobs at a living wage, you should die by the tens of thousands in an orgy of revolutionary violence so that we can seize power in your name.” My readers are welcome to imagine the response of the American working class to this sort of rhetoric.
The New Left, like the other American Marxist movements before its time, thus had a bruising face-first collision with cognitive dissonance: its supposedly infallible theory said one thing, but the facts refused to play along and said something very different. For much of the Sixties and Seventies, New Left theoreticians tried to cope with this by coming up with increasingly Byzantine redefinitions of “working class” that excluded the actual working class, so that they could continue to believe in the inevitability and imminence of the proletarian revolution Marx promised them. Around the time that this effort finally petered out into absurdity, it was replaced by the core concept of the identity politics currently central to the American left: the conviction that the only divisions in American society that matter are those that have some basis in biology.
Skin color, gender, ethnicity, sexual orientation, disability—these are the divisions that the American left likes to talk about these days, to the exclusion of all other social divisions, and especially to the exclusion of social class.  Since the left has dominated public discourse in the United States for many decades now, those have become the divisions that the American right talks about, too. (Please note, by the way, the last four words in the paragraph above: “some basis in biology.” I’m not saying that these categories are purely biological in nature; every one of them is defined in practice by a galaxy of cultural constructs and presuppositions, and the link to biology is an ostensive category marker rather than a definition. I insert this caveat because I’ve noticed that a great many people go out of their way to misunderstand the point I’m trying to make here.)
Are the divisions listed above important when it comes to discriminatory treatment in America today? Of course they are—but social class is also important. It’s by way of the erasure of social class as a major factor in American injustice that we wind up in the absurd situation in which a woman of color who makes a quarter million dollars a year plus benefits as a New York stockbroker can claim to be oppressed by a white guy in Indiana who’s working three part time jobs at minimum wage with no benefits in a desperate effort to keep his kids fed, when the political candidates that she supports and the economic policies from which she profits are largely responsible for his plight.
In politics as in physics, every action produces an equal and opposite reaction, and so absurdities of the sort just described have kindled the inevitable blowback. The Alt-Right scene that’s attracted so much belated attention from politicians and pundits over the last year is in large part a straightforward reaction to the identity politics of the left. Without too much inaccuracy, the Alt-Right can be seen as a network of young white men who’ve noticed that every other identity group in the country is being encouraged to band together to further its own interests at their expense, and responded by saying, “Okay, we can play that game too.” So far, you’ve got to admit, they’ve played it with verve.
That said, on the off chance that any devout worshippers of the great god Kek happen to be within earshot, I have a bit of advice that I hope will prove helpful. The next time you want to goad affluent American liberals into an all-out, fist-pounding, saliva-spraying Donald Duck meltdown, you don’t need the Jew-baiting, the misogyny, the racial slurs, and the rest of it.  All you have to do is call them on their class privilege. You’ll want to have the popcorn popped, buttered, and salted first, though, because if my experience is anything to go by, you’ll be enjoying a world-class hissy fit in seconds.
I’d also like to offer the rest of my readers another bit of advice that, again, I hope will prove helpful. As Donald Trump becomes the forty-fifth president of the United States and begins to push the agenda that got him into the White House, it may be useful to have a convenient way to sort through the mix of signals and noise from the opposition. When you hear people raising reasoned objections to Trump’s policies and appointments, odds are that you’re listening to the sort of thoughtful dissent that’s essential to any semblance of democracy, and it may be worth taking seriously. When you hear people criticizing Trump and his appointees for doing the same thing his rivals would have done, or his predecessors did, odds are that you’re getting the normal hypocrisy of partisan politics, and you can roll your eyes and stroll on.
But when you hear people shrieking that Donald Trump is the illegitimate result of a one-night stand between Ming the Merciless and Cruella de Vil, that he cackles in Russian while barbecuing babies on a bonfire, that everyone who voted for him must be a card-carrying Nazi who hates the human race, or whatever other bit of over-the-top hate speech happens to be fashionable among the chattering classes at the moment—why, then, dear reader, you’re hearing a phenomenon as omnipresent and unmentionable in today’s America as sex was in Victorian England. You’re hearing the voice of class bigotry: the hate that dare not speak its name.

The Embarrassments of Chronocentrism

Wed, 2017-01-11 11:09
It’s a curious thing, this attempt of mine to make sense of the future by understanding what’s happened in the past. One of the most curious things about it, at least to me, is the passion with which so many people insist that this isn’t an option at all. In any other context, “Well, what happened the last time someone tried that?” is one of the first and most obviously necessary questions to ask and answer—but heaven help you if you try to raise so straightforward a question about the political, economic, and social phenomena of the present day.
In previous posts here we’ve talked about thoughtstoppers of the “But it’s different this time!” variety, and some of the other means people these days use to protect themselves against the risk of learning anything useful from the hard-earned lessons of the past. This week I want to explore another, subtler method of doing the same thing. As far as I’ve been able to tell, it’s mostly an issue here in the United States, but here it’s played a remarkably pervasive role in convincing people that the only way to open a door marked PULL is to push on it long and hard enough.
It’s going to take a bit of a roundabout journey to make sense of the phenomenon I have in mind, so I’ll have to ask my readers’ forbearance for what will seem at first like several sudden changes of subject.
One of the questions I field tolerably often, when I discuss the societies that will rise after modern industrial civilization finishes its trajectory into history’s compost heap, is whether I think that consciousness evolves. I admit that until fairly recently, I was pretty much at a loss to know how to respond. It rarely took long to find out that the questioner wasn’t thinking about the intriguing theory Julian Jaynes raised in The Origin of Consciousness in the Breakdown of the Bicameral Mind, the Jungian conception Erich Neumann proposed in The Origins and History of Consciousness, or anything of the same kind. Nor, it turned out, was the question usually based on the really rather weird reinterpretations of evolution common in today’s pop-spirituality scene. Rather, it was political.
It took me a certain amount of research, and some puzzled emails to friends more familiar with current left-wing political jargon than I am, to figure out what was behind these questions. Among a good-sized fraction of American leftist circles these days, it turns out it’s become a standard credo that what drives the kind of social changes supported by the left—the abolition of slavery and segregation, the extension of equal (or more than equal) rights to an assortment of disadvantaged groups, and so on—is an ongoing evolution of consciousness, in which people wake up to the fact that things they’ve considered normal and harmless are actually intolerable injustices, and so decide to stop.
Those of my readers who followed the late US presidential election may remember Hillary Clinton’s furious response to a heckler at one of her few speaking gigs:  “We aren’t going back. We’re going forward.” Underlying that outburst is the belief system I’ve just sketched out: the claim that history has a direction, that it moves in a linear fashion from worse to better, and that any given political choice—for example, which of the two most detested people in American public life is going to become the nominal head of a nation in freefall ten days from now—not only can but must be flattened out into a rigidly binary decision between “forward” and “back.”
There’s no shortage of hard questions that could be brought to bear on that way of thinking about history, and we’ll get to a few of them a little later on, but let’s start with the simplest one: does history actually show any such linear movement in terms of social change?
It so happens that I’ve recently finished a round of research bearing on exactly that question, though I wasn’t thinking of politics or the evolution of consciousness when I launched into it. Over the last few years I’ve been working on a sprawling fiction project, a seven-volume epic fantasy titled The Weird of Hali, which takes the horror fantasy of H.P. Lovecraft and stands it on its head, embracing the point of view of the tentacled horrors and multiracial cultists Lovecraft liked to use as images of dread. (The first volume, Innsmouth, is in print in a fine edition and will be out in trade paper this spring, and the second, Kingsport, is available for preorder and will be published later this year.)
One of Lovecraft’s few memorable human characters, the intrepid dream-explorer Randolph Carter, has an important role in the fourth book of my series. According to Lovecraft, Carter was a Boston writer and esthete of the 1920s from a well-to-do family, who had no interest in women but a whole series of intimate (and sometimes live-in) friendships with other men, and decidedly outré tastes in interior decoration—well, I could go on. The short version is that he’s very nearly the perfect archetype of an upper-class gay man of his generation. (Whether Lovecraft intended this is a very interesting question that his biographers don’t really answer.) With an eye toward getting a good working sense of Carter’s background, I talked to a couple of gay friends, who pointed me to some friends of theirs, and that was how I ended up reading George Chauncey’s magisterial Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890-1940.
What Chauncey documents, in great detail and with a wealth of citations from contemporary sources, is that gay men in America had substantially more freedom during the first three decades of the twentieth century than they did for a very long time thereafter. While homosexuality was illegal, the laws against it had more or less the same impact on people’s behavior that the laws against smoking marijuana had in the last few decades of the twentieth century—lots of people did it, that is, and now and then a few of them got busted. Between the beginning of the century and the coming of the Great Depression, in fact, most large American cities had a substantial gay community with its own bars, restaurants, periodicals, entertainment venues, and social events, right out there in public.
Nor did the gay male culture of early twentieth century America conform to current ideas about sexual identity, or the relationship between gay culture and social class, or—well, pretty much anything else, really. A very large number of men who had sex with other men didn’t see that as central to their identity—there were indeed men who embraced what we’d now call a gay identity, but that wasn’t the only game in town by a long shot. What’s more, sex between men was by and large more widely accepted in the working classes than it was further up the social ladder. In turn-of-the-century New York, it was the working class gay men who flaunted the camp mannerisms and the gaudy clothing; upper- and middle-class gay men such as Randolph Carter had to be much more discreet.
So what happened? Did some kind of vast right-wing conspiracy shove the ebullient gay male culture of the early twentieth century into the closet? No, and that’s one of the more elegant ironies of this entire chapter of American cultural history. The crusade against the “lavender menace” (I’m not making that phrase up, by the way) was one of the pet causes of the same Progressive movement responsible for winning women the right to vote and breaking up the fabulously corrupt machine politics of late nineteenth century America. Unpalatable as that fact is in today’s political terms, gay men and lesbians weren’t forced into the closet in the 1930s by the right.  They were driven there by the left.
This is the same Progressive movement, remember, that made Prohibition a central goal of its political agenda, and responded to the total failure of the Prohibition project by refusing to learn the lessons of failure and redirecting its attentions toward banning less popular drugs such as marijuana. That movement was also, by the way, heavily intertwined with what we now call Christian fundamentalism. Some of my readers may have heard of William Jennings Bryan, the supreme orator of the radical left in late nineteenth century America, the man whose “Cross of Gold” speech became the great rallying cry of opposition to the Republican corporate elite in the decades before the First World War.  He was also the prosecuting attorney in the equally famous Scopes Monkey Trial, responsible for pressing charges against a schoolteacher who had dared to affirm in public Darwin’s theory of evolution.
The usual response of people on today’s left to such historical details—well, other than denying or erasing them, which is of course quite common—is to insist that this proves that Bryan et al. were really right-wingers. Not so; again, we’re talking about people who put their political careers on the line to give women the vote and weaken (however temporarily) the grip of corporate money on the US political system. The politics of the Progressive era didn’t assign the same issues to the categories “left” and “right” that today’s politics do, and so all sides in the sprawling political free-for-all of that time embraced some issues that currently belong to the left, others that belong to the right, and still others that have dropped entirely out of the political conversation since then.
I could go on, but let’s veer off in another direction instead. Here’s a question for those of my readers who think they’re well acquainted with American history. The Fifteenth Amendment, which granted the right to vote to all adult men in the United States irrespective of race, was ratified in 1870. Before then, did black men have the right to vote anywhere in the US?
Most people assume as a matter of course that the answer must be no—and they’re wrong. Until the passage of the Fifteenth Amendment, the question of who did and didn’t have voting rights was a matter for each state to decide for itself. Fourteen states either allowed free African-American men to vote in Colonial times or granted them that right when first organized. Later on, ten of them—Delaware in 1792, Kentucky in 1799, Maryland in 1801, New Jersey in 1807, Connecticut in 1814, New York in 1821, Rhode Island in 1822, Tennessee in 1834, North Carolina in 1835, and Pennsylvania in 1838—either denied free black men the vote or raised legal barriers that effectively kept them from voting. Four other states—Massachusetts, Vermont, New Hampshire, and Maine—gave free black men the right to vote in Colonial times and maintained that right until the Fifteenth Amendment made the whole issue moot. Those readers interested in the details can find them in The African American Electorate: A Statistical History by Hanes Walton Jr. et al., which devotes chapter 7 to the subject.
So what happened? Was there a vast right-wing conspiracy to deprive black men of the right to vote? No, and once again we’re deep in irony. The political movements that stripped free American men of African descent of their right to vote were the two great pushes for popular democracy in the early United States, the Democratic-Republican party under Thomas Jefferson and the Democratic party under Andrew Jackson. Read any detailed history of the nineteenth century United States and you’ll learn that before these two movements went to work, each state set a certain minimum level of personal wealth that citizens had to have in order to vote. Both movements forced through reforms in the voting laws, one state at a time, to remove these property requirements and give the right to vote to every adult white man in the state. What you won’t learn, unless you do some serious research, is that in many states these same reforms also stripped adult black men of their right to vote.
Try to explain this to most people on the leftward end of today’s American political spectrum, and you’ll likely end up with a world-class meltdown, because the Jeffersonian Democratic-Republicans and the Jacksonian Democrats, like the Progressive movement, embraced some causes that today’s leftists consider progressive, and others that they consider regressive. The notion that social change is driven by some sort of linear evolution of consciousness, in which people necessarily become “more conscious” (that is to say, conform more closely to the ideology of the contemporary American left) over time, has no room for gay-bashing Progressives and Jacksonian Democrats whose concept of democracy included a strict color bar. The difficulty, of course, is that history is full of Progressives, Jacksonian Democrats, and countless other political movements that can’t be shoehorned into the Procrustean bed of today’s political ideologies.
I could add other examples—how many people remember, for example, that environmental protection was a cause of the far right until the 1960s?—but I think the point has been made. People in the past didn’t divide up the political causes of their time into the same categories left-wing activists like to use today. It’s practically de rigueur for left-wing activists these days to insist that people in the past ought to have seen things in today’s terms rather than the terms of their own time, but that insistence just displays a bad case of chronocentrism.
Chronocentrism? Why, yes.  Most people nowadays are familiar with ethnocentrism, the insistence by members of one ethnic group that the social customs, esthetic notions, moral standards, and so on of that ethnic group are universally applicable, and that anybody who departs from those things is just plain wrong. Chronocentrism is the parallel insistence, on the part of people living in one historical period, that the social customs, esthetic notions, moral standards, and so on of that period are universally applicable, and that people in any other historical period who had different social customs, esthetic notions, moral standards, and so on should have known better.
Chronocentrism is pandemic in our time. Historians have a concept called “Whig history;” it got that moniker from a long line of English historians who belonged to the Whig, i.e., Liberal Party, and who wrote as though all of human history was to be judged according to how well it measured up to the current Liberal Party platform. Such exercises aren’t limited to politics, though; my first exposure to the concept of Whig history came via university courses in the history of science. When I took those courses—this was twenty-five years ago, mind you—historians of science were sharply divided between a majority that judged every scientific activity in every past society on the basis of how well it conformed to our ideas of science, and a minority that tried to point out just how difficult this habit made the already challenging task of understanding the ideas of past thinkers.
To my mind, the minority view in those debates was correct, but at least some of its defenders missed a crucial point. Whig history doesn’t exist to foster understanding of the past.  It exists to justify and support an ideological stance of the present. If the entire history of science is rewritten so that it’s all about how the currently accepted set of scientific theories about the universe rose to their present privileged status, that act of revision makes currently accepted theories look like the inevitable outcome of centuries of progress, rather than jerry-rigged temporary compromises kluged together to cover a mass of recalcitrant data—which, science being what it is, is normally a more accurate description.
In exactly the same sense, the claim that a certain set of social changes in the United States and other industrial countries in recent years result from the “evolution of consciousness,” unfolding on a one-way street from the ignorance of the past to a supposedly enlightened future, doesn’t help make sense of the complicated history of social change. It was never supposed to do that. Rather, it’s an attempt to backstop the legitimacy of a specific set of political agendas here and now by making them look like the inevitable outcome of the march of history. The erasure of the bits of inconvenient history I cited earlier in this essay is part and parcel of that attempt; like all linear schemes of historical change, it falsifies the past and glorifies the future in order to prop up an agenda in the present.
It needs to be remembered in this context that the word “evolution” does not mean “progress.” Evolution is adaptation to changing circumstances, and that’s all it is. When people throw around the phrases “more evolved” and “less evolved,” they’re talking nonsense, or at best engaging in a pseudoscientific way of saying “I like this” and “I don’t like that.” In biology, every organism—you, me, koalas, humpback whales, giant sequoias, pond scum, and all the rest—is equally the product of a few billion years of adaptation to the wildly changing conditions of an unstable planet, with genetic variation shoveling in diversity from one side and natural selection picking and choosing on the other. The habit of using the word “evolution” to mean “progress” is pervasive, and it’s pushed hard by the faith in progress that serves as an ersatz religion in our time, but it’s still wrong.
It’s entirely possible, in fact, to talk about the evolution of political opinion (which is of course what “consciousness” amounts to here) in strictly Darwinian terms. In every society, at every point in its history, groups of people are striving to improve the conditions of their lives by some combination of competition and cooperation with other groups. The causes, issues, and rallying cries that each group uses will vary from time to time as conditions change, and so will the relationships between groups—thus it was to the advantage of affluent liberals of the Progressive era to destroy the thriving gay culture of urban America, just as it was to the advantage of affluent liberals of the late twentieth century to turn around and support the struggle of gay people for civil rights. That’s the way evolution works in the real world, after all.
This sort of thinking doesn’t offer the kind of ideological support that activists of various kinds are used to extracting from various linear schemes of history. On the other hand, that difficulty is more than balanced by a significant benefit, which is that linear predictions inevitably fail, and so by and large do movements based on them. The people who agreed enthusiastically with Hillary Clinton’s insistence that “we aren’t going back, we’re going forward” are still trying to cope with the hard fact that their political agenda will be wandering in the wilderness for at least the next four years. Those who convince themselves that their cause is the wave of the future are constantly being surprised by the embarrassing discovery that waves inevitably break and roll back out to sea. It’s those who remember that history plays no favorites who have a chance of accomplishing their goals.

How Not To Write Like An Archdruid

Wed, 2017-01-04 11:59
Among the occasional amusements I get from writing these weekly essays are earnest comments from people who want to correct my writing style. I field one of them every month or so, and the latest example came in over the electronic transom in response to last week’s post. Like most of its predecessors, it insisted that there’s only one correct way to write for the internet, trotted out a set of canned rules that supposedly encapsulate this one correct way, and assumed as a matter of course that the only reason I didn’t follow those rules is that I’d somehow managed not to hear about them yet.
The latter point is the one I find most amusing, and also most curious. Maybe I’m naive, but it’s always seemed to me that if I ran across someone who was writing in a style I found unusual, the first thing I’d want to do would be to ask the author why he or she had chosen that stylistic option—because, you know, any writer who knows the first thing about his or her craft chooses the style he or she finds appropriate for any given writing project. I field such questions once in a blue moon, and I’m happy to answer them, because I do indeed have reasons for writing these essays in the style I’ve chosen for them. Yet it’s much more common to get the sort of style policing I’ve referenced above—and when that happens, you can bet your bottom dollar that what’s being pushed is the kind of stilted, choppy, dumbed-down journalistic prose that I’ve deliberately chosen not to write.
I’m going to devote a post to all this, partly because I write what I want to write about, the way I want to write about it, for the benefit of those who enjoy reading it, and those who don’t are encouraged to remember that there are thousands of other blogs out there that they’re welcome to read instead. Partly, though, the occasional thudding of what Giordano Bruno called “the battering rams of infants, the catapults of error, the bombards of the inept, and the lightning flashes, thunder, and great tempests of the ignorant”—now there was a man who could write!—raises issues that are central to the occasional series of essays on education I’ve been posting here.
Accepting other people’s advice on writing is a risky business—and yes, that applies to this blog post as well as any other source of such advice. It’s by no means always true that “those who can, do; those who can’t, teach,” but when we’re talking about unsolicited writing advice on the internet, that’s the way to bet.  Thus it’s not enough for some wannabe instructor to tell you “I’ve taught lots of people” (taught them what?) or “I’ve helped lots of people” (to do what?)—the question you need to ask is what the instructor himself or herself has written and where it’s been published.
The second of those matters as much as the first. It so happens, for example, that a great many of the professors who offer writing courses at American universities publish almost exclusively in the sort of little literary quarterlies that have a circulation in three figures and pay contributors in spare copies. (It’s not coincidental that these days, most of the little literary quarterlies in question are published by university English departments.) There’s nothing at all wrong with that, if you dream of writing the sort of stories, essays, and poetry that populate little literary quarterlies.
If you want to write something else, though, it’s worth knowing that these little quarterlies have their own idiosyncratic literary culture. There was a time when the little magazines were one of the standard stepping stones to a successful writing career, but that time went whistling down the wind decades ago. Nowadays, the little magazines have gone one way, the rest of the publishing world has gone another, and many of the habits the little magazines encourage (or even require) in their writers will guarantee prompt and emphatic rejection slips from most other writing venues.
Different kinds of writing, in other words, have their own literary cultures and stylistic customs. In some cases, those can be roughly systematized in the form of rules. That being the case, is there actually some set of rules that are followed by everything good on the internet?
Er, that would be no. I’m by no means a fan of the internet, all things considered—I publish my essays here because most of the older venues I’d prefer no longer exist—but it does have its virtues, and one of them is the remarkable diversity of style to be found there. If you like stilted, choppy, dumbed-down journalistic prose of the sort my commenter wanted to push on me, why, yes, you can find plenty of it online. You can also find lengthy, well-argued essays written in complex and ornate prose, stream-of-consciousness pieces that out-beat the Beat generation, experimental writing of any number of kinds, and more. Sturgeon’s Law (“90% of everything is crap”) applies here as it does to every other human creation, but there are gems to be found online that range across the spectrum of literary forms and styles. No one set of rules applies.
Thus we can dismiss the antics of the style police out of hand. Let’s go deeper, though. If there’s no one set of rules that internet writing ought to follow, are there different rules for each kind of writing? Or are rules themselves the problem? This is where things get interesting.
One of the consistent mental hiccups of American popular culture is the notion that every spectrum consists solely of its two extremes, with no middle ground permitted, and that bit of paralogic gets applied to writing at least as often as to anything else. Thus you have, on the one hand, the claim that the only way to write well is to figure out what the rules are and follow them with maniacal rigidity; on the other, the claim that the only way to write well is to throw all rules into the trash can and let your inner genius, should you happen to have one of those on hand, spew forth the contents of your consciousness all anyhow onto the page. Partisans of those two viewpoints snipe at one another from behind rhetorical sandbags, and neither one of them ever manages more than a partial victory, because neither approach is particularly useful when it comes to the actual practice of writing.
By and large, when people write according to a rigidly applied set of rules—any rigidly applied set of rules—the result is predictable, formulaic, and trite, and therefore boring. By and large, when people write without paying any attention to rules at all, the result is vague, shapeless, and maundering, and therefore boring. Is there a third option? You bet, and it starts by taking the abandoned middle ground: in this case, learning an appropriate set of rules, and using them as a starting point, but departing from them wherever doing so will improve the piece you’re writing.
The set of rules I recommend, by the way, isn’t meant to produce the sort of flat PowerPoint verbiage my commenter insists on. It’s meant to produce good readable English prose, and the source of guidance I recommend to those who are interested in such things is Strunk and White’s deservedly famous The Elements of Style. Those of my readers who haven’t worked with it, who want to improve their writing, and who’ve glanced over what I’ve published and decided that they might be able to learn something useful from me, could do worse than to read it and apply it to their prose.
A note of some importance belongs here, though. There’s a thing called writer’s block, and it happens when you try to edit while you’re writing. I’ve read, though I’ve misplaced the reference, that neurologists have found that the part of the brain that edits and the part of the brain that creates are not only different, they conflict with one another.  If you try to use both of them at once, your brain freezes up in a fairly close neurological equivalent of the Blue Screen of Death, and you stop being able to write at all. That’s writer’s block. To avoid it, NEVER EDIT WHILE YOU’RE WRITING.
I mean that quite literally. Don’t even look at the screen if you can’t resist the temptation to second-guess the writing process. If you have to, turn the screen off, so you can’t even see what you’re writing. Eventually, with practice, you’ll learn to move smoothly back and forth between creative mode and editing mode, but if you don’t have a lot of experience writing, leave that for later. For now, just blurt it all out without a second thought, with all its misspellings and garbled grammar intact.
Then, after at least a few hours—or better yet, after a day or so—go back over the mess, cutting, pasting, adding, and deleting as needed, until you’ve turned it into nice clean text that says what you want it to say. Yes, we used to do that back before computers; the process is called “cut and paste” because it was done back then with a pair of scissors and a pot of paste, the kind with a little spatula mounted on the inside of the lid to help you spread the stuff; you’d cut out the good slices of raw prose and stick them onto a convenient sheet of paper, interspersed with handwritten or freshly typed additions. Then you sat down and typed your clean copy from the pasted-up mess thus produced. Now you know how to do it when the internet finally dries up and blows away. (You’re welcome.)
In the same way, you don’t try to write while looking up rules in Strunk & White. Write your piece, set it aside for a while, and then go over it with your well-worn copy of Strunk & White in hand, noting every place you broke one of the rules of style the book suggests you should follow. The first few times, as a learning exercise, you might consider rewriting the whole thing in accordance with those rules—but only the first few times. After that, make your own judgment call: is this a place where you should follow the rules, or is this a place where they need to be bent, broken, or trampled into the dust? Only you, dear reader-turned-writer, can decide.
A second important note deserves to be inserted at this point, though. The contemporary US public school system can be described without too much inaccuracy as a vast mechanism for convincing children that they can’t write. Rigid rules imposed for the convenience of educators rather than the good of the students, part of the industrial mass-production ethos that pervades public schools in this country, leave a great many graduates so bullied, beaten, and bewildered by bad pedagogy that the thought of writing something for anybody else to read makes them turn gray with fear. It’s almost as bad as the terror of public speaking the public schools also go out of their way to inflict, and it plays a comparable role in crippling people’s capacity to communicate outside their narrow circles of friends.
If you suffer from that sort of educational hangover, dear reader, draw a deep breath and relax. The bad grades and nasty little comments in red ink you got from Mrs. Melba McNitpick, your high school English teacher, are no reflection of your actual capacities as a writer. If you can talk, you can write—it’s the same language, after all. For that matter, even if you can’t talk, you may be able to write—there’s a fair number of people out there who are nonverbal for one reason or another, and can still make a keyboard dance.
The reason I mention this here is that the thought of making an independent judgment about when to follow the rules and when to break them fills a great many survivors of American public schools with dread. In far too many cases, students are either expected to follow the rules with mindless obedience and given bad grades if they fail to do so, or given no rules at all and then expected to conform to unstated expectations they have no way to figure out, and either of these forms of bad pedagogy leaves scars. Again, readers who are in this situation should draw a deep breath and relax; having left Mrs. McNitpick’s class, you’re not subject to her opinions any longer, and should ignore them utterly.
So how do you decide where to follow the rules and where to fold, spindle, and mutilate them? That’s where we walk through the walls and into the fire, because what guides you in your decisions regarding the rules of English prose is the factor of literary taste.
Rules can be taught, but taste can only be learned. Does that sound like a paradox? Au contraire, it simply makes the point that only you can learn, refine, and ripen your literary taste—nobody else can do it for you, or even help you to any significant extent—and your sense of taste is therefore going to be irreducibly personal. When it comes to taste, you aren’t answerable to Mrs. McNitpick, to me, to random prose trolls on the internet, or to anyone else. What’s more, you develop your taste for prose the same way you develop your taste for food: by trying lots of different things, figuring out what you like, and paying close attention to what you like, why you like it, and what differentiates it from the things you don’t like as much.
This is applicable, by the way, to every kind of writing, including those kinds at which the snobs among us turn up their well-sharpened noses. I don’t happen to be a fan of the kind of satirical gay pornography that Chuck Tingle has made famous, for example, but friends of mine who are tell me that in that genre, as in all others, there are books that are well written, books that are tolerable, and books that trip over certain overelongated portions of their anatomy and land face first in—well, let’s not go there, shall we? In the same way, if your idea of a good read is nineteenth-century French comedies of manners, you can find a similar spectrum extending from brilliance to bathos.
Every inveterate reader takes in a certain amount of what I call popcorn reading—the sort of thing that’s read once, mildly enjoyed, and then returned to the library, the paperback exchange, or whatever electronic Elysium e-books enter when you hit the delete button. That’s as inevitable as it is harmless. The texts that matter in developing your personal taste, though, are the ones you read more than once, and especially the ones you read over and over again. As you read these for the third or the thirty-third time, step back now and then from the flow of the story or the development of the argument, and notice how the writer uses language. Learn to notice the really well-turned phrases, the figures of speech that are so apt and unexpected that they seize your attention, the moments of humor, the plays on words, the  passages that match tone and pacing to the subject perfectly.
If you’ve got a particular genre in mind—no, let’s stop for a moment and talk about genre, shall we? Those of my readers who endured a normal public school education here in the US probably don’t know that this is pronounced ZHON-ruh (it’s a French word) and it simply means a category of writing. Satirical gay pornography is a genre. The comedy of manners is a genre. The serious contemporary literary novel is a genre.  So are mysteries, romance, science fiction, fantasy, and the list goes on. There are also nonfiction genres—for example, future-oriented social criticism, the genre in which nine of my books from The Long Descent to Dark Age America have their place. Each genre is an answer to the question, “I just read this and I liked it—where can I find something else more or less like it?”
Every genre has its own habits and taboos, and if you want to write for publication, you need to know what those are. That doesn’t mean you have to follow those habits and taboos with the kind of rigid obedience critiqued above—quite the contrary—but you need to know about them, so that when you break the rules you do it deliberately and skillfully, to get the results you want, rather than clumsily, because you didn’t know any better. It also helps to read the classics of the genre—the books that established those habits and taboos—and then go back and read books in the genre written before the classics, to get a sense of what possibilities got misplaced when the classics established the frame through which all later works in that genre would be read.
If you want to write epic fantasy, for example, don’t you dare stop with Tolkien—it’s because so many people stopped with Tolkien that we’ve got so many dreary rehashes of something that was brilliantly innovative in 1949, complete with carbon-copy Dark Lords cackling in chorus and the inevitable and unendearing quest to do something with the Magic McGuffin that alone can save blah blah blah. Read the stuff that influenced Tolkien—William Morris, E.R. Eddison, the Norse sagas, the Kalevala, Beowulf.  Then read something in the way of heroic epic that he probably didn’t get around to reading—the Ramayana, the Heike Monogatari, the Popol Vuh, or what have you, and think through what those have to say about the broader genre of heroic wonder tale in which epic fantasy has its place.
The point of this, by the way, isn’t to copy any of these things. It’s to develop your own sense of taste so that you can shape your own prose accordingly. Your goal, if you’re at all serious about writing, isn’t to write like Mrs. McNitpick, like your favorite author of satirical gay pornography or nineteenth-century French comedies of manners, or like me, but to write like yourself.
And that, to extend the same point more broadly, is the goal of any education worth the name. The word “education” itself comes from the Latin word educatio, from ex-ducere, “to lead out or bring out;” it’s about leading or bringing out the undeveloped potentials that exist inside the student, not shoving some indigestible bolus of canned information or technique down the student’s throat. In writing as in all other things that can be learned, that process of bringing out those undeveloped potentials requires the support of rules and examples, but those are means to an end, not ends in themselves—and it’s in the space between the rules and their inevitable exceptions, between the extremes of rigid formalism and shapeless vagueness, that the work of creation takes place.
That’s also true of politics, by the way—and the conventional wisdom of our time fills the same role there that the rules for bad internet prose do for writing. Before we can explore that, though, it’s going to be necessary to take on one of the more pervasive bad habits of contemporary thinking about the relationship between the present and the past. We’ll tackle that next week.
********************
In not wholly unrelated news, I’m pleased to announce that Merigan Tales, the anthology of short stories written by Archdruid Report readers set in the world of my novel Star’s Reach, is now in print and available for purchase from Founders House. Those of my readers who enjoyed Star’s Reach and the After Oil anthologies won’t want to miss it.

A Leap in the Dark

Wed, 2016-12-28 17:13
A few days from now, 2016 will have passed into the history books. I know a fair number of people who won’t mourn its departure, but it’s pretty much a given that the New Year celebrations here in the United States, at least, will demonstrate a marked shortage of enthusiasm for the arrival of 2017.
There’s good reason for that, and not just for the bedraggled supporters of Hillary Clinton’s failed and feckless presidential ambitions. None of the pressures that made 2016 a cratered landscape of failed hopes and realized nightmares have gone away. Indeed, many of them are accelerating, as the attempt to maintain a failed model of business as usual in the teeth of political, economic, and environmental realities piles blowback upon blowback onto the loading dock of the new year.
Before we get into that, though, I want to continue the annual Archdruid Report tradition and review the New Year’s predictions that I made at the beginning of 2016. Those of my readers who want to review the original post will find it here. Here’s the gist.
“Thus my core prediction for 2016 is that all the things that got worse in 2015 will keep on getting worse over the year to come. The ongoing depletion of fossil fuels and other nonrenewable resources will keep squeezing the global economy, as the real (i.e., nonfinancial) costs of resource extraction eat up more and more of the world’s total economic output, and this will drive drastic swings in the price of energy and commodities—currently those are still headed down, but they’ll soar again in a few years as demand destruction completes its work. The empty words in Paris a few weeks ago will do nothing to slow the rate at which greenhouse gases are dumped into the atmosphere, raising the economic and human cost of climate-related disasters above 2015’s ghastly totals—and once again, the hard fact that leaving carbon in the ground means giving up the lifestyles that depend on digging it up and burning it is not something that more than a few people will be willing to face.
“Meanwhile, the US economy will continue to sputter and stumble as politicians and financiers try to make up for ongoing declines in real (i.e., nonfinancial) wealth by manufacturing paper wealth at an even more preposterous pace than before, and frantic jerryrigging will keep the stock market from reflecting the actual, increasingly dismal state of the economy.  We’re already in a steep economic downturn, and it’s going to get worse over the year to come, but you won’t find out about that from the mainstream media, which will be full of the usual fact-free cheerleading; you’ll have to watch the rates at which the people you know are being laid off and businesses are shutting their doors instead.” 
It’s almost superfluous to point out that I called it. It’s been noted with much irritation by other bloggers in what’s left of the peak oil blogosphere that it takes no great talent to notice what’s going wrong, and point out that it’s just going to keep on heading the same direction. This I cheerfully admit—but it’s also relevant to note that this method produces accurate predictions. Meanwhile, the world-saving energy breakthroughs, global changes in consciousness, sudden total economic collapses, and other events that get predicted elsewhere year after weary year have been notable by their absence.
I quite understand why it’s still popular to predict these things: after all, they allow people to pretend that they can expect some future other than the one they’re making day after day by their own actions. Nonetheless, the old saying remains true—“if you always do what you’ve always done, you’ll always get what you’ve always gotten”—and I wonder how many of the people who spend each year daydreaming about the energy breakthroughs, changes in consciousness, economic collapses, et al, rather than coming to grips with the rising spiral of crises facing industrial civilization, really want to deal with the future that they’re storing up for themselves by indulging in this habit.
Let’s go on, though.  At the beginning of 2016, I also made four specific predictions, which I admitted at the time were long shots. One of those, specific prediction #3, was that the most likely outcome of the 2016 presidential election would be the inauguration of Donald Trump as President in January 2017. I don’t think I need to say much about that, as it’s already been discussed here at length.  The only thing I’d like to point out here is that much of the Democratic party seems to be fixated on finding someone or something to blame for the debacle, other than the stark incompetence of the Clinton campaign and the failure of Democrats generally to pay attention to anything outside the self-referential echo chambers of affluent liberal opinion. If they keep it up, it’s pretty much a given that Trump will win reelection in 2020.
The other three specific long-shot predictions didn’t pan out, at least not in the way that I anticipated, and it’s only fair—and may be helpful, as we head further into the unknown territory we call 2017—to talk about what didn’t happen, and why.
Specific prediction #1 was that the next tech bust would be under way by the end of 2016.  That’s happening, but not in the way I expected. Back in January I was looking at the maniacally overinflated stock prices of tech companies that have never made a cent in profit and have no meaningful plans to do so, and I expected a repeat of the “tech wreck” of 2000. The difficulty was simply that I didn’t take into account the most important economic shift between 2000 and 2016—the de facto policy of negative interest rates being pursued by the Federal Reserve and certain other central banks.
That policy’s going to get a post of its own one of these days, because it marks the arrival of a basic transformation in economic realities that’s as incomprehensible to neoliberal economists as it will be challenging to most of the rest of us. The point I want to discuss here, though, is a much simpler one. Whenever real interest rates are below zero, those elite borrowers who can get access to money on those terms are being paid to borrow.  Among many other things, this makes it a lot easier to stretch out the downward arc of a failing industry. Cheaper-than-free money is one of the main things that kept the fracking industry from crashing and burning from its own unprofitability once the price of oil plunged in 2014; there’s been a steady string of bankruptcies in the fracking industry and the production of oil from fracked wells has dropped steadily, but it wasn’t the crash many of us expected.
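For those who like to see the arithmetic behind that claim spelled out, here’s a back-of-the-envelope sketch in Python. The loan figure and the rates in it are invented purely for illustration—they aren’t drawn from the Fed’s books or from any actual corporate balance sheet—but they show how a sub-zero real rate turns borrowing into a payment to the borrower.

```python
# Illustrative sketch only: how a negative real interest rate pays the borrower.
# All figures below are hypothetical, not actual central bank or market rates.

def real_repayment(principal, nominal_rate, inflation_rate, years=1):
    """What the borrower repays, measured in today's purchasing power."""
    nominal_owed = principal * (1 + nominal_rate) ** years
    # Deflate by inflation to express the repayment in constant dollars.
    return nominal_owed / (1 + inflation_rate) ** years

# A favored borrower gets money at 0.5% while inflation runs at 2%,
# so the real interest rate is roughly -1.5%.
owed = real_repayment(1_000_000, nominal_rate=0.005, inflation_rate=0.02)
print(f"Real repayment on a $1,000,000 loan: ${owed:,.0f}")
# Prints roughly $985,294 -- the borrower hands back less value than it received,
# which is what makes it cheap to keep an unprofitable industry on life support.
```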
The same thing is happening, in equally slow motion, with the current tech bubble. Real estate prices in San Francisco and other tech hotspots are sliding, overpaid tech employees are being systematically replaced by underpaid foreign workers, the numbers are looking uglier by the week, but the sudden flight of investment money that made the “tech wreck” so colorful sixteen years ago isn’t happening, because tech firms can draw on oceans of relatively cheap funding to turn the sudden popping of the tech bubble into the slow hiss of escaping air. That doesn’t mean that the boom-and-bust cycle has been cancelled—far from it—but it does mean that shoveling bad money after good has just become a lot easier. Exactly how that will impact the economy is a very interesting question that nobody just now knows how to answer.
Let’s move on.  Specific prediction #2 was that the marketing of what would inevitably be called “the PV revolution” would get going in a big way in 2016. Those of my readers who’ve been watching the peak oil scene for more than a few years know that ever since the concept of peak oil clawed its way back out of its long exile in the wilderness of the modern imagination, one energy source after another has been trotted out as the reason du jour why the absurdly extravagant lifestyles of today’s privileged classes can roll unhindered into the future.  I figured, based on the way that people in the mainstream environmentalist movement were closing ranks around renewables, that photovoltaic solar energy would be the next beneficiary of that process, and would take off in a big way as the year proceeded.
That this didn’t happen is not the fault of the solar PV industry or its cheerleaders in the green media. Naomi Oreskes’ strident insistence a while back that raising questions about the economic viability of renewable energy is just another form of climate denialism seems to have become the party line throughout the privileged end of the green left, and the industrialists are following suit. Elon Musk, whose entire industrial empire has been built on lavish federal subsidies, is back at the feed trough again, announcing a grandiose new plan to manufacture photovoltaic roof shingles; he’s far and away the most colorful of the would-be renewable-energy magnates, but others are elbowing their way toward the trough as well, seeking their own share of the spoils.
The difficulty here is twofold. First, the self-referential cluelessness of the Democratic party since the 2008 election has had the inevitable blowback—something like 1000 state and federal elective offices held by Democrats after that election are held by Republicans today—and the GOP’s traditional hostility toward renewable energy has put a lid on the increased subsidies that would have been needed to kick a solar PV feeding frenzy into the same kind of overdrive we’ve already seen with ethanol and wind. Solar photovoltaic power, like ethanol from corn, has a disastrously low energy return on energy invested—as Pedro Prieto and Charles Hall showed in their 2015 study of real-world data from Spain’s solar PV program, the EROEI on large-scale grid photovoltaic power works out in practice to less than 2.5—and so, like nuclear power, it’s only economically viable if it’s propped up by massive and continuing subsidies. Lacking those, the “PV revolution” is dead in the water.
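Since EROEI figures like that one tend to float past as bare numbers, a quick calculation may help show why the difference matters. This is nothing more than arithmetic on the ratio cited above; the 30 and 10 used for comparison are round illustrative values standing in for higher-EROEI sources, not measured data.

```python
# Illustrative arithmetic only: what a given EROEI implies about net energy.
# The 2.5 figure is the one cited above from Prieto and Hall's Spanish data;
# the 30 and 10 are round numbers standing in for higher-EROEI sources.

def net_energy_fraction(eroei):
    """Fraction of gross output left over once the energy cost of production is paid."""
    return 1 - 1 / eroei

for eroei in (30, 10, 2.5):
    print(f"EROEI {eroei:>4}: {net_energy_fraction(eroei):.0%} of gross output is net gain")

# EROEI   30: 97% of gross output is net gain
# EROEI   10: 90% of gross output is net gain
# EROEI  2.5: 60% of gross output is net gain -- a far thinner margin, which is
# why such a source has trouble paying its own way without continuing subsidies.
```

The exact thresholds get argued over endlessly in the net energy literature, but the shape of the problem is visible even in a toy calculation like this one.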
The second point, though, is the more damaging.  The “recovery” after the 2008-2009 real estate crash was little more than an artifact of statistical manipulation, and even negative interest rates haven’t been able to get a heartbeat going in the economy’s prostrate body. As most economic measurements not subject to fiddling by the enthusiastic accountants of the federal government slide steadily downhill, the economic surplus needed to support any kind of renewables buildout at all is rapidly trickling away. Demand destruction is in the driver’s seat, and the one way of decreasing fossil fuel consumption that affluent environmentalists don’t want to talk about—conservation—is the only viable option just now.
Specific prediction #4 was that the Saudi regime in Arabia would collapse by the end of 2016. As I noted at the time, the replacement of the Saudi monarchy with some other form of government is for all practical purposes a done deal. Of the factors I cited then—the impending bankruptcy of a regime that survives only by buying off dissent with oil money, the military quagmires in Yemen, Syria, and Iraq that have the Saudi military and its foreign mercenaries bogged down inextricably, and the rest of it—none have gone away. Nor has the underlying cause, the ongoing depletion of the once-immense oil reserves that have propped up the Saudi state so far.
That said, as I noted back in January, it’s anyone’s guess what cascade of events will send the Saudi royal family fleeing to refuges overseas while mobs rampage through their abandoned palaces in Riyadh, and some combination of mid-level military officers and Muslim clerics piece together a provisional government in their absence. I thought that it was entirely possible that this would happen in 2016, and of course it didn’t. It’s possible at this point that the price of oil could rise fast enough to give the Saudi regime another lease on life, however brief. That said, the winds are changing across the Middle East; the Russian-Iranian alliance is in the ascendant, and the Saudis have very few options left. It will be interesting, in the sense of the apocryphal Chinese curse, to see how long they survive.
So that’s where we stand, as 2016 stumbles down the ramp into time’s slaughterhouse and 2017 prepares to take its place in the ragged pastures of history. What can we expect in the year ahead?
To some extent, I’ve already answered that question—but only to some extent. Most of the factors that drove events in 2016 are still in place, still pressing in the same direction, and “more of the same” is a fair description of the consequences. Day after day, the remaining fossil fuel reserves of a finite planet are being drawn down to maintain the extravagant and unsustainable lifestyles of the industrial world’s more privileged inmates. Those remaining reserves are increasingly dirty, increasingly costly to extract and process, increasingly laden with a witch’s brew of social, economic, and environmental costs that nobody anywhere is willing to make the fossil fuel industry cover, and those costs don’t go away just because they’re being ignored—they pile up in society, the economy, and the biosphere, producing the rising tide of systemic dysfunction that plays so large and unmentioned a role in daily life today.
Thus we can expect still more social turmoil, more economic instability, and more environmental blowback in 2017. The ferocious populist backlash against the economic status quo that stunned the affluent in Britain and America with the Brexit vote and Trump’s presidential victory, respectively, isn’t going away until and unless the valid grievances of the working classes get heard and addressed by political establishments around the industrial world; to judge by examples so far, that’s unlikely to happen any time soon. At the same time, the mismatch between the lifestyles we can afford and the lifestyles that too many of us want to preserve remains immense, and until that changes, the global economy is going to keep on lurching from one crisis to another. Meanwhile the biosphere is responding to the many perturbations imposed on it by human stupidity in the way that systems theory predicts—with ponderous but implacable shifts toward new conditions, many of which don’t augur well for the survival of industrial society.
There are wild cards in the deck, though, and one of them is being played right now over the North Pole. As I write this, air temperatures over the Arctic ice cap are 50°F warmer than usual for this time of year. A destabilized jet stream is sucking masses of warm air north into the Arctic skies, while pushing masses of Arctic air down into the temperate zone. As a result, winter ice formation on the surface of the Arctic Ocean has dropped to levels that were apparently last seen before our species got around to evolving—and a real possibility exists, though it’s by no means a certainty yet, that next summer could see most of the Arctic Ocean free of ice.
Nobody knows what that will do to the global climate. The climatologists who’ve been trying to model the diabolically complex series of cascading feedback loops we call “global climate” have no clue—they have theories and computer models, but so far their ability to predict the rate and consequences of anthropogenic climate change has not exactly been impressive. (For what it’s worth, by the way, most of their computer models have turned out to be far too conservative in their predictions.) Nobody knows yet whether the soaring temperatures over the North Pole this winter are a fluke, a transitory phenomenon driven by the unruly transition between one climate regime and another, or the beginning of a recurring pattern that will restore the north coast of Canada to the conditions it had during the Miocene, when crocodiles sunned themselves on the warm beaches of northern Greenland. We simply don’t know.
In the same way, the populist backlash mentioned above is a wild card whose effects nobody can predict just now. The neoliberal economics that have been welded into place in the industrial world for the last thirty years have failed comprehensively, that’s clear enough.  The abolition of barriers to the flow of goods, capital, and population did not bring the global prosperity that neoliberal economists promised, and now the bill is coming due. The question is what the unraveling of the neoliberal system means for national economies in the years ahead.
There are people—granted, these are mostly neoliberal economists and those who’ve drunk rather too freely of the neoliberal koolaid—who insist that the abandonment of the neoliberal project will inevitably mean economic stagnation and contraction. There are those who insist that the abandonment of the neoliberal project will inevitably mean a return to relative prosperity here in the US, as offshored jobs are forced back stateside by tax policies that penalize imports, and the US balance of trade reverts to something a little closer to parity. The fact of the matter is that nobody knows what the results will be. Here as in Britain, voters faced with a choice between the perpetuation of an intolerable status quo and a leap in the dark chose the latter, and the consequences of that leap can’t be known in advance.
Other examples abound. The US president-elect has claimed repeatedly that the US under his lead will get out of the regime-change business and pursue a less monomaniacally militaristic foreign policy than the one it’s pursued under Bush and Obama, and would have pursued under Clinton. The end of the US neoconservative consensus is a huge change that will send shockwaves through the global political system. Another change, at least as huge, is the rise of Russia as a major player in the Middle East. Another? The remilitarization of Japan and its increasingly forceful pursuit of political and military alliances in East and South Asia. There are others. The familiar order of global politics is changing fast. What will the outcome be? Nobody knows.
As 2017 dawns, in a great many ways, modern industrial civilization has flung itself forward into a darkness where no stars offer guidance and no echoes tell what lies ahead. I suspect that when we look back at the end of this year, the predictable unfolding of ongoing trends will have to be weighed against sudden discontinuities that nobody anywhere saw coming.  We’re not discussing the end of the world, of course; we’re talking events like those that can be found repeated many times in the histories of other failing civilizations.  That said, my guess is that some of those discontinuities are going to be harsh ones.  Those who brace themselves for serious trouble and reduce their vulnerabilities to a brittle and dysfunctional system will be more likely to come through in one piece.
Those who are about to celebrate the end of 2016, in other words, might want to moderate their cheering when it’s over. It’s entirely possible that 2017 will turn out to be rather worse—despite which I hope that the readers of this blog, and the people they care about, will manage to have a happy New Year anyway.

A Season of Consequences

Wed, 2016-12-21 11:33
One of the many advantages of being a Druid is that you get to open your holiday presents four days early. The winter solstice—Alban Arthuan, to use one term for it in the old-fashioned Druid Revival traditions I practice—is one of the four main holy days of the Druid year. Though the actual moment of solstice wobbles across a narrow wedge of the calendar, the celebration traditionally takes place on December 21.  Yes, Druids give each other presents, hang up decorations, and enjoy as sumptuous a meal as resources permit, to celebrate the rekindling of light and hope in the season of darkness.
Come to think of it, I’m far from sure why more people who don’t practice the Christian faith still celebrate Christmas, rather than the solstice. It’s by no means necessary to believe in the Druid gods and goddesses to find the solstice relevant; a simple faith in axial tilt is sufficient reason for the season, after all—and since a good many Christians in America these days are less than happy about what’s been done to their holy day, it seems to me that it would be polite to leave Christmas to them, have our celebrations four days earlier, and cover their shifts at work on December 25th in exchange for their covering ours on the 21st. (Back before my writing career got going, when I worked in nursing homes to pay the bills, my Christian coworkers and I did this as a matter of course; we also swapped shifts around Easter and the spring equinox. Religious pluralism has its benefits.)
Those of my readers who don’t happen to be Druids, but who are tempted by the prospect just sketched out, will want to be aware of a couple of details. For one thing, you won’t catch Druids killing a tree in order to stick it in their living room for a few weeks as a portable ornament stand and fire hazard. Druids think there should be more trees in the world, not fewer! A live tree or, if you must, an artificial one, would be a workable option, but a lot of Druids simply skip the tree altogether and hang ornaments on the mantel, or what have you.
Oh, and most of us don’t do Santa Claus. I’m not sure why Santa Claus is popular among Christians, for that matter, or among anyone else who isn’t a devout believer in the ersatz religion of Consumerism—which admittedly has no shortage of devotees just now. There was a time when Santa hadn’t yet been turned into a poorly paid marketing consultant to the toy industry; go back several centuries, and he was the Christian figure of St. Nicholas; and before then he may have been something considerably stranger. To those who know their way around the traditions of Siberian shamanism, certainly, the conjunction of flying reindeer and an outfit colored like the famous and perilous hallucinogenic mushroom Amanita muscaria is at least suggestive.
Still, whether he takes the form of salesman, saint, or magic mushroom, Druids tend to give the guy in the red outfit a pass. Solstice symbolism varies from one tradition of Druidry to another—like almost everything else among Druids—but in the tradition I practice, each of the Alban Gates (the solstices and equinoxes) has its own sacred animal, and the animal that corresponds to Alban Arthuan is the bear. If by some bizarre concatenation of circumstances Druidry ever became a large enough faith in America to attract the attention of the crazed marketing minions of consumerdom, you’d doubtless see Hallmark solstice cards for sale with sappy looking cartoon bears on them, bear-themed decorations in windows, bear ornaments to hang from the mantel, and the like.
While I could do without the sappy looking cartoons, I definitely see the point of bears as an emblem of the winter solstice, because there’s something about them that too often gets left out of the symbolism of Christmas and the like—though it used to be there, and relatively important, too. Bears are cute, no question; they’re warm and furry and cuddlesome, too; but they’re also, ahem, carnivores, and every so often, when people get sufficiently stupid in the vicinity of bears, the bears kill and eat them.
That is to say, bears remind us that actions have consequences.
I’m old enough that I still remember the days when the folk mythology surrounding Santa Claus had not quite shed the last traces of a similar reminder. According to the accounts of Santa I learned as a child, naughty little children ran a serious risk of waking up Christmas morning to find no presents at all, and a sorry little lump of coal in their stockings in place of the goodies they expected. I don’t recall any of my playmates having that happen to them, and it never happened to me—though I arguably deserved it rather more than once—but every child I knew took it seriously, and tried to moderate their misbehavior at least a little during the period after Thanksgiving. That detail of the legend may still survive here and there, for all I know, but you wouldn’t know it from the way the big guy in red is retailed by the media these days.
For that matter, the version I learned was a pale shadow of a far more unnerving original. In many parts of Europe, when St. Nicholas does the rounds, he’s accompanied by a frightening figure with various names and forms. In parts of Germany, Switzerland, and Austria, it’s Krampus—a hairy devil with goat’s horns and a long lolling tongue, who prances around with a birch switch in his hand and a wicker basket on his back. While the saint hands out presents to good children, Krampus is there for the benefit of the others; small-time junior malefactors can expect a thrashing with the birch switch, while the legend has it that the shrieking, spoiled little horrors at the far end of the naughty-child spectrum get popped into the wicker basket and taken away, and nobody ever hears from them again.
Yes, I know, that sort of thing’s unthinkable in today’s America, and I have no idea whether anyone still takes it with any degree of seriousness over in Europe. Those of my readers who find the entire concept intolerable, though, may want to stop for a moment and think about the context in which that bit of folk tradition emerged. Before fossil fuels gave the world’s industrial nations the temporary spate of abundance that they now enjoy, the coming of winter in the northern temperate zone was a serious matter. The other three seasons had to be full of hard work and careful husbandry, if you were going to have any particular likelihood of seeing spring before you starved or froze to death.
By the time the solstice came around, you had a tolerably good idea just how tight things were going to be by the time spring arrived and the first wild edibles showed up to pad out the larder a bit. The first pale gleam of dawn after the long solstice night was a welcome reminder that spring was indeed on its way, and so you took whatever stored food you could spare, if you could spare any at all, and turned it into a high-calorie, high-nutrient feast, to provide warm memories and a little additional nourishment for the bleak months immediately ahead.
In those days, remember, children who refused to carry their share of the household economy might indeed expect to be taken away and never be heard from again, though the taking away would normally be done by some combination of hunger, cold, and sickness, rather than a horned and hairy devil with a lolling tongue. Of course a great many children died anyway.  A failed harvest, a longer than usual winter, an epidemic, or the ordinary hazards of life in a nonindustrial society quite regularly put a burst of small graves in the nearest churchyard. It was nonetheless true that good children, meaning here those who paid attention, learned fast, worked hard, and did their best to help keep the household running smoothly, really did have a better shot at survival.
One of the most destructive consequences of the age of temporary abundance that fossil fuels gave to the world’s industrial nations, in turn, is the widespread conviction that consequences don’t matter—that it’s unreasonable, even unfair, to expect anyone to have to deal with the blowback from their own choices. That’s a pervasive notion these days, and its effects show up in an astonishing array of contexts throughout contemporary culture, but yes, it’s particularly apparent when it comes to the way children get raised in the United States these days.
The interesting thing here is that the children aren’t necessarily happy about that. If you’ve ever watched a child systematically misbehave in an attempt to get a parent to react, you already know that kids by and large want to know where the limits are. It’s the adults who want to give tests and then demand that nobody be allowed to fail them, who insist that everybody has to get an equal share of the goodies no matter how much or little they’ve done to earn them, and so on through the whole litany of attempts to erase the reality that actions have consequences.
That erasure goes very deep. Have you noticed, for example, that year after year, at least here in the United States, the Halloween monsters on public display get less and less frightening? These days, far more often than not, the ghosts and witches, vampires and Frankenstein’s monsters splashed over Hallmark cards and window displays in the late October monster ghetto have big goofy grins and big soft eyes. The wholesome primal terrors that made each of these things iconic in the first place—the presence of the unquiet dead, the threat of wicked magic, the ghastly vision of walking corpses, whether risen from the grave to drink your blood or reassembled and reanimated by science run amok—are denied to children, and saccharine simulacra are propped up in their places.
Here again, children aren’t necessarily happy about that. The bizarre modern recrudescence of the Victorian notion that children are innocent little angels tells me, if nothing else, that most adults must go very far out of their way to forget their own childhoods. Children aren’t innocent little angels; they’re fierce little animals, which is of course exactly what they should be, and they need roughly the same blend of gentleness and discipline that wolves use on their pups to teach them to moderate their fierceness and live in relative amity with the other members of the pack.  Being fierce, they like to be scared a little from time to time; that’s why they like to tell each other ghost stories, the more ghoulish the better, and why they run with lolling tongues toward anything that promises them a little vicarious blood and gore. The early twentieth century humorist Ogden Nash nailed it when he titled one of his poems “Don’t Cry, Darling, It’s Blood All Right.”
Traditional fairy tales delighted countless generations of children for three good and sufficient reasons. First of all, they’re packed full of wonderful events. Second, they’re positively dripping with gore, which as already noted is an instant attraction to any self-respecting child. Third, they’ve got a moral—which means, again, that they are about consequences. The selfish, cruel, and stupid characters don’t get patted on the head, given the same prize as everyone else, and shielded from the results of their selfishness, cruelty, and stupidity; instead, they get gobbled up by monsters, turned to stone by witches’ curses, or subjected to some other suitably grisly doom. It’s the characters who are honest, brave, and kind who go on to become King or Queen of Everywhere.
Such things are utterly unacceptable, according to the approved child-rearing notions of our day.  Ask why this should be the case and you can count on being told that expecting a child to have to deal with the consequences of its actions decreases its self-esteem. No doubt that’s true, but this is another of those many cases where people in our society manage not to notice that the opposite of one bad thing is usually another bad thing. Is there such a thing as too little self-esteem? Of course—but there is also such a thing as too much self-esteem. In fact, we have a common and convenient English word for somebody who has too much self-esteem. That word is “jerk.”
The cult of self-esteem in contemporary pop psychology has thus produced a bumper crop of jerks in today’s America. I’m thinking here, among many other examples, of the woman who made the news a little while back by strolling right past the boarding desk at an airport, going down the ramp, and taking her seat on the airplane ahead of all the other passengers, just because she felt she was entitled to do so. When the cabin crew asked her to leave and wait her turn like everyone else, she ignored them; security was called, and she ignored them, too. They finally had to drag her down the aisle and up the ramp like a sack of potatoes, and hand her over to the police. I’m pleased to say she’s up on charges now.
That woman had tremendous self-esteem. She esteemed herself so highly that she was convinced that the rules that applied to everyone else surely couldn’t apply to her—and that’s normally the kind of attitude you can count on from someone whose self-esteem has gone up into the toxic-overdose range. Yet the touchstone of excessive self-esteem, the gold standard of jerkdom, is the complete unwillingness to acknowledge the possibility that actions have consequences and you might have to deal with those, whether you want to or not.
That sort of thing is stunningly common in today’s society. It was that kind of overinflated self-esteem that convinced affluent liberals in the United States and Europe that they could spend thirty years backing policies that pandered to their interests while slamming working people face first into the gravel, without ever having to deal with the kind of blowback that arrived so dramatically in the year just past. Now Britain is on its way out of the European Union, Donald Trump is mailing invitations to his inaugural ball, and the blowback’s not finished yet. Try to point this out to the people whose choices made that blowback inevitable, though, and if my experience is anything to go by, you’ll be ignored if you’re not shouted down.
On an even greater scale, of course, there’s the conviction on the part of an astonishing number of people that we can keep on treating this planet as a combination cookie jar to raid and garbage bin to dump wastes in, and never have to deal with the consequences of that appallingly shortsighted set of policies. That’s as true in large swathes of the allegedly green end of things, by the way, as it is among the loudest proponents of smokestacks and strip mines. I’ve long since lost track of the number of people I’ve met who insist loudly on how much they love the Earth and how urgent it is that “we” protect the environment, but who aren’t willing to make a single meaningful change in their own personal consumption of resources and production of pollutants to help that happen.
Consequences don’t go away just because we don’t want to deal with them. That lesson is being taught right now on low-lying seacoasts around the world, where streets that used to be well above the high tide line reliably flood with seawater when a high tide meets an onshore wind; it’s being taught on the ice sheets of Greenland and West Antarctica, which are moving with a decidedly un-glacial rapidity through a trajectory of collapse that hasn’t been seen since the end of the last ice age; it’s being taught in a hundred half-noticed corners of an increasingly dysfunctional global economy, as the externalized costs of technological progress pile up unnoticed and drag economic activity to a halt; and of course it’s being taught, as already noted, in the capitals of the industrial world, where the neoliberal orthodoxy of the last thirty years is reeling under the blows of a furious populist backlash.
It didn’t have to be learned that way. We could have learned it from Krampus or the old Santa Claus, the one who was entirely willing to leave a badly behaved child’s stocking empty on Christmas morning except for that single eloquent lump of coal; we could have learned it from the fairy tales that taught generations of children that consequences matter; we could have learned it from any number of other sources, given a little less single-minded a fixation on maximizing self-esteem right past the red line on the meter—but enough of us didn’t learn it that way, and so here we are.
I’d therefore like to encourage those of my readers who have young children in their lives to consider going out and picking up a good old-fashioned collection of fairy tales, by Charles Perrault or the Brothers Grimm, and use those in place of the latest mass-marketed consequence-free pap when it comes to storytelling time. The children will thank you for it, and so will everyone who has to deal with them in their adult lives. Come to think of it, those of my readers who don’t happen to have young children in their lives might consider doing the same thing for their own benefit, restocking their imaginations with cannibal giants and the other distinctly unmodern conveniences thereof, and benefiting accordingly.
And if, dear reader, you are ever tempted to climb into the lap of the universe and demand that it fork over a long list of goodies, and you glance up expecting to see the jolly and long-suffering face of Santa Claus beaming down at you, don’t be too surprised if you end up staring in horror at the leering yellow eyes and lolling tongue of Krampus instead, as he ponders whether you’ve earned a thrashing with the birch switch or a ride in the wicker basket—or perhaps the great furry face of the Solstice bear, the beast of Alban Arthuan, as she blinks myopically at you for a moment before she either shoves you from her lap with one powerful paw, or tears your arm off and gnaws on it meditatively while you bleed to death on the cold, cold ground.
Because the universe doesn’t care what you think you deserve. It really doesn’t—and, by the way, the willingness of your fellow human beings to take your wants and needs into account will by and large be precisely measured by your willingness to do the same for them.
And on that utterly seasonal note, I wish all my fellow Druids a wonderful solstice; all my Christian friends and readers, a very merry Christmas; and all my readers, whatever their faith or lack thereof, a rekindling of light, hope, and sanity in a dark and troubled time.

Why the Peak Oil Movement Failed

Wed, 2016-12-14 16:12
As I glance back across the trajectory of this blog over the last ten and a half years, one change stands out. When I began blogging in May of 2006, peak oil—the imminent peaking of global production of conventional petroleum, to unpack that gnomic phrase a little—was the central theme of a large, vocal, and tolerably well organized movement. It had its own visible advocacy organizations, it had national and international conferences, it had a small but noticeable presence in the political sphere, and it showed every sign of making its presence felt in the broader conversation of our time.
Today none of that is true. Of the three major peak oil organizations in the US, ASPO-USA—that’s the US branch of the Association for the Study of Peak Oil and Gas, for those who don’t happen to be fluent in acronym—is apparently moribund; Post Carbon Institute, while it still plays a helpful role from time to time as a platform for veteran peak oil researcher Richard Heinberg, has otherwise largely abandoned its former peak oil focus in favor of generic liberal environmentalism; and the US branch of the Transition organization, formerly the Transition Town movement, is spinning its wheels in a rut laid down years back. The conferences ASPO-USA once hosted in Washington DC, with congresscritters in attendance, stopped years ago, and an attempt to host a national conference in southern Pennsylvania fizzled after three years and will apparently not be restarted.
Ten years ago, for that matter, opinion blogs and news aggregators with a peak oil theme were all over the internet. Today that’s no longer the case, either. The fate of the two most influential peak oil sites, The Oil Drum and Energy Bulletin, is indicative. The Oil Drum simply folded, leaving its existing pages up as a legacy of a departed era.  Energy Bulletin, for its part, was taken over by Post Carbon Institute and given a new name and theme as Resilience.org. It then followed PCI in its drift toward the already overcrowded environmental mainstream, replacing the detailed assessment of energy futures that was the staple fare of Energy Bulletin with the sort of uncritical enthusiasm for an assortment of vaguely green causes more typical of the pages of Yes! Magazine.
There are still some peak oil sites soldiering away—notably Peak Oil Barrel, under the direction of former Oil Drum regular Ron Patterson.  There are also a handful of public figures still trying to keep the concept in circulation, with the aforementioned Richard Heinberg arguably first among them. Aside from those few, though, what was once a significant movement is for all practical purposes dead. The question that deserves asking is simple enough: what happened?
One obvious answer is that the peak oil movement was the victim of its own failed predictions. It’s true, to be sure, that failed predictions were a commonplace of the peak oil scene. It wasn’t just the overenthusiastic promoters of alternative energy technologies, who year after year insisted that the next twelve months would see their pet technology leap out of its current obscurity to make petroleum a fading memory; it wasn’t just their exact equivalents, the overenthusiastic promoters of apocalyptic predictions, who year after year insisted that the next twelve months would see the collapse of the global economy, the outbreak of World War III, the imposition of a genocidal police state, or whatever other sudden cataclysm happened to have seized their fancy.
No, the problem with failed predictions ran straight through the movement, even—or especially—in its more serious manifestations. The standard model of the future accepted through most of the peak oil scene started from a set of inescapable facts and an unexamined assumption, and the combination of those things produced consistently false predictions. The inescapable facts were that the Earth is finite, that it contains a finite supply of petroleum, and that various lines of evidence showed conclusively that global production of conventional petroleum was approaching its peak for hard geological reasons, and could no longer keep increasing thereafter.
The unexamined assumption was that geological realities rather than economic forces would govern how fast the remaining reserves of conventional petroleum would be extracted. On that basis, most people in the peak oil movement assumed that as production peaked and began to decline, the price of petroleum would rise rapidly, placing an increasingly obvious burden on the global economy. The optimists in the movement argued that this, in turn, would force nations around the world to recognize what was going on and make the transition to other energy sources, and to the massive conservation programs that would be needed to deal with the gap between the cheap abundant energy that petroleum used to provide and the more expensive and less abundant energy available from other sources. The pessimists, for their part, argued that it was already too late for such a transition, and that industrial civilization would come apart at the seams.
As it turned out, though, the unexamined assumption was wrong. Geological realities imposed, and continue to impose, upper limits on global petroleum production, but economic forces have determined how much less than those upper limits would actually be produced. What happened, as a result, is that when oil prices spiked in 2007 and 2008, and then again in 2014 and 2015, consumers cut back on their use of petroleum products, while producers hurried to bring marginal petroleum sources such as tar sands and oil shales into production to take advantage of the high prices. Both those steps drove prices back down. Low prices, in turn, encouraged consumers to use more petroleum products, and forced producers to shut down marginal sources that couldn’t turn a profit when oil was less than $80 a barrel; both these steps, in turn, sent prices back up.
That doesn’t mean that peak oil has gone away. As oilmen like to say, depletion never sleeps; each time the world passes through the cycle just described, the global economy takes another body blow, and the marginal petroleum sources cost much more to extract and process than the light sweet crude on which the oil industry used to rely. The result, though, is that instead of a sudden upward zoom in prices that couldn’t be ignored, we’ve gotten wild swings in commodity prices, political and social turmoil, and a global economy stuck in creeping dysfunction that stubbornly refuses to behave the way it did when petroleum was still cheap and abundant. The peak oil movement wasn’t prepared for that future.
Granting all this, failed predictions aren’t enough by themselves to stop a movement in its tracks. Here in the United States, especially, we’ve got an astonishing tolerance for predictive idiocy. The economists who insisted that neoliberal policies would surely bring prosperity, for example, haven’t been laughed into obscurity by the mere fact that they were dead wrong; au contraire, they’re still drawing their paychecks and being taken seriously by politicians and the media. The pundits who insisted at the top of their lungs that Britain wouldn’t vote for Brexit and Donald Trump couldn’t possibly win the US presidency are still being taken seriously, too. Nor, to move closer to the activist fringes, has the climate change movement been badly hurt by the embarrassingly linear models of imminent doom it used to deploy with such abandon; the climate change movement is in deep trouble, granted, but its failure has other causes.
It was the indirect impacts of those failed predictions, rather, that helped run the peak oil movement into the ground. The most important of these, to my mind, was the way that those predictions encouraged people in the movement to put their faith in the notion that sometime very soon, governments and businesses would have to take peak oil seriously. That’s what inspired ASPO-USA, for example, to set up a lobbying office in Washington DC with a paid executive director, when the long-term funding for such a project hadn’t yet been secured. On another plane, that’s what undergirded the entire strategy of the Transition Town movement in its original incarnation: get plans drawn up and officially accepted by as many town governments as possible, so that once the arrival of peak oil becomes impossible to ignore, the plan for what to do about it would already be in place.
Of course the difficulty in both cases was that the glorious day of public recognition never arrived. The movement assumed that events would prove its case in the eyes of the general public and the political system alike, and so made no realistic plans about what to do if that didn’t happen. When it didn’t happen, in turn, the movement was left twisting in the wind.
The conviction that politicians, pundits, and the public would be forced by events to acknowledge the truth about peak oil had other consequences that helped hamstring the movement. Outreach to the vast majority that wasn’t yet on board the peak oil bandwagon, for example, got far too little attention or funding. Early on in the movement, several books meant for general audiences—James Howard Kunstler’s The Long Emergency and Richard Heinberg’s The Party’s Over are arguably the best examples—helped lay the foundations for a more effective outreach program, but the organized followup that might have built on those foundations never really happened. Waiting on events took the place of shaping events, and that’s almost always a guarantee of failure.
One particular form of waiting on events that took a particularly steep toll on the movement was its attempts to get funding from wealthy donors. I’ve been told that Post Carbon Institute got itself funded in this way, while as far as I know, ASPO-USA never did. Win or lose, though, begging for scraps at the tables of the rich is a sucker’s game.  In social change as in every other aspect of life, who pays the piper calls the tune, and the rich—who benefit more than anyone else from business as usual—can be counted on to defend their interest by funding only those activities that don’t seriously threaten the continuation of business as usual. Successful movements for social change start by taking effective action with the resources they can muster by themselves, and build their own funding base by attracting people who believe in their mission strongly enough to help pay for it.
There were other reasons why the peak oil movement failed, of course. To its credit, it managed to avoid two of the factors that ran the climate change movement into the ground, as detailed in the essay linked above—it never became a partisan issue, mostly because no political party in the US was willing to touch it with a ten foot pole, and the purity politics that insists that supporters of one cause are only acceptable in its ranks if they also subscribe to a laundry list of other causes never really got a foothold outside of certain limited circles. Piggybacking—the flipside of purity politics, which demands that no movement be allowed to solve one problem without solving every other problem as well—was more of a problem, and so, in a big way, was pandering to the privileged—I long ago lost track of the number of times I heard people in the peak oil scene insist that this or that high-end technology, which was only affordable by the well-to-do, was a meaningful response to the coming of peak oil.
There are doubtless other reasons as well; it’s a feature of all things human that failure is usually overdetermined. At this point, though, I’d like to set that aside for a moment and consider two other points. The first is that the movement didn’t have to fail the way it did. The second is that it could still be revived and gotten back on a more productive track.
To begin with, not everyone in the peak oil scene bought into the unexamined assumption I’ve critiqued above. Well before the movement started running itself into the ground, some of us pointed out that economic factors were going to have a massive impact on the rates of petroleum production and consumption—my first essay on that theme appeared here in April of 2007, and I was far from the first person to notice it. The movement by that time was so invested in its own predictions, with their apparent promise of public recognition and funding, that those concerns didn’t have an impact at the time. Even when the stratospheric oil price spike of 2008 was followed by a bust, though, peak oil organizations by and large don’t seem to have reconsidered their strategies. A mid-course correction at that point, wrenching though it might have been, could have kept the movement alive.
There were also plenty of good examples of effective movements for social change from which useful lessons could have been drawn. One difficulty is that you won’t find such examples in today’s liberal environmental mainstream, which for all practical purposes hasn’t won a battle since Richard Nixon signed the Clean Air Act. The struggle for the right to same-sex marriage, as I’ve noted before, is quite another matter—a grassroots movement that, despite sparse funding and strenuous opposition, played a long game extremely well and achieved its goal. There are other such examples, on both sides of today’s partisan divide, from which useful lessons can be drawn. Pay attention to how movements for change succeed and how they fail, and it’s not hard to figure out how to play the game effectively. That could have been done at any point in the history of the peak oil movement. It could still be done now.
Like same-sex marriage, after all, peak oil isn’t inherently a partisan issue. Like same-sex marriage, it offers plenty of room for compromise and coalition-building. Like same-sex marriage, it’s a single issue, not a fossilized total worldview like those that play so large and dysfunctional a role in today’s political nonconversations. A peak oil movement that placed itself squarely in the abandoned center of contemporary politics, played both sides against each other, and kept its eyes squarely on the prize—educating politicians and the public about the reality of finite fossil fuel reserves, and pushing for projects that will mitigate the cascading environmental and economic impacts of peak oil—could do a great deal to  reshape our collective narrative about energy and, in the process, accomplish quite a bit to make the long road down from peak oil less brutal than it will otherwise be.
I’m sorry to say that the phrase “peak oil,” familiar and convenient as it is, probably has to go.  The failures of the movement that coalesced around that phrase were serious and visible enough that some new moniker will be needed for the time being, to avoid being tarred with a well-used brush. The crucial concept of net energy—the energy a given resource provides once you subtract the energy needed to extract, process, and use it—would have to be central to the first rounds of education and publicity; since it’s precisely equivalent to profit, a concept most people grasp quickly enough, that’s not necessarily a hard thing to accomplish, but it has to be done, because it’s when the concept of net energy is solidly understood that such absurdities as commercial fusion power appear in their true light.
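Since net energy will have to carry so much of that educational weight, it may be worth showing just how directly the profit analogy maps onto it. The sketch below uses entirely made-up numbers for a hypothetical energy source; nothing in it refers to any particular fuel, technology, or study.

```python
# A rough sketch of the net energy concept described above, using invented numbers.
# Net energy is to an energy source what profit is to a business: gross minus costs.

def net_energy(gross_output, extraction_cost, processing_cost, use_cost):
    """Energy actually delivered to society once the energy costs are paid."""
    return gross_output - (extraction_cost + processing_cost + use_cost)

# Hypothetical source yielding 100 units of gross energy:
surplus = net_energy(100, extraction_cost=25, processing_cost=15, use_cost=10)
print(f"Net energy: {surplus} units out of every 100 produced")  # 50

# Just as a firm that spends more than it earns goes broke no matter how large
# its gross revenue, a source whose energy costs approach or exceed its gross
# output can't power a society, however impressive the headline figures look.
```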
It probably has to be said up front that no such project will keep the end of the industrial age from being an ugly mess. That’s already baked into the cake at this point; what were once problems to be solved have become predicaments that we can, at best, only mitigate. Nor could a project of the sort I’ve very roughly sketched out here expect any kind of overnight success. It would have to play a long game in an era when time is running decidedly short. Challenging? You bet—but I think it’s a possibility worth serious consideration.
***********************
In other news, I’m delighted to announce the appearance of two books that will be of interest to readers of this blog. The first is Dmitry Orlov’s latest, Shrinking the Technosphere: Getting a Grip on the Technologies that Limit Our Autonomy, Self-Sufficiency, and Freedom. It’s a trenchant and thoughtful analysis of the gap between the fantasies of human betterment through technological progress and the antihuman mess that’s resulted from the pursuit of those fantasies, and belongs on the same shelf as Theodore Roszak’s Where the Wasteland Ends: Politics and Transcendence in Postindustrial Society and my After Progress: Religion and Reason in the Twilight of the Industrial Age. Copies hot off the press can be ordered from New Society here.
Meanwhile, Space Bats fans will want to know that the anthology of short stories and novellas set in the world of my novel Star’s Reach is now available for preorder from Founders House here. Merigan Tales is a stellar collection, as good as any of the After Oil anthologies, and fans of Star’s Reach won’t want to miss it.

The Fifth Side of the Triangle

Wed, 2016-12-07 11:27
One of the things I’ve had occasion to notice, over the course of the decade or so I’ve put into writing these online essays, is the extent to which repeating patterns in contemporary life go unnoticed by the people who are experiencing them. I’m not talking here about the great cycles of history, which take long enough to roll over that a certain amount of forgetfulness can be expected; the repeating patterns I have in mind come every few years, and yet very few people seem to notice the repetition.
An example that should be familiar to my readers is the way that, until recently, one energy source after another got trotted out on the media and the blogosphere as the excuse du jour for doing nothing about the ongoing depletion of global fossil fuel reserves. When this blog first got under way in 2006, ethanol from corn was the excuse; then it was algal biodiesel; then it was nuclear power from thorium; then it was windfarms and solar PV installations; then it was oil and gas from fracking. In each case, the same rhetorical handwaving about abundance was deployed for the same purpose, the same issues of net energy and concentration were evaded, and the resource in question never managed to live up to the overblown promises made in its name—and yet any attempt to point out the similarities got blank looks and the inevitable refrain, “but this is different.”
The drumbeat of excuses du jour has slackened a bit just now, and that’s also part of a repeating pattern that doesn’t get anything like the scrutiny it deserves. Starting when conventional petroleum production worldwide reached its all-time plateau, in the first years of this century, the price of oil has jolted up and down in a multiyear cycle. The forces driving the cycle are no mystery: high prices encourage producers to bring marginal sources online, but they also decrease demand; the excess inventories of petroleum that result drive down prices; low prices encourage consumers to use more, but they also cause marginal sources to be shut down; the shortfalls of petroleum that result drive prices up, and round and round the mulberry bush we go.
We’re just beginning to come out of the trough following the 2015 price peak, and demand is even lower than it would otherwise be, due to cascading troubles in the global economy. Thus, for the moment, there’s enough petroleum available to supply everyone who can afford to buy it. If the last two cycles are anything to go by, though, oil prices will rise unsteadily from here, reaching a new peak in 2021 or so before slumping down into a new trough. How many people are paying attention to this, and using the current interval of relatively cheap energy to get ready for another period of expensive energy a few years from now? To judge from what I’ve seen, not many.
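For those of my readers who like to see the gears turning, here is a deliberately simplistic sketch of that cycle in Python. Every number in it is invented purely for illustration—it's a toy cobweb model of lagged supply response, not a forecast of oil prices—but it shows how the feedback loop just described can keep prices lurching between peak and trough all by itself.

```python
# A toy cobweb-style model of the price cycle described above: producers bring
# marginal supply online based on LAST period's price, while demand responds to
# the CURRENT price.  All numbers are illustrative assumptions, not data.

def clearing_price(prev_price: float) -> float:
    supply = 50 + 1.0 * prev_price       # lagged supply response to price
    # demand(p) = 200 - 1.0 * p; solve demand(p) = supply for the new price
    return 200 - supply

def run(periods: int = 10, price: float = 100.0) -> None:
    for t in range(periods):
        print(f"period {t}: price {price:.0f}")
        price = clearing_price(price)

if __name__ == "__main__":
    run()

# With these made-up slopes the price swings between a high and a low value
# indefinitely; make demand more price-sensitive and the cycle damps out, make
# the supply response stronger and it spirals out of control.
```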
Just at the moment, though, the example of repetition that comes first to my mind has little to do with energy, except in a metaphorical sense. It’s the way that people committed to a cause—any cause—are so often so flustered when initial successes are followed by something other than repeated triumph forever. Now of course part of the reason that’s on my mind is the contortions still ongoing on the leftward end of the US political landscape, as various people try to understand (or in some cases, do their level best to misunderstand) the implications of last month’s election. Still, that’s not the only reason this particular pattern keeps coming to mind.
I’m also thinking of it as the Eurozone sinks deeper and deeper into political crisis. The project of European unity had its initial successes, and a great many European politicians and pundits seem to have convinced themselves that of course those would be repeated step by step, until a United States of Europe stepped out on the international stage as the world’s next superpower. It’s pretty clear at this point that nothing of the sort is going to happen, because those initial successes were followed by a cascade of missteps and a populist backlash that’s by no means reached its peak yet.
More broadly, the entire project of liberal internationalism that’s guided the affairs of the industrial world since the Berlin Wall came down is in deep trouble. It’s been enormously profitable for the most affluent 20% or so of the industrial world’s population, which is doubtless a core reason why that same 20% insists so strenuously that no other options are possible, but it’s been an ongoing disaster for the other 80% or so, and they are beginning to make their voices heard.
At the heart of the liberal project was the insistence that economics should trump politics—that the free market should determine policy in most matters, leaving governments only an administrative function. Of course that warm and cozy abstraction “the free market” meant in practice the kleptocratic corporate socialism of too-big-to-fail banks and subsidy-guzzling multinationals, which proceeded to pursue their own short-term benefit so recklessly that they’ve driven entire countries into the ground. That’s brought about the inevitable backlash, and the proponents of liberal internationalism are discovering to their bafflement that if enough of the electorate is driven to the wall, the political sphere may just end up holding the Trump card after all.
And of course the same bafflement is on display in the wake of last month’s presidential election, as a great many people who embraced our domestic version of the liberal internationalist idea were left dumbfounded by its defeat at the hands of the electorate—not just by those who voted for Donald Trump, but also by the millions who stayed home and drove Democratic turnout in the 2016 election down to levels disastrously low for Hillary Clinton’s hopes. A great many of the contortions mentioned above have been driven by the conviction on the part of Clinton’s supporters that their candidate’s defeat was caused by a rejection of the ideals of contemporary American liberalism. That some other factor might have been involved is not, at the moment, something many of them are willing to hear.
That’s where the repeating pattern comes in, because movements for social change—whether they come from the grassroots or the summits of power—are subject to certain predictable changes, and if those changes aren’t recognized and countered in advance, they lead to the kind of results I’ve just been discussing. There are several ways to talk about those changes, but the one I’d like to use here unfolds, in a deliberately quirky way, from the Hegelian philosophy of history.
That probably needs an explanation, and indeed an apology, because Georg Wilhelm Friedrich Hegel has been responsible for more sheer political stupidity than any other thinker of modern times. Across the bloodsoaked mess that was the twentieth century, from revolutionary Marxism in its opening years to Francis Fukuyama’s risible fantasy of the End of History in its closing, where you found Hegelian political philosophy, you could be sure that someone was about to make a mistaken prediction.
It may not be entirely fair to blame Hegel personally for this. His writings and lectures are vast heaps of cloudy abstraction in which his students basically had to chase down inkblot patterns of their own making. Hegel’s great rival Arthur Schopenhauer used to insist that Hegel was a deliberate fraud, stringing together meaningless sequences of words in the hope that his readers would mistake obscurity for profundity, and more than once—especially when slogging through the murky prolixities of Hegel’s The Phenomenology of Spirit—I’ve suspected that the old grouch of Frankfurt was right. Still, we can let that pass, because a busy industry of Hegelian philosophers spent the last century and a half churning out theories of their own based, to one extent or another, on Hegel’s vaporings, and it’s this body of work that most people mean when they talk about Hegelian philosophy.
At the core of most Hegelian philosophies of history is a series of words that used to be famous, and still has a certain cachet in some circles: thesis, antithesis, synthesis. (Hegel himself apparently never used those terms in their later sense, but no matter.) That’s the three-step dance to the music of time that, in the Hegelian imagination, shapes human history. You’ve got one condition of being, or state of human consciousness, or economic system, or political system, or what have you; it infallibly generates its opposite; the two collide, and then there’s a synthesis which resolves the initial contradiction. Then the synthesis becomes a thesis, generates its own antithesis, a new synthesis is born, and so on.
One of the oddities about Hegelian philosophies of history is that, having set up this repeating process, their proponents almost always insist that it’s about to stop forever. In the full development of the Marxist theory of history, for example, the alternation of thesis-antithesis-synthesis starts with the primordial state of primitive communism and then chugs merrily, or rather far from merrily, through a whole series of economic systems, until finally true communism appears—and then that’s it; it’s the synthesis that never becomes a thesis and never conjures up an antithesis. In exactly the same way, Fukuyama’s theory of the end of history argued that all history until 1991 or so was a competition between different systems of political economy, of which liberal democratic capitalism and totalitarian Marxism were the last two contenders; capitalism won, Marxism lost, game over.
Now of course that’s part of the reason that Hegelianism so reliably generates false predictions, because in the real world it’s never game over; there’s always another round to play. There’s another dimension of Hegelian mistakenness, though, because the rhythm of the dialectic implies that the gains of one synthesis are never lost. Each synthesis becomes the basis for the next struggle between thesis and antithesis out of which a new synthesis emerges—and the new synthesis is always supposed to embody the best parts of the old.
This is where we move from orthodox Hegelianism to the quirky alternative I have in mind. It didn’t emerge out of the profound ponderings of serious philosophers of history in some famous European university. It first saw the light in a bowling alley in suburban Los Angeles, and the circumstances of its arrival—which, according to the traditional account, involved the miraculous appearance of a dignified elderly chimpanzee and the theophany of a minor figure from Greek mythology—suggest that prodigious amounts of drugs were probably involved.
Yes, we’re talking about Discordianism.
I’m far from sure how many of my readers are familiar with that phenomenon, which exists somewhere on the ill-defined continuum between deadpan put-on and serious philosophical critique. The short form is that it was cooked up by a couple of young men on the fringes of the California Beat scene right as that was beginning its mutation into the first faint adumbrations of the hippie phenomenon. Its original expression was the Principia Discordia, the scripture (more or less) of a religion (more or less) that worships (more or less) Eris, the Greek goddess of chaos, and its central theme is the absurdity of belief systems that treat orderly schemes cooked up in the human mind as though these exist out there in the bubbling, boiling confusion of actual existence.
That may not seem like fertile ground for a philosophy of history, but the Discordians came up with one anyway, probably in mockery of the ultraserious treatment of Hegelian philosophy that was common just then in the Marxist-existentialist end of the Beat scene. Robert Shea and Robert Anton Wilson proceeded to pick up the Discordian theory of history and weave it into their tremendous satire of American conspiracy culture, the Illuminatus! trilogy. That’s where I encountered it originally in the late 1970s; I laughed, and then paused and ran my fingers through my first and very scruffy adolescent beard, realizing that it actually made more sense than any other theory of history I’d encountered.
Here’s how it works. From the Discordian point of view, Hegel went wrong for two reasons. The first was that he didn’t know about the Law of Fives, the basic Discordian principle that all things come in fives, except when they don’t. Thus he left off the final two steps of the dialectical process: after thesis, antithesis, and synthesis, you get parenthesis, and then paralysis.
The second thing Hegel missed is that the synthesis is never actually perfect.  It never succeeds wholly in resolving the conflict between thesis and antithesis; there are always awkward compromises, difficulties that are papered over, downsides that nobody figures out at the time, and so on. Thus it doesn’t take long for the synthesis to start showing signs of strain, and the inevitable response is to try to patch things up without actually changing anything that matters. The synthesis thus never has time to become a thesis and generate its own antithesis; it is its own antithesis, and ever more elaborate arrangements have to be put to work to keep it going despite its increasingly evident flaws; that’s the stage of parenthesis.
The struggle to maintain these arrangements, in turn, gradually usurps so much effort and attention that the original point of the synthesis is lost, and maintaining the arrangements themselves becomes too burdensome to sustain. That’s when you enter the stage of paralysis, when the whole shebang grinds slowly to a halt and then falls apart. Only after paralysis is total do you get a new thesis, which sweeps away the rubble and kickstarts the whole process into motion again.
There are traditional Discordian titles for these stages. The first, thesis, is the state of Chaos, when a group of human beings look out at the bubbling, boiling confusion of actual existence and decide to impose some kind of order on the mess. The second, antithesis, is the state of Discord, when the struggle to impose that order on the mess in question produces an abundance of equal and opposite reactions. The third, synthesis, is the state of Confusion, in which victory is declared over the chaos of mere existence, even though everything’s still bubbling and boiling merrily away as usual. The fourth, parenthesis, is the state of Consternation,* in which the fact that everything’s still bubbling and boiling merrily away as usual becomes increasingly hard to ignore. The fifth and final, paralysis, is the state of Moral Warptitude—don’t blame me, that’s what the Principia Discordia says—in which everything grinds to a halt and falls to the ground, and everyone stands around in the smoldering wreckage rubbing their eyes and wondering what happened.
*(Yes, I know, Robert Anton Wilson called the last two stages Bureaucracy and Aftermath. He was a heretic. So is every other Discordian, for that matter.)
Let’s apply this to the liberal international order that emerged in the wake of the Soviet Union’s fall, and see how it fits. Thesis, the state of Chaos, was the patchwork of quarrelsome nations into which our species has divided itself, which many people of good will saw as barbarous relics of a violent past that should be restrained by a global economic order. Antithesis, the state of Discord, was the struggle to impose that order by way of trade agreements and the like, in the teeth of often violent resistance—the phrase “WTO Seattle” may come to mind here. Synthesis, the state of Confusion, was the self-satisfied cosmopolitan culture that sprang up among the affluent 20% or so of the industrial world’s population, who became convinced that the temporary ascendancy of policies that favored their interests was not only permanent but self-evidently right and just.
Parenthesis, the state of Consternation, was the decades-long struggle to prop up those policies despite the disastrous economic consequences those policies inflicted on everyone but the affluent. Finally, paralysis, the state of Moral Warptitude, sets in when populist movements, incensed by the unwillingness of the 20% to consider anyone else’s needs but their own, surge into the political sphere and bring the entire project to a halt. It’s worth noting here that the title “moral warptitude” may be bad English, but it’s a good description for the attitude of believers in the synthesis toward the unraveling of their preferred state of affairs. It’s standard, as just noted, for those who benefit from the synthesis to become convinced that it’s not merely advantageous but also morally good, and to see the forces that overthrow it as evil incarnate; this is simply another dimension of their Confusion.
Am I seriously suggesting that the drug-soaked ravings of a bunch of goofy California potheads provide a better guide to history than the serious reflections of Hegelian philosophers? Well, yes, actually, I am. Given the track record of Hegelian thought when it comes to history, a flipped coin is a better guide—use a coin, and you have a 50% better chance of being right. Outside of mainstream macroeconomic theory, it’s hard to think of a branch of modern thought that so consistently turns out false answers once it’s applied to the real world.
No doubt there are more respectable models that also provide a clear grasp of what happens to most movements for social change—the way they lose track of the difference between achieving their goals and pursuing their preferred strategies, and generally end up opting for the latter; the way that their institutional forms become ends in themselves, and gradually absorb the effort and resources that would otherwise have brought about change; the way that they run to extremes, chase off potential and actual supporters, and then busy themselves coming up with increasingly self-referential explanations for the fact that the only tactics they’re willing to consider are those that increase their own marginalization in the wider society, and so on. It’s a familiar litany, and will doubtless become even more familiar in the years ahead.
For what it’s worth, though, it’s not necessary for the two additional steps of the post-Hegelian dialectic, the fourth and fifth sides of his imaginary triangle, to result in the complete collapse of everything that was gained in the first three steps. It’s possible to surf the waves of Consternation and Moral Warptitude—but it’s not easy. Next week, we’ll explore this further, by circling back to the place where this blog began, and having a serious talk about how the peak oil movement failed.
*************In other news, I’m delighted to report that Retrotopia, which originally appeared here as a series of posts, is now in print in book form and available for sale. I’ve revised and somewhat expanded Peter Carr’s journey to the Lakeland Republic, and I hope it meets with the approval of my readers.
Also from Founders House, the first issue of the new science fiction and fantasy quarterly MYTHIC has just been released. Along with plenty of other lively stories, it’s got an essay of mine on the decline and revival of science fiction, and a short story, "The Phantom of the Dust," set in the same fictive universe as my novel The Weird of Hali: Innsmouth, and pitting Owen Merrill and sorceress Jenny Chaudronnier against a sinister mystery from colonial days. Subscriptions and single copies can be ordered here.

The End of the American Century

Wed, 2016-11-30 10:39
I have a bone to pick with the Washington Post. A few days back, as some of my readers may be aware, it published a list of some two hundred blogs that it claimed were circulating Russian propaganda, and I was disappointed to find that The Archdruid Report didn’t make the cut.
Oh, granted, I don’t wait each week for secret orders from Boris Badenov, the mock-iconic Russian spy from the Rocky and Bullwinkle Show of my youth, but that shouldn’t disqualify me.  I’ve seen no evidence that any of the blogs on the list take orders from Moscow, either; certainly the Post offered none worth mentioning. Rather, what seems to have brought down the wrath of “Pravda on the Potomac,” as the Post is unfondly called by many DC locals, is that none of these blogs have been willing to buy into the failed neoconservative consensus that’s guided American foreign policy for the last sixteen years. Of that latter offense, in turn, The Archdruid Report is certainly guilty.
There are at least two significant factors behind the Post’s adoption of the tactics of the late Senator Joe McCarthy, dubious lists and all.  The first is that the failure of Hillary Clinton’s presidential ambitions has thrown into stark relief an existential crisis that has the American news media by the throat. The media sell their services to their sponsors on the assumption that they can then sell products and ideas manufactured by those sponsors to the American people. The Clinton campaign accordingly outspent Trump’s people by a factor of two to one, sinking impressive amounts of the cash she raised from millionaire donors into television advertising and other media buys.
Clinton got the coverage she paid for, too. Nearly every newspaper in the United States endorsed her; pundits from one end of the media to the other solemnly insisted that everyone ought to vote for her; equivocal polls were systematically spun in her favor by a galaxy of talking heads. Pretty much everyone who thought they mattered was on board the bandwagon. The only difficulty, really, was that the people who actually mattered—in particular, voters in half a dozen crucial swing states—responded to all this by telling their soi-disant betters, “Thanks, but one turkey this November is enough.”
It turned out that Clinton was playing by a rulebook that was long past its sell-by date, while Trump had gauged the shift in popular opinion and directed his resources accordingly. While she sank her money into television ads on prime time, he concentrated on social media and barnstorming speaking tours through regions that rarely see a presidential candidate. He also figured out early on that the mainstream media was a limitless source of free publicity, and the best way to make use of it was to outrage the tender sensibilities of the media itself and get denounced by media talking heads.
That worked because a very large number of people here in the United States no longer trust the news media to tell them anything remotely resembling the truth. That’s why so many of them have turned to blogs for the services that newspapers and broadcast media used to provide: accurate reporting and thoughtful analysis of the events that affect their lives. Nor is this an unreasonable choice. The issue’s not just that the mainstream news media is biased; it’s not just that it never gets around to mentioning many issues that affect people’s lives in today’s America; it’s not even that it only airs a suffocatingly narrow range of viewpoints, running the gamut of opinion from A to A minus—though of course all these are true.  It’s also that so much of it is so smug, so shallow, and so dull.
The predicament the mainstream media now face is as simple as it is inescapable. After taking billions of dollars from their sponsors, they’ve failed to deliver the goods.  Every source of advertising revenue in the United States has got to be looking at the outcome of the election, thinking, “Fat lot of good all those TV buys did her,” and then pondering their own advertising budgets and wondering how much of that money might as well be poured down a rathole.
Presumably the mainstream news media could earn the trust of the public again by breaking out of the echo chamber that defines the narrow range of acceptable opinions about the equally narrow range of issues open to discussion, but this would offend their sponsors. Worse, it would offend the social strata that play so large a role in defining and enforcing that echo chamber; most mainstream news media employees who have a role in deciding what does and does not appear in print or on the air belong to these same social strata, and are thus powerfully influenced by peer pressure. Talking about supposed Russian plots to try to convince people not to get their news from blogs, though it’s unlikely to work, doesn’t risk trouble from either of those sources.
Why, though, blame it on the Russians? That’s where we move from the first to the second of the factors I want to discuss this week.
A bit of history may be useful here. During the 1990s, the attitude of the American political class toward the rest of the world rarely strayed far from the notions expressed by Francis Fukuyama in his famous and fatuous essay proclaiming the end of history.  The fall of the Soviet Union, according to this line of thought, proved that democracy and capitalism were the best political and economic systems humanity would ever come up with, and the rest of the world would therefore inevitably embrace them in due time. All that was left for the United States and its allies to do was to enforce certain standards of global order on the not-yet-democratic and not-yet-capitalist nations of the world, until they grew up and got with the program.
That same decade, though, saw the emergence of the neoconservative movement.  The neoconservatives were as convinced of the impending triumph of capitalism and democracy as their rivals, but they opposed the serene absurdities of Fukuyama’s thesis with a set of more muscular absurdities of their own. Intoxicated with the collapse of the Soviet Union and its allies, they convinced themselves that identical scenes could be enacted in Baghdad, Tehran, Beijing, and the rest of the world, if only the United States would seize the moment and exploit its global dominance.
During Clinton’s presidency, the neoconservatives formed a pressure group on the fringes of official Washington, setting up lobbying groups such as the Project for a New American Century and bombarding the media with position papers.  The presidency of George W. Bush gave them their chance, and they ran with it. Where the first Iraq war ended with Saddam Hussein beaten but still in power—the appropriate response according to the older ideology—the second ended with the US occupying Iraq and a manufactured “democratic” regime installed under its aegis. In the afterglow of victory, neoconservatives talked eagerly about the conquest of Iran and the remaking of the Middle East along the same lines as post-Soviet eastern Europe. Unfortunately for these fond daydreams, what happened instead was a vortex of sectarian warfare and anti-American insurgency.
You might think, dear reader, that the cascading failures of US policy in Iraq might have caused second thoughts in the US political and military elites whose uncritical embrace of neoconservative rhetoric let that happen. You might be forgiven, for that matter, for thinking that the results of US intervention in Afghanistan, where the same assumptions had met with the same disappointment, might have given those second thoughts even more urgency. If so, you’d be quite mistaken. According to the conventional wisdom in today’s America, the only conceivable response to failure is doubling down. 
“If at first you don’t succeed, fail, fail again” thus seems to be the motto of the US political class these days, and rarely has that been so evident as in the conduct of US foreign policy.  The Obama administration embraced the same policies as its feckless predecessor, and the State Department, the CIA, and the Pentagon went their merry way, overthrowing governments right and left, and tossing gasoline onto the flames of ethnic and sectarian strife in various corners of the world, under the serene conviction that the blowback from these actions could never inconvenience the United States.
That would be bad enough. Far worse was the effect of neoconservative policies on certain other nations: Russia, China, and Iran. In the wake of the Soviet Union’s collapse, Russia was a basket case, Iran was a pariah nation isolated from the rest of the world, and China had apparently made its peace with an era of American global dominance, and was concentrating on building up its economy instead of its military. It would have been child’s play for the United States to maintain that state of affairs indefinitely. Russia could have been helped to recover and then integrated economically into Europe; China could have been allowed the same sort of regional primacy the US allows as a matter of course to its former enemies Germany and Japan; and without US intervention in the Middle East to hand it a bumper crop of opening wedges, Iran could have been left to stew in its own juices until it imploded. 
That’s not what happened, though. Instead, two US administrations went out of their way to convince Russia and China they had nothing to gain and everything to lose by accepting their assigned places in a US-centric international order. Russia and China have few interests in common and many reasons for conflict; they’ve spent much of their modern history glaring at each other across a long and contentious mutual border; they had no reason to ally with each other, until the United States gave them one. Nor did either nation have any reason to reach out to the Muslim theocracy in Iran—quite the contrary—until they began looking for additional allies to strengthen their hand against the United States.
One of the basic goals of effective foreign policy is to divide your potential enemies against each other, so that they’re so busy worrying about one another that they don’t have the time or resources to bother you. It’s one thing, though, to violate that rule when the enemies you’re driving together lack the power to threaten your interests, and quite another when the resource base, population, and industrial capacity of the nations you’re driving together exceed your own. The US government’s harebrained pursuit of neoconservative policies has succeeded, against the odds, in creating a sprawling Eurasian alliance with an economic and military potential significantly greater than that of the US.  There have probably been worse foreign policy blunders in the history of the world, but I can’t think of one offhand.
You won’t read about that in the mainstream news media in the United States. At most, you’ll get canned tirades about how Russian president Vladimir Putin is a “brutal tyrant” who is blowing up children in Aleppo or what have you. “Brutal tyrant,” by the way, is a code phrase of the sort you normally get in managed media.  In the US news, it simply means “a head of state who’s insufficiently submissive to the United States.” Putin certainly qualifies as the latter; first in the Caucasus, then in the Ukraine, and now in Syria, he’s deployed military force to advance his country’s interests against those of the United States and its allies. I quite understand that the US political class isn’t pleased by this, but it might be helpful for them to reflect on their own role in making it happen.
The Russian initiative isn’t limited to Syria, though. Those of my readers who only pay attention to US news media probably don’t know yet that Egypt has now joined Russia’s side. Egyptian and Russian troops are carrying out joint military drills, and reports in Middle Eastern news media have it that Egyptian troops will soon join the war in Syria on the side of the Syrian government. If so, that’s a game-changing move, and probably means game over for the murky dealings the United States and its allies have been pursuing in that end of the Middle East.
China and Russia have very different cultural styles when it comes to exerting power. Russian culture celebrates the bold stroke; Chinese culture finds subtle pressure more admirable. Thus the Chinese have been advancing their country’s interests against those of the United States and its allies in a less dramatic but equally effective way. While distracting Washington’s attention with a precisely measured game of “chicken” in the South China Sea, the Chinese have established a line of naval bases along the northern shores of the Indian Ocean from Myanmar to Djibouti, and contracted alliances in East Africa and South Asia. Those of my readers who’ve read Alfred Thayer Mahan and thus know their way around classic maritime strategy will recognize exactly what’s going on here.
Most recently, China has scored two dramatic shifts in the balance of power in the western Pacific. My American readers may have heard of President Rodrigo Duterte of the Philippines; he’s the one who got his fifteen minutes of fame in the mainstream media here when he called Barack Obama a son of a whore. The broader context, of course, got left out. Duterte, like the heads of state of many nominal US allies, resents US interference in his country’s affairs, and at this point he has other options. His outburst was followed in short order by a trip to Beijing, where he and China’s President Xi signed multibillion-dollar aid agreements and talked openly about the end of a US-dominated world order.
A great many Americans seem to think of the Philippines as a forgettable little country off somewhere unimportant in the Third World. That’s a massive if typical misjudgment. It’s a nation of 100 million people on a sprawling archipelago of more than 7,000 islands, commanding the entire southern end of the South China Sea and a vast swath of the western Pacific, including crucial maritime trade routes. As a US ally, it was a core component of the ring of encirclement holding Chinese maritime forces inside the island ring that walls China’s coastal waters from the rest of the Pacific basin. As a Chinese ally, it holds open that southern gate to China’s rapidly expanding navy and air force.
Duterte wasn’t the only Asian head of state to head for Beijing in recent months. Malaysia’s prime minister was there a few weeks later, to sign up for another multibillion-dollar aid package, buy Chinese vessels for the Malaysian navy, and make acid comments about the way that, ahem, former colonial powers keep trying to interfere in Malaysian affairs. Malaysia’s a smaller nation than the Philippines, but even more strategically placed.  Its territory runs alongside the northern shore of the Malacca Strait:  the most important sea lane in the world, the gateway connecting the Indian Ocean with the Pacific, through which much of the world’s seaborne crude oil transport passes.
All these are opening moves. Those who are familiar with the rise and fall of global powers know what the next moves are; those who don’t might want to consider reading my book Decline and Fall, or my novel Twilight’s Last Gleaming, which makes the same points in narrative form. Had Hillary Clinton won this month’s election, we might have moved into the endgame much sooner.  Her enthusiasm for overthrowing governments during her stint as Secretary of State, and her insistence that the US should impose a no-fly zone over Syria in the teeth of Russian fighters and state-of-the-art antiaircraft defenses, suggest that she could have filled the role of my fictional president Jameson Weed, and sent US military forces into a shooting war they were not realistically prepared to win.
We seem to have dodged that bullet. Even so, the United States remains drastically overextended, with military bases in more than a hundred countries around the world and a military budget nearly equal to all other countries’ put together. Meanwhile, back here at home, our country is falling apart. Leave the bicoastal bubble where the political class and their hangers-on spend their time, and the United States resembles nothing so much as the Soviet Union in its last days: a bleak and dilapidated landscape of economic and social dysfunction, where the enforced cheerfulness of the mainstream media contrasts intolerably with the accelerating disintegration visible all around.
That could have been prevented. If the United States had responded to the end of the Cold War by redirecting the so-called “peace dividend” toward the rebuilding of our national infrastructure and our domestic economy, we wouldn’t be facing the hard choices before us right now—and in all probability, by the way, Donald Trump wouldn’t have been elected president. Instead, the US political class let itself be caught up in neoconservative fantasies of global dominion, and threw away that opportunity. The one bright spot in that dismal picture is that we have another chance.
History shows that there are two ways that empires end. Their most common fate involves clinging like grim death to their imperial status until it drags them down. Spain’s great age of overseas empire ended that way, with Spain plunging into a long era of economic disarray and civil war. At least it maintained its national unity; the Ottoman and Austro-Hungarian empires both finished their imperial trajectories by being partitioned, as of course did the Soviet Union. There are worse examples; I’m thinking here of the Assyrian Empire of the ancient Middle East, which ceased to exist completely—its nationhood, ethnicity, and language dissolving into those of its neighbors—once it fell.
Then there’s the other option, the one chosen by the Chinese in the fifteenth century and Great Britain in the twentieth. Both nations had extensive overseas empires, and both walked away from them, carrying out a staged withdrawal from imperial overreach. Both nations not only survived the process but came through with their political and cultural institutions remarkably intact. This latter option, with all its benefits, is still available to the United States.
A staged withdrawal of the sort just described would of course be done step by step, giving our allies ample time to step up to the plate and carry the costs of their own defense. Those regions that have little relevance to US national interests, such as the Indian Ocean basin, would see the first round of withdrawals, while more important regions such as Europe and the northwest Pacific would be later on the list. The withdrawal wouldn’t go all the way back to our borders by any means; a strong presence in the Atlantic and eastern Pacific basins and a pivot to our own “near abroad” would be needed, but those would also be more than adequate to maintain our national security.
Meanwhile, the billions upon billions of dollars a year that would be saved could be put to work rebuilding our national infrastructure and economy, with enough left over for a Marshall Plan for Mexico—the most effective way to reduce illegal immigration to the United States, after all, is to help make sure that citizens of the countries near us have plenty of jobs at good wages where they already live. Finally, since the only glue holding the Russo-Chinese alliance together is their mutual opposition to US hegemony, winding up our term as global policeman will let Russia, China and Iran get back to contending with each other rather than with us.
Such proposals, on the rare occasions they’re made, get shouted down by today’s US political class as “isolationism.” There’s a huge middle ground between isolationism and empire, though, and that middle ground is where most of the world’s nations stand as they face their neighbors. One way or another, the so-called “American century” is ending; it can end the hard way, the way so many other eras of global hegemony have ended—or it can end with the United States recognizing that it’s a nation among nations, not an overlord among vassals, and acting accordingly.
The mainstream news media here in the United States, if they actually provided the public service they claim, might reasonably be expected to discuss the pros and cons of such a proposal, and of the many other options that face this nation at the end of its era of global hegemony. I can’t say I expect that to happen, though. It’s got to be far more comfortable for them to blame the consequences of their own failure on the supposed Boris Badenovs of the blogosphere, and cling to the rags of their fading role as purveyors of a failed conventional wisdom, until the last of their audience wanders away for good.

The Free Trade Fallacy

Wed, 2016-11-23 11:17
As longtime readers of this blog know, it’s not uncommon for the essays I post here to go veering off on an assortment of tangents, and this week’s post is going to be an addition to that already well-stocked list. Late last week, as the aftermath of the recent election was still spewing all over the media, I was mulling over one likely consequence of the way things turned out—the end of at least some of the free trade agreements that have played so large and dubious a role in recent economic history.
One of the major currents underlying 2016’s political turmoil in Europe and the United States, in fact, has been a sharp disagreement about the value of free trade. The political establishment throughout the modern industrial world insists that free trade policies, backed up by an ever-increasing network of trade agreements, are both inevitable and inevitably good. The movements that have risen up against the status quo—the Brexit campaign in Britain, the populist surge that just made Donald Trump the next US president, and an assortment of similar movements elsewhere—reject both these claims, and argue that free trade is an unwise policy that has a cascade of negative consequences.
It’s important to be clear about what’s under discussion here, since conversations about free trade very often get wrapped up in warm but vague generalities about open borders and the like. Under a system of free trade, goods and capital can pass freely across national borders; there are no tariffs to pay, no quotas to satisfy, no capital restrictions to keep money in one country or out of another. The so-called global economy, in which the consumer goods sold in a nation might be manufactured anywhere on the planet, with funds flowing freely to build a factory here and funnel profits back there, depends on free trade, and the promoters of free trade theory like to insist that this is always a good thing: abolishing trade barriers of all kinds, and allowing the free movement of goods and capital across national boundaries, is supposed to create prosperity for everyone.
That’s the theory, at least. In practice?  Well, not so much. It’s not always remembered that there have been two great eras of free trade in modern history—the first from the 1860s to the beginning of the Great Depression, in which the United States never fully participated; the second from the 1980s to the present, with the United States at dead center—and neither one of them has ushered in a world of universal prosperity. Quite the contrary, both of them have yielded identical results: staggering profits for the rich, impoverishment and immiseration for the working classes, and cascading economic crises. The first such era ended in the Great Depression; the second, just at the moment, looks as though it could end the same way.
Economists—more precisely, the minority of economists who compare their theories to the evidence provided by the real world—like to insist that these unwelcome outcomes aren’t the fault of free trade. As I hope to show, they’re quite mistaken. An important factor has been left out of their analysis, and once that factor has been included, it becomes clear that free trade is bad policy that inevitably produces poverty and economic instability, not prosperity.
To see how this works, let’s imagine a continent with many independent nations, all of which trade with one another. Some of the nations are richer than others; some have valuable natural resources, while others don’t; standards of living and prevailing wages differ from country to country. Under normal conditions, trade barriers of various kinds limit the flow of goods and capital from one nation to another.  Each nation adjusts its trade policy to further its own economic interests.  One nation that’s trying to build up a domestic steel industry, say, may use tariffs, quotas, and the like to shelter that industry from foreign competition.  Another nation with an agricultural surplus may find it necessary to lower tariffs on other products to get neighboring countries to buy its grain.
Outside the two eras of free trade mentioned above, this has been the normal state of affairs, and it has had two reliable results. The first is that the movement of goods and capital between the nations tends toward a rough balance, because every nation uses its trade barriers to police hostile trade policy on the part of its neighbors. Imagine, for example, a nation that tries to monopolize steel production by “dumping”—that is, selling steel on the international market at rock-bottom prices to try to force all other nations’ steel mills into bankruptcy. The other nations respond by slapping tariffs, quotas, or outright bans on imported steel from the dumping country, bringing the project to a screeching halt. Thus trade barriers tend to produce a relative equilibrium between national economies.
Notice that this is an equilibrium, not an equality. When trade barriers exist, it’s usual for some nations to be rich and others to be poor, for a galaxy of reasons having nothing to do with international trade. At the same time, the difficulties this imposes on poor nations are balanced by a relative equilibrium, within nations, between wages and prices.
When the movement of goods and capital across national borders is restricted, the prices of consumer products in each nation will be linked via the law of supply and demand to the purchasing power of consumers in that nation, and thus to the wages paid by employers in that nation. Of course the usual cautions apply; wages and prices fluctuate for a galaxy of reasons, many of which have nothing to do with international trade. Even so, since the wages paid out by employers form the principal income stream that allows consumers to buy the employers’ products, and consumers can have recourse to the political sphere if employers’ attempts to drive down wages get out of hand, there’s a significant pressure toward balance.
Given trade barriers, as a result, people who live in countries that pay low wages generally pay low prices for goods and services, while people who live in countries with high wages face correspondingly high prices when they go shopping. The low prices make life considerably easier for working people in poor countries, just as the tendency of wages to match prices makes life easier for working people in rich countries. Does this always work? Of course not—again, wages and prices fluctuate for countless reasons, and national economies are inherently unstable things—but the factors just enumerated push the economy in the direction of a rough balance between the needs and wants of consumers, on the one hand, and their ability to pay, on the other.
Now let’s imagine that all of the nations we’ve imagined are convinced by a gaggle of neoliberal economists to enact a free trade zone, in which there are no barriers at all to the free movement of goods and capital. What happens?
When there are no trade barriers, the nation that can produce a given good or service at the lowest price will end up with the lion’s share of the market for that good or service. Since labor costs make up so large a portion of the cost of producing goods, those nations with low wages will outbid those with high wages, resulting in high unemployment and decreasing wages in the formerly high-wage countries. The result is a race to the bottom in which wages everywhere decline toward those of the worst-paid labor force in the free trade zone.
When this happens in a single country, as already noted, the labor force can often respond to the economic downdraft by turning to the political sphere. In a free trade zone, though, employers faced with a political challenge to falling wages in one country can simply move elsewhere. It’s the mismatch between economic union and political division that makes free trade unbalanced, and leads to problems we’ll discuss shortly.
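To make that mechanism concrete, here is another deliberately crude sketch—three imaginary countries, invented wage figures, and a mechanical rule standing in for the decisions of employers. It's not meant as a serious economic model, just a picture of the downdraft described above.

```python
# A crude sketch of the wage "race to the bottom" under free trade.  Three
# imaginary countries start with different hourly wages; each round, producers
# shift work toward whichever country is cheapest, and wages in the countries
# losing work drift down toward it.  All numbers are invented for illustration.

countries = {"A": 25.0, "B": 12.0, "C": 4.0}   # assumed hourly wages

for round_number in range(1, 7):
    cheapest = min(countries, key=countries.get)
    for name in countries:
        if name != cheapest:
            # Losing work to the cheapest producer pulls local wages toward it.
            countries[name] -= 0.25 * (countries[name] - countries[cheapest])
    snapshot = ", ".join(f"{name}: {wage:5.2f}" for name, wage in countries.items())
    print(f"round {round_number}: {snapshot}")

# Wages in A and B converge toward C's level.  Inside one country, workers could
# push back politically; in a free trade zone, the employers simply move.
```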
Now of course free trade advocates like to insist that jobs lost by wealthier nations to poorer ones will inevitably be replaced by new jobs. History doesn’t support that claim—quite the contrary—and there are good reasons why the jobs that disappear will never be replaced. In a free trade system, it’s more economical for startups in any labor-intensive industry to go straight to one of the countries with low wages; only those industries that are capital-intensive and thus employ comparatively few people have any reason to get under way in the high-wage countries. The computer industry is a classic example—and you’ll notice, I trust, that just as soon as that industry started to become labor-intensive, it moved offshore. Still, there’s another factor at work.
Since wages are a very large fraction of the cost of producing goods, the overall decrease in wages brings about an increase in profits. Thus one result of free trade is a transfer of wealth from the laboring majority, whose income comes from wages, to the affluent minority, whose income comes directly or indirectly from profits. That’s the factor that’s been left out of the picture by the proponents of free trade—its effect on income distribution. Free trade makes the rich richer and the poor poorer, by increasing profits while driving wages down. This no doubt explains why free trade is so popular among the affluent these days, just as it was in the Victorian era. 
There’s a worm in the bud, though, because a skewed income distribution imposes costs of its own, and those costs mount up over time in painfully familiar ways. The difficulty with making the rich richer and the poor poorer, as Henry Ford pointed out a long time ago, is that the wages you pay your employees are also the income stream they use to buy your products. As wages decline, purchasing power declines, and begins to exert downward pressure on returns on investment in every industry that relies on consumer purchases for its income.
Doesn’t the increasing wealth of investors counterbalance the declining wealth of the wage-earning masses? No, because the rich spend a smaller proportion of their incomes on consumer goods than the poor, and divert the rest to investments. Divide a million dollars between a thousand working class families, and the money’s going to be spent to improve the families’ standard of living: better food, a bigger apartment, an extra toy or two around the Christmas tree, and so on. Give the same million to one rich family and it’s a safe bet that much of it’s going to be invested.
This, incidentally, is why the trickle-down economics beloved of Republican politicians of an earlier era simply doesn’t work, and why the Obama administration’s massive handouts of government money to banks in the wake of the 2008-9 financial panic did so little to improve the financial condition of most of the country. When it comes to consumption, the rich simply aren’t as efficient as the poor. If you want to kickstart an economy with consumer expenditures, as a result, you need to make sure that poor and working class people have plenty of money to spend.
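The arithmetic here is simple enough to put on the back of an envelope. The spending shares below are assumptions of mine, chosen only to make the point visible, but the logic holds for any case where the rich spend a smaller fraction of each dollar than the poor do:

```python
# Back-of-the-envelope arithmetic for the point above.  The propensities to
# consume (the share of an extra dollar actually spent on goods and services)
# are illustrative assumptions, not measured figures.

windfall = 1_000_000           # dollars to be handed out
spend_share_working = 0.95     # assumed share spent by working class families
spend_share_rich = 0.30        # assumed share spent by one wealthy family

spent_if_split = windfall * spend_share_working   # 1,000 families x $1,000 each
spent_if_concentrated = windfall * spend_share_rich

print(f"consumer spending if split among 1,000 families: ${spent_if_split:,.0f}")
print(f"consumer spending if given to one rich family:   ${spent_if_concentrated:,.0f}")

# The difference doesn't vanish; it goes into investment vehicles instead, which
# is exactly the imbalance discussed in the next few paragraphs.
```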
There’s a broader principle here as well.  Consumer expenditures and capital for investment are to an economy what sunlight and water are to a plant: you can’t substitute one for the other. You need both. Since free trade policies funnel money away from expenditure toward investment by skewing the income distribution, they cause a shortage of the one and a surplus of the other. As the imbalance builds, it becomes harder for businesses to make a profit because consumers don’t have the cash to buy their products; meanwhile the amount of money available for investment increases steadily. The result is a steady erosion in return on investment, as more and more money chases fewer and fewer worthwhile investment vehicles.
The history of free-trade eras is thus marked by frantic attempts to prop up returns on investment by any means necessary. The offshoring fad that stripped the United States of its manufacturing economy in the 1970s had its exact equivalent in the offshoring of fabric mills from Britain to India in the late Victorian era; in both cases, the move capitalized on remaining disparities in wages and prices between rich and poor areas in a free trade zone. In both cases, offshoring worsened the problem it was meant to fix, by increasing the downward pressure on wages in the richer countries and further decreasing returns on investment across the entire spectrum of consumer industries—then as now, the largest single share of the economy.
A gambit that as far as I know wasn’t tried in the first era of free trade was the attempt to turn capital into ersatz income by convincing consumers to make purchases with borrowed money. That’s been the keystone of economic policy in the United States for most of two decades now.  The housing bubble was only the most exorbitant manifestation of a frantic attempt to get people to spend money they don’t have, and then find some way to pay it all back with interest. It hasn’t worked well, not least because all those interest payments put an additional downward pressure on consumer expenditures.
A variety of other, mostly self-defeating gimmicks have been put in play in both of the modern free trade eras to try to keep consumer expenditures high while wages decline. None of them work, because they don’t address the actual problem—the fact that under free trade, the downward pressure on wages means that consumers can’t afford to spend enough to keep the economy running at a level that will absorb the available investment capital—and so the final solution to the problem of declining returns on investment arrives on schedule: the diversion of capital from productive investment into speculation.
Any of my readers who don’t know how this story ends should get up right now, and go find a copy of John Kenneth Galbraith’s classic The Great Crash 1929. Speculative bubbles, while they last, produce abundant returns; when free trade has driven down wages, forced the consumer economy into stagnation or contraction, and decreased the returns on investment in productive industries to the point of “why bother,” a speculative bubble is very often the only profitable game in town. What’s more, since there are so few investments with decent returns in the late stages of a free trade scheme, there’s a vast amount of money ready to flow into any investment vehicle that can show a decent return, and that’s exactly the environment in which speculative bubbles breed most readily.
So the great free trade era that began tentatively with the repeal of the Corn Laws in 1846, and came into full flower with Gladstone’s abolition of tariffs in 1869, ended in the stock market debacle of 1929 and the Great Depression. The road there was littered with plenty of other crises, too. The economic history of the late nineteenth and early twentieth centuries is a cratered moonscape of speculative busts and stock market crashes, culminating in the Big One in 1929. It resembles, in fact, nothing so much as the economic history of the late twentieth and early twenty-first centuries, which have had their own sequence of busts and crashes: the stock market crash of 1987, the emerging markets crash of 1994, the tech-stock debacle of 2000, the housing bust of 2008, and the beat goes on.
Thus free trade causes the impoverishment and immiseration of the labor force, and a cascading series of economic busts driven by the mismatch between insufficient consumption and excess investment. Those problems aren’t accidental—they’re hardwired into any free trade system—and the only way to stop them in their tracks is to abandon free trade as bad policy, and replace it with sensible trade barriers that ensure that most of the products consumed in each nation are made there.
It’s probably necessary to stop here and point out a couple of things. First of all, the fact that free trade is bad policy doesn’t mean that every kind of trade barrier is good policy.  The habit of insisting that the only possible points along a spectrum are its two ends, common as it is, is an effective way to make really bad decisions; as in most things, there’s a middle ground that yields better results than either of the two extremes. Finding that middle ground isn’t necessarily easy, but the same thing’s true of most economic and political issues.
Second, free trade isn’t the only cause of economic dysfunction, nor is it the only thing that can cause skewed income distribution and the attendant problems that this brings with it. Plenty of factors can cause a national or global economy to run off the rails. What history shows with painful clarity is that free trade inevitably makes this happen. Getting rid of free trade and returning to a normal state of affairs, in which nations provide most of their own needs from within their own borders and trade with other nations to exchange surpluses or get products that aren’t readily available at home, or not available at all, gets rid of one reliable cause of serious economic dysfunction. That’s all, but arguably it’s enough to make a movement away from free trade a good idea.
Finally, the points I’ve just made suggest that there may be unexpected benefits, even today, to a nation that extracts itself from free trade agreements and puts a well-planned set of trade restrictions in place. There are plenty of factors putting downward pressure on prosperity just now, but the reasoning I’ve just sketched out suggests that the destitution and immiseration so common in the world right now may have been made considerably worse than they would otherwise be by the mania for free trade that’s been so pervasive in recent decades. A country that withdraws from free trade agreements and reorients its economy for the production of goods for domestic consumption might thus expect to see some improvement, not only in the prosperity of its working people, but in rates of return on investment.
That’s the theory I propose. Given the stated policies of the incoming US administration, it’s about to be put to the test—and the results should be apparent over the next few years.
****************On a different and less theoretical note, I’m delighted to report that the third issue of Into The Ruins, the quarterly magazine of deindustrial science fiction, is on its way to subscribers and available for sale to everyone else. The Fall 2016 issue includes stories by regular authors and newcomers alike, including a Matthew Griffiths tale set in the universe of my novel Star’s Reach, along with book reviews, essays, and a letter to the editors column that is turning into one of the liveliest forums in print. If you’re not subscribing yet, you’re missing a treat.
On a less cheery note, it’s been a while now since I proposed a contest, asking readers to write stories about futures that went outside the conventional binary of progress or decline. I think it was a worthwhile project, and some of the stories I received in response were absolutely first-rate—but, I’m sorry to say, there weren’t enough of them to make an anthology. I want to thank everyone who wrote a story in response to my challenge, and since a good many of the stories in question deserve publication, I’m forwarding them to Joel Caris, the editor of Into The Ruins, for his consideration.

When The Shouting Stops

Wed, 2016-11-16 13:08
I've been trying for some time now to understand the reaction of Hillary Clinton’s supporters to her defeat in last week’s election. At first, I simply dismissed it as another round of the amateur theatrics both parties indulge in whenever they lose the White House. Back in 2008, as most of my readers will doubtless recall, Barack Obama’s victory was followed by months of shrieking from Republicans, who insisted—just as a good many Democrats are insisting today—that the election of the other guy meant that democracy had failed, the United States and the world were doomed, and the supporters of the losing party would be rounded up and sent to concentration camps any day now.
That sort of histrionic nonsense has been going on for decades. In 2000, Democrats chewed the scenery in the grand style when George W. Bush was elected president. In 1992, it was the GOP’s turn—I still have somewhere a pamphlet that was circulated by Republicans after the election containing helpful phrases in Russian, so that American citizens would have at least a little preparation when Bill Clinton ran the country into the ground and handed the remains over to the Soviet Union. American politics and popular culture being what they are, this kind of collective hissy fit is probably unavoidable.
Fans of irony have much to savor. You’ve got people who were talking eagerly about how to game the electoral college two weeks ago, who now are denouncing the electoral college root and branch; you’ve got people who insisted that Trump, once he lost, should concede and shut up, who are demonstrating a distinct unwillingness to follow their own advice. You’ve got people in the bluest of blue left coast cities marching in protest as though that’s going to change a single blessed thing—as I’ve pointed out in previous posts here, protest marches that aren’t backed up with effective grassroots political organization are simply a somewhat noisy form of aerobic exercise.
Still, there’s more going on here than that. I know some fairly thoughtful people whose reaction to the election’s outcome wasn’t histrionic at all—it consisted of various degrees of shock, disorientation, and fear. They felt, if the ones I read are typical, that the people who voted for Trump were deliberately rejecting and threatening them personally. That’s something we ought to talk about.
To some extent, to be sure, this was a reflection of the political culture of personal demonization I discussed in last week’s post. Many of Clinton’s supporters convinced themselves, with the help of a great deal of propaganda from the Democratic Party and its bedfellows in the mainstream media, that Donald Trump is a monster of depravity thirsting for their destruction, and anyone who supports him must hate everything good. Now they’re cringing before the bogeyman they imagined, certain that it’s going to act out the role they assigned it and gobble them up.
Another factor at work here is the very strong tendency of people on the leftward end of American politics to believe in what I’ve elsewhere called the religion of progress—the faith that history has an inherent tilt toward improvement, and more to the point, toward the particular kinds of improvement they prefer. Hillary Clinton, in an impromptu response to a heckler at one of her campaign appearances, phrased the central tenet of that religion concisely: “We’re not going to go back. We’re going to go forward.” Like Clinton herself, a great many of her followers saw their cause as another step forward in the direction of progress, and to find themselves “going back” is profoundly disorienting—even though those labels “forward” and “back” are entirely arbitrary when they aren’t the most crassly manipulative sort of propaganda.
That said, there’s another factor driving the reaction of Clinton’s supporters, and the best way I can find to approach it is to consider one of the more thoughtful responses from that side of the political landscape, an incisive essay posted to Livejournal last week by someone who goes by the nom de Web “Ferrett Steinmetz.” The essay’s titled The Cold, Cold Math We’ll Need to Survive the Next Twenty Years, and it comes so close to understanding what happened last Tuesday that the remaining gap offers an unsparing glimpse straight to the heart of the failure of the Left to make its case to the rest of the American people.
At the heart of the essay are two indisputable points. The first is that the core constituencies of the Democratic Party are not large enough by themselves to decide who gets to be president. That’s just as true of the Republican party, by the way, and with few exceptions it’s true in every democratic society.  Each party large enough to matter has a set of core constituencies who can be counted on to vote for it under most circumstances, and then has to figure out how to appeal to enough people outside its own base to win elections. That’s something that both parties in the US tend to forget from time to time, and when they do so, they lose.
The second indisputable point is that if Democrats want to win an election in today’s America, they have to find ways to reach out to people who don’t share the values and interests of the Left. It’s the way that Ferrett Steinmetz frames that second point, though, that shows why the Democratic Party failed to accomplish that necessary task this time. “We have to reach out to people who hate us,” Steinmetz says, and admits that he has no idea at all how to do that.
Let’s take those two assertions one at a time. First, do the people who voted for Donald Trump in this election actually hate Ferrett Steinmetz and his readers—or for that matter, women, people of color, sexual minorities, and so on? Second, how can Steinmetz and his readers reach out to these supposedly hateful people and get them to vote for Democratic candidates?
I have no idea whether Ferrett Steinmetz knows anybody who voted for Donald Trump.  I suspect he doesn’t—or at least, given the number of people I’ve heard from who’ve privately admitted that they voted for Trump but would never let their friends know this, I suspect he doesn’t know anyone who he knows voted for Trump. Here I have a certain advantage. Living in a down-at-the-heels mill town in the north central Appalachians, I know quite a few people who supported Trump; I’ve also heard from a very large number of Trump supporters by way of this blog, and through a variety of other sources.
Are there people among the pro-Trump crowd who are in fact racists, sexists, homophobes, and so on? Of course. I know a couple of thoroughly bigoted racists who cast their votes for him, for example, including at least one bona fide member of the Ku Klux Klan. The point I think the Left tends to miss is that not everyone in flyover country is like that. A few years back, in fact, a bunch of Klansmen came to the town where I live to hold a recruitment rally, and the churches in town—white as well as black—held a counter-rally, stood on the other side of the street, and drowned the Klansmen out, singing hymns at the top of their lungs until the guys in the white robes got back in their cars and drove away.  Surprising? Not at all; in a great deal of middle America, that’s par for the course these days.
To understand why a town that ran off the Klan was a forest of Trump signs in the recent election, it’s necessary to get past the stereotypes and ask a simple question: why did people vote for Trump? I don’t claim to have done a scientific survey, but these are the things I heard Trump voters talking about in the months and weeks leading up to the election:
1. The Risk of War. This was the most common point at issue, especially among women—nearly all the women I know who voted for Trump, in fact, cited it as either the decisive reason for their vote or one of the top two or three. They listened to Hillary Clinton talk about imposing a no-fly zone over Syria in the face of a heavily armed and determined Russian military presence, and looked at the reckless enthusiasm for overthrowing governments she’d displayed during her time as Secretary of State. They compared this to Donald Trump’s advocacy of a less confrontational relationship with Russia, and they decided that Trump was less likely to get the United States into a shooting war.
War isn’t an abstraction here in flyover country. Joining the military is very nearly the only option young people here have if they want a decent income, job training, and the prospect of a college education, and so most families have at least one relative or close friend on active duty.  People here respect the military, but the last two decades of wars of choice in the Middle East have done a remarkably good job of curing middle America of any fondness for military adventurism it might have had.  While affluent feminists swooned over the prospect of a woman taking on another traditionally masculine role, and didn’t seem to care in the least that the role in question was “warmonger,” a great many people in flyover country weighed the other issues against the prospect of having a family member come home in a body bag. Since the Clinton campaign did precisely nothing to reassure them on this point, they voted for Trump.
2. The Obamacare Disaster. This was nearly as influential as Clinton’s reckless militarism. Most of the people I know who voted for Trump make too much money to qualify for a significant federal subsidy, and too little to be able to cover the endlessly rising cost of insurance under the absurdly misnamed “Affordable Care Act.” They recalled, rather too clearly for the electoral prospects of the Democrats, how Obama assured them that the price of health insurance would go down, that they would be able to keep their existing plans and doctors, and so on through all the other broken promises that surrounded Obamacare before it took effect.
It was bad enough that so few of those promises were kept. The real deal-breaker, though, was the last round of double- or triple-digit annual increases in premiums announced this November, on top of increases nearly as drastic a year previously. Even among those who could still afford the new premiums, the writing was on the wall: sooner or later, unless something changed, a lot of people were going to have to choose between losing their health care and being driven into destitution—and then there were the pundits who insisted that everything would be fine, if only the penalties for not getting insurance were raised to equal the cost of insurance! Faced with that, it’s not surprising that a great many people went out and voted for the one candidate who said he’d get rid of Obamacare.
3. Bringing Back Jobs. This is the most difficult one for a lot of people on the Left to grasp, but that’s a measure of the gap between the bicoastal enclaves where the Left’s policies are formed and the hard realities of flyover country. Globalization and open borders sound great when you don’t have to grapple with the economic consequences of shipping tens of millions of manufacturing jobs overseas, on the one hand, and federal policies that flood the labor market with illegal immigrants to drive down wages, on the other. Those two policies, backed by both parties and surrounded by a smokescreen of empty rhetoric about new jobs that somehow never managed to show up, brought about the economic collapse of rural and small town America, driving a vast number of Americans into destitution and misery.
Clinton’s campaign did a really inspired job of rehashing every detail of the empty rhetoric just mentioned, and so gave people out here in flyover country no reason to expect anything but more of the same downward pressure on their incomes, their access to jobs, and the survival of their communities. Trump, by contrast, promised to scrap or renegotiate the trade agreements that played so large a role in encouraging offshoring of jobs, and also promised to put an end to the tacit Federal encouragement of mass illegal immigration that’s driven down wages. That was enough to get a good many voters whose economic survival was on the line to cast their votes for Trump.
4. Punishing the Democratic Party. This one is a bit of an outlier, because the people I know who cast votes for Trump for this reason mostly represented a different demographic from the norm out here: young, politically liberal, and incensed by the way that the Democratic National Committee rigged the nomination process to favor Clinton and shut out Bernie Sanders. They believed that if the campaign for the Democratic nomination had been conducted fairly, Sanders would have been the nominee, and they also believed that Sanders would have stomped Trump in the general election.  For what it’s worth, I think they’re right on both counts.
These voters pointed out to me, often with some heat, that the policies Hillary Clinton supported in her time as senator and secretary of state were all but indistinguishable from those of George W. Bush—you know, the policies Democrats denounced so forcefully a little more than eight years ago.  They argued that voting for Clinton in the general election when she’d been rammed down the throats of the Democratic rank and file by the party’s oligarchy would have signaled the final collapse of the party’s progressive wing into irrelevance. They were willing to accept four years of a Republican in the White House to make it brutally clear to the party hierarchy that the shenanigans that handed the nomination to Clinton were more than they were willing to tolerate.
Those were the reasons I heard people mention when they talked in my hearing about why they were voting for Donald Trump. They didn’t talk about the issues that the media considered important—the email server business, the on-again-off-again FBI investigation, and so on. Again, this isn’t a scientific survey, but I found it interesting that not one Trump voter I knew mentioned those.
What’s more, hatred toward women, people of color, sexual minorities, and the like wasn’t among the reasons that people cited for voting for Trump, either. Do a fair number of the people I’m discussing hold attitudes that the Left considers racist, sexist, homophobic, or what have you? No doubt—but the mere fact that such attitudes exist does not prove that those attitudes, rather than the issues just listed, guided their votes.
When I’ve pointed this out to people on the leftward side of the political spectrum, the usual response has been to insist that, well, yes, maybe Trump did address the issues that matter to people in flyover country, but even so, it was utterly wrong of them to vote for a racist, sexist homophobe! We’ll set aside for the moment the question of how far these labels actually apply to Trump, and how much they’re the product of demonizing rhetoric on the part of his political enemies on both sides of the partisan divide. Even accepting the truth of these accusations, what the line of argument just cited claims is that people in the flyover states should have ignored the issues that affect their own lives, and should have voted instead for the issues that liberals think are important.
In some idyllic Utopian world, maybe.  In the real world, that’s not going to happen. People are not going to embrace the current agenda of the American Left if doing so means that they can expect their medical insurance to double in price every couple of years, their wages to continue lurching downward, their communities to sink further in a death spiral of economic collapse, and their kids to come home in body bags from yet another pointless war in the Middle East.
Thus there’s a straightforward answer to both of Ferrett Steinmetz’s baffled questions. Do the people who voted for Trump hate Steinmetz, his readers, or the various groups—women, people of color, sexual minorities—whose concerns are central to the politics of today’s American Left? In many cases, not at all, and in most others, not to any degree that matters politically. They simply don’t care that much about the concerns that the Left considers central—especially when those are weighed against the issues that directly affect their own lives.
As for what Ferrett Steinmetz’s side of the political landscape can offer the people who voted for Trump, that’s at least as simple to answer: listen to those voters, and they’ll tell you. To judge by what I’ve heard them say, they want a less monomaniacally interventionist foreign policy and an end to the endless spiral of wars of choice in the Middle East; they want health insurance that provides reasonable benefits at a price they can afford; they want an end to trade agreements that ship American jobs overseas, and changes to immigration policy that stop the systematic importation of illegal immigrants by big corporate interests to drive down wages and benefits; and they want a means of choosing candidates that actually reflects the will of the people.
The fascinating thing is, of course, that these are things the Democratic Party used to offer. It wasn’t that long ago, in fact, that the Democratic Party made exactly these issues—opposition to reckless military adventurism, government programs that improved the standard of living of working class Americans, and a politics of transparency and integrity—central not only to its platform but to the legislation its congresspeople fought to get passed and its presidents signed into law. Back when that was the case, by the way, the Democratic Party was the majority party in this country, not only in Congress but also in terms of state governorships and legislatures. As the party backed away from offering those things, it lost its majority position. While correlation doesn’t prove causation, I think a definite case can be made here.
More generally, if the Left wants to get the people who voted for Trump to vote for them instead, they’re going to have to address the issues that convinced those voters to cast their ballots the way they did. Oh, and by the way, listening to what the voters in question have to say, rather than loudly insisting that they can only be motivated by hatred, would also help quite a bit. That may be a lot to ask, but once the shouting stops, I hope it’s a possibility.

Reflections on a Democracy in Crisis

Wed, 2016-11-09 16:24
Well, it’s finally over, and I think it’s fair to say I called it. As I predicted back in January of this year, working class Americans—fed up with being treated by the Democratic Party as the one American minority that it’s okay to hate—delivered a stinging rebuke to the politics of business as usual. To the shock and chagrin of the entire US political establishment, and to the tautly focused embarrassment of the pundits, pollsters, and pet intellectuals of the mainstream media, Donald Trump will be the forty-fifth president of the United States of America. 
Like millions of other Americans, I took part in the pleasant civic ritual of the election. My local polling place is in an elementary school on the edge of the poor part of town—the rundown multiracial neighborhood I’ve mentioned here before, where Trump signs blossomed early and often—and I went to vote, as I usually do, in early afternoon, when the lunch rush was over and the torrent of people voting on the way home from work hadn’t yet gotten under way. Thus there was no line; I came in just as two elderly voters on the way out were comparing notes on local restaurants that give discounts to patrons who’ve got the “I Voted” sticker the polls here hand out when you’ve done your civic duty, and left maybe five minutes later as a bottle-blonde housewife was coming in to cast her vote.
Maryland had electronic voting for a while, but did the smart thing and went back to paper ballots this year, so I’m pretty sure my votes got counted the way I cast them. Afterwards I walked home—it was cloudy but warm, as nice a November day as you could ask for—and got back to work on my current writing project. It all made an interesting counterpoint to the nonstop shrieking that’s been emanating for months now from the media and, let’s be fair, from politicians, pundits, and a great many ordinary people all over the world as well.
I don’t see a lot of point just now in talking about what’s going to happen once the dust and the tumult settle, the privileged finish throwing their predictable tantrums, and the Trump administration settles into power in Washington DC.  There will be plenty of time for that later. What I’d like to do here and now is talk about a couple of things that were highlighted by this election, and cast a useful light on the current state of US politics and the challenges that have to be faced as a troubled, beleaguered, and bitterly divided nation staggers on toward its next round of crises.
One of those things showed up with rare clarity in the way that many readers responded to my posts on the election. All along, from my first post on the improbable rise of Donald Trump right up to last week’s pre-election wrapup, I tried to keep the discussion focused on issues: what policies each candidate could be expected to support once the next administration took office.
To my mind, at least, that’s the thing that matters most about an election. Four or eight years from now, after all, the personality of the outgoing president is going to matter less than an average fart in a Category 5 hurricane. The consequences of policy decisions made by the presidency over the next four years, on the other hand, will have implications that extend for years into the future. Should the United States pursue a policy of confrontation with Russia in the Middle East, or should it work out a modus vivendi with the Russians to pursue the common goal of suppressing jihadi terrorism? Should federal policy continue to encourage the offshoring of jobs and the importation of workers to drive down wages, or should it be changed to discourage these things? These are important issues that will affect millions of lives in the United States and elsewhere, and there are other issues of similar importance on which the two candidates had significantly different positions.
Quite a few of the people who responded to those posts, though, displayed no interest in such mundane if important matters. They only wanted to talk about their opinions about the personalities of the candidates: to insist that Clinton was a corrupt stooge, say, or that Trump was a hatemongering fascist. (It says something about American politics these days that rather more often than not, the people who did this were too busy slandering the character of the candidate they hated to say much about the one they planned to vote for.) Outside the relatively sheltered waters of The Archdruid Report, in turn, that tendency went into overdrive; for much of the campaign, the only way you could tell the difference between the newspapers of record and the National Enquirer was by noting which candidates they supported, and allegedly serious websites were by and large even worse.
This wasn’t the fault of the candidates, as it happens. Whatever else might be said for or against Hillary Clinton, she tried to avoid a campaign based on content-free sound bites like the one Barack Obama waged against her so cynically and successfully in 2008; the pages of her campaign website displayed a laundry list of things she said she wanted to do if she won the election. While many voters will have had their disagreements with her proposals, she actually tried to talk about the issues, and that’s refreshingly responsible. Trump, for that matter, devoted speech after speech to a range of highly specific policy proposals.
Yet nearly all the talk about both candidates, in and out of the media, focused not on their policy proposals but on their personalities—or rather on nastily distorted parodies of their personalities that defined them, more or less explicitly, as evil incarnate. The Church of Satan, I’m told, has stated categorically that the Devil was not running in this year’s US presidential election, but you’d have a hard time telling that from the rhetoric on both sides. The media certainly worked overtime to foster the fixation on personalities, but I suspect this is one of those cases where the media was simply reflecting something that was already present in the collective consciousness of our society.
All through the campaign I noticed, rather to my surprise, that it wasn’t just those who have nothing in their heads that a television or a website didn’t put there, who ignored the issues and fixated on personalities. I long ago lost track of the number of usually thoughtful people I know who, over the course of the last year, ended up buying into every negative claim about whichever candidate they hated, without even going through the motions of checking the facts. I also lost track months ago of the number of usually thoughtful people I know whose automatic response to an attempt to talk about the issues at stake in this election was to give me a blank look and go right back to ranting about the evilly evil evilness of whichever candidate they hated.
It seems to me that something has been forgotten here.  We didn’t have an election to choose a plaster saint, a new character on My Little Pony, or Miss (or Mister) Goody Two-Shoes 2016. We had an election to choose the official who will head the executive branch of our federal government for the next four years. I’ve read essays by people who know Hillary Clinton and Donald Trump personally, and claim that both of them are actually very pleasant people. You know what? I literally couldn’t care less. I would be just as likely to vote for a surly misanthrope who loathes children, kicks puppies, and has deviant sexual cravings involving household appliances and mayonnaise, if that person supports the policies I want on the issues that matter to me. It really is that simple.
I’d like to suggest, furthermore, that the fixation on personalities—or, again, malicious parodies of personalities—has played a huge role in making politics in the United States so savage, so divisive, and so intractably deadlocked on so many of the things that matter just now. The issues I mentioned a few paragraphs back—US foreign policy toward a resurgent Russia, on the one hand, and US economic policy regarding the offshoring of jobs and the importation of foreign workers—are not only important, they’re issues about which reasonable disagreement is possible. What’s more, they’re issues on which negotiation, compromise, and the working out of a mutually satisfactory modus vivendi between competing interests are also possible, at least in theory.
In practice? Not while each side is insisting at the top of its lungs that the other side is led by a monster of depravity and supported only by people who hate everything good in the world. I’d like to suggest that it’s exactly this replacement of reasoned politics with a pretty close equivalent of the Two Minutes Hate from Orwell’s 1984 that’s among the most important forces keeping this country from solving any of its problems or doing anything to brace itself for the looming crises ahead.
Thus I’d like to encourage all the citizens of my country to turn off the television and the internet for a few moments, take a few deep breaths, and think about the tone of the recent election, and to what extent they might have participated in the bipartisan culture of hatred that filled so much of it. It might be worth pointing out that you’re not likely to convince other people to vote the way you think they ought to vote if you’re simultaneously berating them for being evilly evil with a double helping of evil sauce on the side, or sneering at them for being too ignorant to recognize that voting for your candidate really is in their best interests, or any of the other counterproductive habits that have taken the place of reasonable political discourse in today’s America.
The second point I noticed in the course of the election campaign connects to the one just discussed. That’s the hard fact that the United States at this point in its history may still be a single republic, but it’s not a single nation—and it could be argued on reasonably solid grounds that it never has been. Facile distinctions between “red” and “blue” states barely touch the complexity, much less the depth, of the divisions that separate the great urban centers from the rest of the country, and the different regions from one another.
I think it was Pauline Kael who, in the wake of Richard Nixon’s landslide victory in 1972, commented that she didn’t understand how Nixon could have won—after all, nobody she knew voted for him! The same sentiment is currently being expressed, in tones ranging from bewilderment to baffled rage, from all corners of the affluent left and their hangers-on among the mainstream media’s well-paid punditry. The 20% or so of Americans who have benefited from the jobless recovery of the last eight years, and the broader neoliberal economic agenda of the last four decades, very rarely leave the echo-chamber environments where they spend their days to find out what the rest of the country is thinking. If they’d done so a bit more often in the last year, they would have watched Trump signs sprouting all over the stark landscapes of poverty that have spread so widely in the America they never see.
But of course the divisions run deeper than this, and are considerably more ramified. Compare the political, economic, and social policies that have the approval of people in Massachusetts, say, and those that have the approval of people in Oklahoma, and you’ll find next to no overlap. This isn’t because the people of one state or the other are (insert your insult of choice here); it’s because they belong to different cultures, with incommensurable values, attitudes, and interests. Attempts, well-meaning or otherwise, to impose the mores of either state on the other are guaranteed to result only in hostility and incomprehension—and such attempts have been all too common of late.
Ours is a very diverse country. That may sound like a truism, but it has implications that aren’t usually taken into account. A country with a great deal of cultural uniformity, with a broad consensus of shared values and attitudes, can afford to legislate that consensus on a national basis. A country that doesn’t have that kind of uniformity, that lacks any consensus concerning values and attitudes, very quickly gets into serious trouble if it tries that sort of legislation. If the divergence is serious enough, the only way that reliably allows different nations to function under a single government is a federal system—that is, a system that assigns the national government only those powers and duties that have to be handled on a nationwide basis, while leaving most other questions for local governments and individuals to settle for themselves.
My more historically literate readers will be aware that the United States used to have a federal system—that is, after all, why we still speak of “the federal government.” Under the Constitution as originally written and interpreted, the people of each state had the right to run their own affairs pretty much as they saw fit, within certain very broad limits.  The federal government was assigned certain narrowly defined powers, and all other powers were, in the language of the Tenth Amendment, reserved to the states and the people.
Over the first century and a half of our national history, certain other powers were assigned to the federal government by constitutional amendment, sometimes with good results—the Fourteenth Amendment’s guarantee of equal protection of the laws to all citizens, for example, and the Fifteenth and Nineteenth Amendments’ extension of voting rights to black people and women respectively—and sometimes not—the Eighteenth Amendment’s prohibition of alcohol comes to mind here. The basic federal structure remained intact. Not until the aftermath of the Great Depression and the Second World War did the metastatic growth of the federal government begin in earnest, and so in due time did the various attempts to impose this or that set of moral values on the entire country by force of law.
Those attempts have not worked, and they’re not going to work. I’m not sure how many people have noticed, though, that the election of Donald Trump was not merely a rebuke to the liberal left; it was also a defeat for the religious right. It’s worth recalling that the evangelical wing of the Republican Party had its own favorites in the race for the GOP nomination, and Trump was emphatically not one of them. It has not been a propitious autumn for the movements of left and right whose stock in trade is trying to force their own notion of virtue down the throats of the American people—and maybe, just maybe, that points to the way ahead.
It’s time to consider, I suggest, a renewal of the traditions of American federalism: a systematic devolution of power from the overinflated federal government to the states, and from the states to the people. It’s time for people in Massachusetts to accept that they’re never going to be able to force people in Oklahoma to conform to their notions of moral goodness, and for the people of Oklahoma to accept the same thing about the people of Massachusetts; furthermore, it’s time for government at all levels to give up trying to impose cultural uniformity on the lively diversity of our republic’s many nations, and settle for their proper role of ensuring equal protection under the laws, and those other benefits that governments, by their nature, are best suited to provide for their citizens.
We need a new social compact under which all Americans agree to back away from the politics of personal vilification that dominated all sides in the election just over, let go of the supposed right to force everyone in the country to submit to any one set of social and moral views, and approach the issues that divide us with an eye toward compromise, negotiation, and mutual respect. Most of the problems that face this country could be solved, or at least significantly ameliorated, if our efforts were guided by such a compact—and if that can be done, I suspect that a great many more of us will have the opportunity to experience one of the greatest benefits a political system can bestow: actual, honest-to-goodness liberty. We’ll talk more about that in future posts.

************************
In unrelated and rather less serious news, I’m pleased to announce that the second volume of my Lovecraftian epic fantasy series The Weird of Hali is now available for preorder. Once again, H.P. Lovecraft gets stood on his head, and the tentacled horrors and sinister cultists get the protagonists’ roles; this time the setting is the crumbling seaside town of Kingsport, where Miskatonic University student Jenny Parrish is summoned to attend a certain very ancient festival...
The Weird of Hali: Kingsport, like the first book in the series, The Weird of Hali: Innsmouth, is being released first in two signed and numbered editions, one merely gorgeous, the other leatherbound, traycased, and utterly over the top for connoisseurs of fine printing and binding. There will be a trade paperback edition in due time, but it’ll be a while. Those of my readers who find eldritch nightmares from the crepuscular beginnings of time itself better company than the current crop of American politicians may find it worth a read.

The Last Gasp of the American Dream

Wed, 2016-11-02 08:47
Just at the moment, many of my readers—and of course a great many others as well—are paying close attention to which of the two most detested people in American public life will put a hand on a Bible in January, and preside thereafter over the next four years of this nation’s accelerating decline and fall. That focus is understandable, and not just because both parties have trotted out the shopworn claim that this election, like every other one in living memory, is the most important in our lifetimes. For a change, there are actual issues involved.
  Barring any of the incidents that could throw the election into the House of Representatives, we’ll know by this time next week whether the bipartisan consensus that’s been welded firmly in place in American politics since the election of George W. Bush will stay intact for the next four years. That consensus, for those of my readers who haven’t been paying attention, supports massive giveaways to big corporations and the already affluent, punitive austerity for the poor, malign neglect for the nation’s infrastructure, the destruction of the American working class through federal subsidies for automation and offshoring and tacit acceptance of mass illegal immigration as a means of driving down wages, and a monomaniacally confrontational foreign policy obsessed with the domination of the Middle East by raw military force. Those are the policies that George W. Bush and Barack Obama pursued through four presidential terms, and they’re the policies that Hillary Clinton has supported throughout her political career.
Donald Trump, by contrast, has been arguing against several core elements of that consensus since the beginning of his run for office. Specifically, he’s calling for a reversal of federal policies that support offshoring of jobs, the enforcement of US immigration law, and a less rigidly confrontational stance toward Russia over the war in Syria. It’s been popular all through the current campaign for Clinton’s supporters to insist that nobody actually cares about these issues, and that Trump’s supporters must by definition be motivated by hateful values instead, but that rhetorical gimmick has been a standard thoughtstopper on the left for many years now, and it simply won’t wash. The reason why Trump was able to sweep aside the other GOP candidates, and has a shot at winning next week’s election despite the unanimous opposition of this nation’s political class, is that he’s the first presidential candidate in a generation to admit that the issues just mentioned actually matter.
That was a ticket to the nomination, in turn, because outside the bicoastal echo chamber of the affluent, the US economy has been in freefall for years.  I suspect that a great many financially comfortable people in today’s America have no idea just how bad things have gotten here in the flyover states. The recovery of the last eight years has only benefited the upper 20% or so by income of the population; the rest have been left to get by on declining real wages, while simultaneously having to face skyrocketing rents driven by federal policies that prop up the real estate market, and stunning increases in medical costs driven by Obama’s embarrassingly misnamed “Affordable Care Act.” It’s no accident that death rates from suicide, drug overdose, and alcohol poisoning are soaring just now among working class white people. These are my neighbors, the people I talk with in laundromats and lodge meetings, and they’re being driven to the wall.
Most of the time, affluent liberals who are quick to emote about the sufferings of poor children in conveniently distant corners of the Third World like to brush aside the issues I’ve just raised as irrelevancies. I’ve long since lost track of the number of times I’ve heard people insist that the American working class hasn’t been destroyed, that its destruction doesn’t matter, or that it was the fault of the working classes themselves. (I’ve occasionally heard people attempt to claim all three of these things at once.) On those occasions when the mainstream left deigns to recognize the situation I’ve sketched out, it’s usually in the terms Hillary Clinton used in her infamous “basket of deplorables” speech, in which she admitted that there were people who hadn’t benefited from the recovery and “we need to do something for them.” That the people in question might deserve to have a voice in what’s done for them, or to them, is not part of the vocabulary of the affluent American left.
That’s why, if you pay a visit to the town where I live, you’ll find Trump signs all over the place—and you’ll find the highest concentration of them in the poor neighborhood just south of my home, a bleak rundown zone where there’s a church every few blocks and an abandoned house every few doors, and where the people tipping back beers on a porch of a summer evening rarely all have the same skin color. They know exactly what they need, and what tens of thousands of other economically devastated American communities need: enough full-time jobs at decent wages to give them the chance to lift their families out of poverty. They understand that need, and discuss it in detail among themselves, with a clarity you’ll rarely find in the media. (It’s a source of wry amusement to me that the best coverage of the situation on the ground here in the flyover states appeared, not in any of America’s newspapers of record, nor in any of its allegedly serious magazines, but in a raucous NSFW online humor magazine.)
What’s more, the working class people who point to a lack of jobs as the cause of middle America’s economic collapse are dead right.  The reason why those tens of thousands of American communities are economically devastated is that too few people have enough income to support the small businesses and local economies that used to thrive there. The money that used to keep main streets bustling across the United States, the wages that used to be handed out on Friday afternoons to millions of Americans who’d spent the previous week putting in an honest day’s work for an honest day’s pay, have been siphoned off to inflate the profits of a handful of huge corporations to absurd levels and cater to the kleptocratic feeding frenzy that’s made multimillion-dollar bonuses a matter of course at the top of the corporate food chain. It really is as simple as that. The Trump voters in the neighborhood south of my home may not have a handle on all the details, but they know that their survival depends on getting some of that money flowing back into paychecks to be spent in their community.
It’s an open question whether they’re going to get that if Donald Trump wins the election, and a great many of his supporters know this perfectly well.  It’s as certain as anything can be, though, that they’re not going to get it from Hillary Clinton. The economic policy she’s touted in her speeches, to the extent that this isn’t just the sort of campaign rhetoric that will pass its pull date the moment the last vote is counted, focuses on improving opportunities for the middle class—the people, in other words, who have already reaped the lion’s share of those economic benefits that didn’t go straight into the pockets of the rich. To the working classes, she offers nothing but a repetition of the same empty slogans and disposable promises. What’s more, they know this, and another round of empty slogans and disposable promises isn’t going to change that.
Nor, it probably needs to be said, is it going to be changed by another round of media handwaving designed to make Donald Trump look bad in the eyes of affluent liberals. I’ve noted with some amusement the various news stories on the highbrow end of the media noting, in tones variously baffled and horrified, that when you show Trump supporters videos designed to make them less enthusiastic about their candidate, they double down. Any number of canned theories have been floated to explain why that happens, but none that I’ve heard have dealt with the obvious explanations.
To begin with, it’s not as though that habit is only found on Trump’s side of the fence. In recent weeks, as one Wikileaks email dump after another has forced an assortment of stories about Clinton’s arrogant and corrupt behavior into the news, her followers have doubled down just as enthusiastically as Trump’s; those of my readers who are familiar with the psychology of previous investment will likely notice that emotional investment is just as subject to this law as the financial kind. For that matter, supporters of both candidates are quite sensibly aware that this election is meant to choose a public official rather than a plaster saint, and recognize that a genuine scoundrel who will take the right stands on the issues that matter to them is a better choice than a squeaky-clean innocent who won’t, even if such an animal could actually be found in the grubby ecosystem of contemporary American politics.
That said, there’s another factor that probably plays an even larger role, which is that when working class Americans get told by slickly groomed talking heads in suits that something they believe is wrong, their default assumption is that the talking heads are lying.
Working class Americans, after all, have very good reason for making this their default assumption. Over and over again, that’s the way things have turned out. The talking heads insisted that handing over tax dollars to various corporate welfare queens would bring jobs back to American communities; the corporations in question pocketed the tax dollars and walked away. The talking heads insisted that if working class people went to college at their own expense and got retrained in new skills, that would bring jobs back to American communities; the academic industry profited mightily but the jobs never showed up, leaving tens of millions of people buried so deeply under student loan debt that most of them will never recover financially. The talking heads insisted that this or that or the other political candidate would bring jobs back to American communities by pursuing exactly the same policies that got rid of the jobs in the first place—essentially the same claim that the Clinton campaign is making now—and we know how that turned out.
For that matter, trust in talking heads generally is at an all-time low out here in flyover country. Consider the way that herbal medicine—“God’s medicine” is the usual phrase these days—has become the go-to option for a huge and growing number of devout rural Christians. There are plenty of reasons why that should be happening, but surely one of the most crucial is the cascading loss of faith in the slickly groomed talking heads that sell modern medicine to consumers. Herbs may not be as effective as modern pharmaceuticals in treating major illnesses, to be sure, but they generally don’t have the ghastly side effects that so many pharmaceuticals will give you.  Furthermore, and just as crucially, nobody ever bankrupted their family and ended up on the street because of the high price of herbs.
It used to be, not all that long ago, that the sort of people we’re discussing trusted implicitly in American society and its institutions. They were just as prone as any urban sophisticate to distrust this or that politician or businessperson or cultural figure, to be sure; back in the days when local caucuses and county conventions of the two main political parties still counted for something, you could be sure of hearing raucous debates about a galaxy of personalities and issues. Next to nobody, though, doubted that the basic structures of American society were not merely sound, but superior to all others.
You won’t find that certainty in flyover country these days. Where you hear such claims made at all, they’re phrased in the kind of angry and defensive terms that lets everyone know that the speaker is trying to convince himself of something he doesn’t entirely believe any more, or in the kind of elegiac tones that hearken back to an earlier time when things still seemed to work—when the phrase “the American Dream” still stood for a reality that many people had experienced and many more could expect to achieve for themselves and their children. Very few people out here think of the federal government as anything more than a vast mechanism operated by rich crooks for their own benefit, at the expense of everyone else. What’s more, the same cynical attitude is spreading to embrace the other institutions of American society, and—lethally—the ideals from which those institutions get whatever legitimacy they still hold in the eyes of the people.
Those of my readers who were around in the late 1980s and early 1990s have seen this movie before, though it came with Cyrillic subtitles that time around. By 1985 or so, it had become painfully obvious to most citizens of the Soviet Union that the grand promises of Marxism would not be kept and the glorious future for which their grandparents and great-grandparents had fought and labored was never going to arrive. Glowing articles in Pravda and Izvestia insisted that everything was just fine in the Worker’s Paradise; annual five-year plans presupposed that economic conditions would get steadily better while, for most people, economic conditions got steadily worse; vast May Day parades showed off the Soviet Union’s military might, Soyuz spacecraft circled the globe to show off its technological prowess, and tame intellectuals comfortably situated in the more affluent districts of Moscow and Leningrad, looking forward to their next vacation at their favorite Black Sea resort, chattered in print about the good life under socialism, while millions of ordinary Soviet citizens trudged through a bleak round of long lines, product shortages, and system-wide dysfunction. Then crisis hit, and the great-great-grandchildren of the people who surged to the barricades during the Russian Revolution shrugged, and let the Soviet Union unravel in a matter of days.
I suspect we’re much closer to a similar cascade of events here in the United States than most people realize. My fellow peak oil blogger Dmitry Orlov pointed out a decade or so back, in a series of much-reprinted blog posts and his book Reinventing Collapse, that the differences between the Soviet Union and the United States were far less important than their similarities, and that a Soviet-style collapse was a real possibility here—a possibility for which most Americans are far less well prepared than their Russian equivalents in the early 1990s. His arguments have become even more compelling as the years have passed, and the United States has sunk ever deeper into a mire of institutional dysfunction and politico-economic kleptocracy all but indistinguishable from the one that eventually swallowed its erstwhile rival.
Point by point, the parallels stand out. We’ve got the news articles insisting, in tones by turns glowing and shrill, that things have never been better in the United States and anyone who says otherwise is just plain wrong; we’ve got the economic pronouncements predicated on continuing growth at a time when the only things growing in the US economy are its total debt load and the number of people who are permanently unemployed; we’ve got the overblown displays of military might and technological prowess, reminiscent of nothing so much as the macho posturing of balding middle-aged former athletes who are trying to pretend that they haven’t lost it; we’ve got the tame intellectuals comfortably situated in the more affluent suburban districts around Boston, New York, Washington, and San Francisco, looking forward to their next vacation in whatever the currently fashionable spot might happen to be, babbling on the internet about the good life under predatory cybercapitalism.
Meanwhile millions of Americans trudge through a bleak round of layoffs, wage cuts, part-time jobs at minimal pay, and system-wide dysfunction. The crisis hasn’t hit yet, but those members of the political class who think that the people who used to be rock-solid American patriots will turn out en masse to keep today’s apparatchiks secure in their comfortable lifestyles have, as the saying goes, another think coming.  Nor is it irrelevant that most of the enlisted personnel in the armed forces, who are the US government’s ultimate bulwark against popular unrest, come from the very classes that have lost faith most drastically in the American system. The one significant difference between the Soviet case and the American one at this stage of the game is that Soviet citizens had no choice but to accept the leaders the Communist Party of the USSR foisted off on them, from Brezhnev to Andropov to Chernenko to Gorbachev, until the system collapsed of its own weight.
American citizens, on the other hand, do at least potentially have a choice. Elections in the United States have been riddled with fraud for most of two centuries, but since both parties are generally up to their eyeballs in voter fraud to a roughly equal degree, fraud mostly swings close elections.  It’s still possible for a sufficiently popular candidate to overwhelm the graveyard vote, the crooked voting machines, and the other crass realities of American elections by sheer force of numbers. That way, an outsider unburdened with the echo-chamber thinking of a dysfunctional elite might just be able to elbow his way into the White House. Will that happen this time? No one knows.
If George W. Bush was our Leonid Brezhnev, as I’d suggest, and Barack Obama is our Yuri Andropov, Hillary Clinton is running for the position of Konstantin Chernenko; her running mate Tim Kaine, in turn, is waiting in the wings as a suitably idealistic and clueless Mikhail Gorbachev, under whom the whole shebang can promptly go to bits. While I don’t seriously expect the trajectory of the United States to parallel that of the Soviet Union anything like as precisely as this satiric metaphor would suggest, the basic pattern of cascading dysfunction ending in political collapse is quite a common thing in history, and a galaxy of parallels suggests that the same thing could very easily happen here within the next decade or so. The serene conviction among the political class and their affluent hangers-on that nothing of the sort could possibly take place is just another factor making it more likely.
It’s by no means certain that a Trump presidency will stop that from happening, and jolt the United States far enough out of its current death spiral to make it possible to salvage something from the American experiment. Even among Trump’s most diehard supporters, it’s common to find people who cheerfully admit that Trump might not change things enough to matter; it’s just that when times are desperate enough—and out here in the flyover states, they are—a leap in the dark is preferable to the guaranteed continuation of the unendurable.
Thus the grassroots movement that propelled Trump to the Republican nomination in the teeth of the GOP establishment, and has brought him to within a couple of aces of the White House in the teeth of the entire US political class, might best be understood as the last gasp of the American dream. Whether he wins or loses next week, this country is moving into the darkness of an uncharted night—and it’s not out of place to wonder, much as Hamlet did, what dreams may come in that darkness.

The Future Hiding in Plain Sight

Wed, 2016-10-19 14:05
Carl Jung used to argue that meaningful coincidences—in his jargon, synchronicity—were as important as cause and effect in shaping the details of human life. Whether that’s true in the broadest sense, I’ll certainly vouch for the fact that they’re a constant presence in The Archdruid Report. Time and again, just as I sit down to write a post on some theme, somebody sends me a bit of data that casts unexpected light on that very theme.

Last week was a case in point. Regular readers will recall that the theme of last week’s post was the way that pop-culture depictions of deep time implicitly erase the future by presenting accounts of Earth’s long history that begin billions of years ago and end right now. I was brooding over that theme a little more than a week ago, chasing down details of the prehistoric past and the posthistoric future, when one of my readers forwarded me a copy of the latest Joint Operating Environment report by the Pentagon—JOE-35, to use the standard jargon—which attempts to predict the shape of the international environment in which US military operations will take place in 2035, and mostly succeeds in providing a world-class example of the same blindness to the future I discussed in my post.

The report can be downloaded in PDF form here, and is worth reading in full. It covers quite a bit of ground, and a thorough response to it would be more the size of a short book than a weekly blog post. The point I want to discuss this week is its identification of six primary “contexts for conflict” that will shape the military environment of the 2030s:
“1. Violent Ideological Competition. Irreconcilable ideas communicated and promoted by identity networks through violence.” That is, states and non-state actors alike will pursue their goals by spreading ideologies hostile to US interests and encouraging violent acts to promote those ideologies.
“2. Threatened U.S. Territory and Sovereignty. Encroachment, erosion, or disregard of U.S. sovereignty and the freedom of its citizens from coercion.” That is, states and non-state actors will attempt to carry out violent acts against US citizens and territory. 
“3. Antagonistic Geopolitical Balancing. Increasingly ambitious adversaries maximizing their own influence while actively limiting U.S. influence.” That is, rival powers will pursue their own interests in conflict with those of the United States.
“4. Disrupted Global Commons. Denial or compulsion in spaces and places available to all but owned by none.”  That is, the US will no longer be able to count on unimpeded access to the oceans, the air, space, and the electromagnetic spectrum in the pursuit of its interests.
“5. A Contest for Cyberspace. A struggle to define and credibly protect sovereignty in cyberspace.” That is, US cyberwarfare measures will increasingly face effective defenses and US cyberspace assets will increasingly face effective hostile incursions.
“6. Shattered and Reordered Regions. States unable to cope with internal political fractures, environmental stressors, or deliberate external interference.”  That is, states will continue to be overwhelmed by the increasingly harsh pressures on national survival in today’s world, and the resulting failed states and stateless zones will spawn insurgencies and non-state actors hostile to the US.
Apparently nobody at the Pentagon noticed one distinctly odd thing about this outline of the future context of American military operations: it’s not an outline of the future at all. It’s an outline of the present. Every one of these trends is a major factor shaping political and military action around the world right now. JOE-35 therefore assumes, first, that each of these trends will remain locked in place without significant change for the next twenty years, and second, that no new trends of comparable importance will emerge to reshape the strategic landscape between now and 2035. History suggests that both of these are very, very risky assumptions for a great power to make.
It so happens that I have a fair number of readers who serve in the US armed forces just now, and a somewhat larger number who serve in the armed forces of other countries more or less allied with the United States. (I may have readers serving with the armed forces of Russia or China as well, but they haven’t announced themselves—and I suspect, for what it’s worth, that they’re already well acquainted with the points I intend to make.) With those readers in mind, I’d like to suggest a revision to JOE-35, which will take into account the fact that history can’t be expected to stop in its tracks for the next twenty years, just because we want it to. Once that’s included in the analysis, at least five contexts of conflict not mentioned by JOE-35 stand out from the background:
1. A crisis of legitimacy in the United States. Half a century ago, most Americans assumed as a matter of course that the United States had the world’s best, fairest, and most democratic system of government; only a small minority questioned the basic legitimacy of the institutions of government or believed they would be better off under a different system. Since the late 1970s, however, federal policies that subsidized automation and the offshoring of industrial jobs, and tacitly permitted mass illegal immigration to force down wages, have plunged the once-proud American working class into impoverishment and immiseration. While the wealthiest 20% or so of Americans have prospered since then, the other 80% of the population has experienced ongoing declines in standards of living.
The political impact of these policies has been amplified by a culture of contempt toward working class Americans on the part of the affluent minority, and an insistence that any attempt to discuss economic and social impacts of automation, offshoring of jobs, and mass illegal immigration must be dismissed out of hand as mere Luddism, racism, and xenophobia. As a direct consequence, a great many working class Americans—in 1965, by and large, the sector of the public most loyal to American institutions—have lost faith in the US system of government. This shift in values has massive military as well as political implications, since working class Americans are much more likely than others to own guns, to have served in the military, and to see political violence as a potential option.
Thus a domestic insurgency in the United States is a real possibility at this point.  Since, as already noted, working class Americans are disproportionately likely to serve in the military, planning for a domestic insurgency in the United States will have to face the possibility that such an insurgency will include veterans familiar with current counterinsurgency doctrine. It will also have to cope with the risk that National Guard and regular armed forces personnel sent to suppress such an insurgency will go over to the insurgent side, transforming the insurgency into a civil war.
As some wag has pointed out, the US military is very good at fighting insurgencies but not so good at defeating them, and the fate of Eastern Bloc nations after the fall of the Soviet Union shows just how fast a government can unravel once its military personnel turn against it. Furthermore, since the crisis of legitimacy is driven by policies backed by a bipartisan consensus, military planners can only deal with the symptoms of a challenge whose causes are beyond their control.
2. The marginalization of the United States in the global arena. Twenty years ago the United States was the world’s sole superpower, having triumphed over the Soviet Union, established a rapprochement with China, and marginalized such hostile Islamic powers as Iran. Those advantages did not survive two decades of overbearing and unreliable US policy, which not only failed to cement the gains of previous decades but succeeded in driving Russia and China, despite their divergent interests and long history of conflict, into an alliance against the United States. Future scholars will likely consider this to be the worst foreign policy misstep in our nation’s history.
Iran’s alignment with the Sino-Russian alliance and, more recently, overtures from the Philippines and Egypt, track the continuation of this trend, as do the establishment of Chinese naval bases across the Indian Ocean from Myanmar to the Horn of Africa, and most recently, Russian moves to reestablish overseas bases in Syria, Egypt, Vietnam, and Cuba. Russia and China are able to approach foreign alliances on the basis of a rational calculus of mutual interest, rather than the dogmatic insistence on national exceptionalism that guides so much of US foreign policy today. This allows them to offer other nations, including putative US allies, better deals than the US is willing to concede.
As a direct result, barring a radical change in its foreign policy, the United States in 2035 will be marginalized by a new global order centered on Beijing and Moscow, denied access to markets and resources by trade agreements hostile to its interests, and forced to struggle to maintain influence even over its “near abroad.” It is unwise to assume, as some current strategists do, that China’s current economic problems will slow that process.  Some European leaders in the 1930s, Adolf Hitler among them, assumed that the comparable boom-bust cycle the United States experienced in the 1920s and 1930s meant that the US would be a negligible factor in the European balance of power in the 1940s.  I think we all know how that turned out.
Here again, barring a drastic change in US foreign policy, military planners will be forced to deal with the consequences of unwelcome shifts without being able to affect the causes of those shifts.  Careful planning can, however, redirect resources away from global commitments that will not survive the process of marginalization, and toward securing the “near abroad” of the United States and withdrawing assets to the continental US to keep them from being compromised by former allies.
3. The rise of “monkeywrenching” warfare. The United States has the most technologically complex military in the history of war. While this is normally considered an advantage, it brings with it no shortage of liabilities. The most important of these is the vulnerability of complex technological systems to “monkeywrenching”—that is, strategies and tactics targeting technological weak points in order to degrade the capacities of a technologically superior force.  The more complex a technology is, as a rule, the wider the range of monkeywrenching attacks that can interfere with it; the more integrated a technology is with other technologies, the more drastic the potential impacts of such attacks. The complexity and integration of US military technology make it a monkeywrencher’s dream target, and current plans for increased complexity and integration will only heighten the risks.
The risks created by the emergence of monkeywrenching warfare are heightened by an attitude that has deep roots in the culture of US military procurement:  the unquestioned assumption that innovation is always improvement. This assumption has played a central role in producing weapons systems such as the F-35 Joint Strike Fighter, which is so heavily burdened with assorted innovations that it has a much shorter effective range, a much smaller payload, and much higher maintenance costs than competing Russian and Chinese fighters. In effect, the designers of the F-35 were so busy making it innovative that they forgot to make it work. The same thing can be said about many other highly innovative but dubiously effective US military technologies.
Problems caused by excessive innovation can to some extent be anticipated and countered by US military planners. What makes monkeywrenching attacks by hostile states and non-state actors so serious a threat is that it may not be possible to predict them in advance. While US intelligence assets should certainly make every effort to identify monkeywrenching technologies and tactics before they are used, US forces must be aware that at any moment, critical technologies may be put out of operation or turned to the enemy’s advantage without warning. Rigorous training in responding to technological failure, and redundant systems that can operate independently of existing networks, may provide some protection against monkeywrenching, but the risk remains grave.
4. The genesis of warband culture in failed states. While JOE-35 rightly identifies the collapse of weak states into failed-state conditions as a significant military threat, a lack of attention to the lessons of history leads its authors to neglect the most serious risk posed by the collapse of states in a time of general economic retrenchment and cultural crisis. That risk is the emergence of warband culture—a set of cultural norms that dominate the terminal periods of most recorded civilizations and the dark ages that follow them, and play a central role in the historical transformation to dark age conditions.
Historians use the term “warband” to describe a force of young men whose only trade is violence, gathered around a charismatic leader and supporting itself by pillage. While warbands tend to come into being whenever public order collapses or has not yet been imposed, the rise of a self-sustaining warband culture requires a prolonged period of disorder in which governments either do not exist or cannot establish their legitimacy in the eyes of the governed, and warbands become accepted as the de facto governments of territories of various size. Once this happens, the warbands inevitably begin to move outward; the ethos and the economics of the warband alike require access to plunder, and this can best be obtained by invading regions not yet reduced to failed-state conditions, thus spreading the state of affairs that fosters warband culture in the first place.
Most civilizations have had to contend with warbands in their last years, and the record of attempts to quell them by military force is not good. At best, a given massing of warbands can be defeated and driven back into whatever stateless area provides them with their home base; a decade or two later, they can be counted on to return in force. Systematic attempts to depopulate their home base simply drive them into other areas, causing the collapse of public order there. Once warband culture establishes itself solidly on the fringes of a civilization, history suggests, the entire civilized area will eventually be reduced to failed-state conditions by warband incursions, leading to a dark age. Nothing guarantees that the modern industrial world is immune from this same process.
The spread of failed states around the periphery of the industrial world is thus an existential threat not only to the United States but to the entire project of modern civilization. What makes this a critical issue is that US foreign policy and military actions have repeatedly created failed states in which warband culture can flourish:  Afghanistan, Iraq, Syria, Libya, and Ukraine are only the most visible examples. Elements of US policy toward Mexico—for example, the “Fast and Furious” gunrunning scheme—show worrisome movement in the same direction. Unless these policies are reversed, the world of 2035 may face conditions like those that have ended civilization more than once in the past.
5. The end of the Holocene environmental optimum. All things considered, the period since the final melting of the great ice sheets some six millennia ago has been extremely propitious for the project of human civilization. Compared to previous epochs, the global climate has been relatively stable, and sea levels have changed only slowly. Furthermore, the globe six thousand years ago was stocked with an impressive array of natural resources, and the capacity of its natural systems to absorb sudden shocks had not been challenged on a global level for some sixty-five million years.
None of those conditions remains the case today. Ongoing dumping of greenhouse gases into the atmosphere is rapidly destabilizing the global climate, and triggering ice sheet melting in Greenland and Antarctica that promises to send sea levels up sharply in the decades and centuries ahead. Many other modes of pollution are disrupting natural systems in a galaxy of ways, triggering dramatic environmental changes. Meanwhile breakneck extraction is rapidly depleting the accessible stocks of hundreds of different nonrenewable resources, each of them essential to some aspect of contemporary industrial society, and the capacity of natural systems to cope with the cascading burdens placed upon them by human action has already reached the breaking point in many areas.
The end of the Holocene environmental optimum—the era of relative ecological stability in which human civilization has flourished—is likely to be a prolonged process. By 2035, however, current estimates suggest that the initial round of impacts will be well under way. Shifting climate belts will cause agricultural failure; rising sea levels will impose drastic economic burdens on coastal communities and the nations to which they belong; rising real costs for resource extraction will drive price spikes and demand destruction; and increasingly intractable conflicts will pit states, non-state actors, and refugee populations against one another for remaining supplies of fuel, raw materials, topsoil, food, and water.
US military planners will need to take increasingly hostile environmental conditions into account. They will also need to prepare for mass movements of refugees out of areas of flooding, famine, and other forms of environmental disruption, on a scale exceeding current refugee flows by orders of magnitude. Finally, since the economic impact of these shifts on the United States will affect the nation’s ability to provide necessary resources for its military, plans for coping with cascading environmental crises will have to take into account the likelihood that the resources needed to do so may be in short supply.
Those are the five contexts for conflict I foresee. What makes them even more challenging than they would otherwise be, of course, is that none of them occur in a vacuum, and each potentially feeds into the others. Thus, for example, it would be in the national interest of Russia and/or China to help fund and supply a domestic insurgency in the United States (contexts 1 and 2); emergent warbands may well be able to equip themselves with the necessary gear to engage in monkeywrenching attacks against US forces sent to contain them (contexts 4 and 3); disruptions driven by environmental change will likely help foster the process of warband formation (contexts 5 and 4), and so on.
That’s the future hiding in plain sight: the implications of US policies in the present and recent past, taken to their logical conclusions. The fact that current Pentagon assessments of the future remain so tightly fixed on the phenomena of the present, with no sense of where those phenomena lead, gives me little hope that any of these bitter outcomes will be avoided.

****************
There will be no regularly scheduled Archdruid Report next week. Blogger's latest security upgrade has made it impossible for me to access this blog while I'm out of town, and I'll be on the road (and my backup moderator unavailable) for a good part of what would be next week's comment cycle. I've begun the process of looking for a new platform for my blogs, and I'd encourage any of my readers who rely on Blogger or any other Google product to look for alternatives before you, too, get hit by an "upgrade" that makes it more trouble to use than it's worth.

An Afternoon in Early Autumn

Wed, 2016-10-12 14:43
I think it was the late science writer Stephen Jay Gould who coined the term “deep time” for the vast panorama opened up to human eyes by the last three hundred years or so of discoveries in geology and astronomy. It’s a useful label for an even more useful concept. In our lives, we deal with time in days, seasons, years, decades at most; decades, centuries and millennia provide the yardsticks by which the life cycles of human societies—that is to say, history, in the usual sense of that word—are traced.
Both these, the time frame of individual lives and the time frame of societies, are anthropocentric, as indeed they should be; lives and societies are human things and require a human measure. When that old bamboozler Protagoras insisted that “man is the measure of all things,” though, he uttered a subtle truth wrapped in a bald-faced lie.* The subtle truth is that since we are what we are—that is to say, social primates who have learned a few interesting tricks—our capacity to understand the cosmos is strictly limited by the perceptions that human nervous systems are capable of processing and the notions that human minds are capable of thinking. The bald-faced lie is the claim that everything in the cosmos must fit inside the perceptions human beings can process and the notions they can think.
(*No, none of this has to do with gender politics. The Greek language, unlike modern English, had a common gender-nonspecific noun for “human being,” anthropos, which was distinct from andros, “man,” and gyne, “woman.” The word Protagoras used was anthropos.)
It took the birth of modern geology to tear through the veil of human time and reveal the stunningly inhuman scale of time that measures the great cycles of the planet on which we live. Last week’s post sketched out part of the process by which people in Europe and the European diaspora, once they got around to noticing that the Book of Genesis is about the Rock of Ages rather than the age of rocks, struggled to come to terms with the immensities that geological strata revealed. To my mind, that was the single most important discovery our civilization has made—a discovery with which we’re still trying to come to terms, with limited success so far, and one that I hope we can somehow manage to hand down to our descendants in the far future.
The thing that makes deep time difficult for many people to cope with is that it makes self-evident nonsense out of any claim that human beings have any uniquely important place in the history of the cosmos. That wouldn’t be a difficulty at all, except that the religious beliefs most commonly held in Europe and the European diaspora make exactly that claim.
That last point deserves some expansion here, not least because a minority among the current crop of “angry atheists” have made a great deal of rhetorical hay by insisting that all religions, across the board, point for point, are identical to whichever specific religion they themselves hate the most—usually, though not always, whatever Christian denomination they rebelled against in their adolescent years. That insistence is a fertile source of nonsense, and never so much as when it turns to the religious implications of time.
The conflict between science and religion over the age of the Earth is a purely Western phenomenon.  Had the great geological discoveries of the eighteenth and nineteenth centuries taken place in Japan, say, or India, the local religious authorities wouldn’t have turned a hair. On the one hand, most Asian religious traditions juggle million-year intervals as effortlessly as any modern cosmologist; on the other, Asian religious traditions have by and large avoided the dubious conviction, enshrined in most (though not all) versions of Christianity, that the Earth and everything upon it exists solely as a stage on which the drama of humanity’s fall and redemption plays out over a human-scaled interval of time. The expansive Hindu cosmos with its vast ever-repeating cycles of time, the Shinto concept of Great Nature as a continuum within which every category of being has its rightful place, and other non-Western worldviews offer plenty of room for modern geology to find a home.
Ironically, though, the ongoing decline of mainstream Christianity as a cultural influence in the Western world hasn’t done much to lessen the difficulty most people in the industrial world feel when faced with the abysses of deep time. The reason here is simply that the ersatz religion that’s taken the place of Christianity in the Western imagination also tries to impose a rigid ideological scheme not only on the ebb and flow of human history, but on the great cycles of the nonhuman cosmos as well. Yes, that would be the religion of progress—the faith-based conviction that human history is, or at least ought to be, a straight line extending onward and upward from the caves to the stars.
You might think, dear reader, that a belief system whose followers like to wallow in self-praise for their rejection of the seven-day creation scheme of the Book of Genesis and their embrace of deep time in the past would have a bit of a hard time evading its implications for the future. Let me assure you that this seems to give most of them no trouble at all. From Ray Kurzweil’s pop-culture mythology of the Singularity—a straightforward rewrite of Christian faith in the Second Coming dolled up in science-fiction drag—straight through to the earnest space-travel advocates who insist that we’ve got to be ready to abandon the solar system when the sun turns into a red giant four billion years from now, a near-total aversion to thinking about the realities of deep time ahead of us is astonishingly prevalent among those who think they’ve grasped the vastness of Earth’s history.
I’ve come to think that one of the things that feeds this curious quirk of collective thinking is a bit of trivia to be found in a great many books on geology and the like—the metaphor that turns the Earth’s entire history into a single year, starting on January 1 with the planet’s formation out of clouds of interstellar dust and ending at midnight on December 31, which is always right now.
That metaphor has been rehashed more often than the average sitcom plot. A quick check of the books in the study where I’m writing this essay finds three different versions, one written in the 1960s, one in the 1980s, and one a little more than a decade ago. The dates of various events dance around the calendar a bit as new discoveries rewrite this or that detail of the planet’s history, to be sure; when I was a dinosaur-crazed seven-year-old, the Earth was only three and a half billion years old and the dinosaurs died out seventy million years ago, while the latest research I know of revises those dates to 4.6 billion years and 65 million years respectively, moving the date of the end-Cretaceous extinction from December 24 to December 26—in either case, a wretched Christmas present for small boys. Such details aside, the basic metaphor remains all but unchanged.
There’s only one problem with it, but it’s a whopper. Ask yourself this: what has gotten left out of that otherwise helpful metaphor? The answer, of course, is the future.
Let’s imagine, by contrast, a metaphor that maps the entire history of life on earth, from the first living thing on this planet to the last, onto a single year. We don’t know exactly when life will go extinct on this planet, but then we don’t know exactly when it emerged, either; the most recent estimate I know of puts the origin of  terrestrial life somewhere a little more than 3.7 billion years ago, and the point at which the sun’s increasing heat will finally sterilize the planet somewhere a little more than 1.2 billion years from now. Adding in a bit of rounding error, we can set the lifespan of our planetary biosphere at a nice round five billion years. On that scale, a month of thirty days is 411 million years, a single day is 13.7 million years, an hour is around 571,000 years, a minute is around 9514 years, and a second is 158 years and change. Our genus, Homo,* evolved maybe two hours ago, and all of recorded human history so far has taken up a little less than 32 seconds.
(*Another gender-nonspecific word for “human being,” this one comes from Latin, and is equally distinct from vir, “man,” and femina, “woman.” English really does need to get its act together.)
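For readers who like to check this sort of arithmetic, here is a minimal back-of-the-envelope sketch in Python. Its only inputs are the rounded five-billion-year lifespan and a 365-day year—nothing here comes from anywhere but the paragraph above—and its output matches the figures just given to within ordinary rounding.

```python
# A minimal sketch (assuming nothing beyond the essay's own round numbers):
# map the biosphere's ~5-billion-year lifespan onto a single 365-day year.
TOTAL_YEARS = 5_000_000_000      # assumed total lifespan of Earth's biosphere
DAYS_PER_YEAR = 365

years_per_day    = TOTAL_YEARS / DAYS_PER_YEAR   # ~13.7 million years
years_per_month  = years_per_day * 30            # ~411 million years
years_per_hour   = years_per_day / 24            # ~571,000 years
years_per_minute = years_per_hour / 60           # ~9,514 years
years_per_second = years_per_minute / 60         # ~158.6 years

print(f"one 30-day month = {years_per_month:,.0f} years")
print(f"one day          = {years_per_day:,.0f} years")
print(f"one hour         = {years_per_hour:,.0f} years")
print(f"one minute       = {years_per_minute:,.0f} years")
print(f"one second       = {years_per_second:,.1f} years")
print(f"recorded history (~5,000 years) = about {5_000 / years_per_second:.0f} seconds")
```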
That all corresponds closely to the standard metaphor. The difference comes in when you glance at the calendar and find out that the present moment in time falls not on December 31 or any other similarly momentous date, but on an ordinary, undistinguished day—by my back-of-the-envelope calculation, it would be September 26.
I like to imagine our time, along these lines, as an instant during an early autumn afternoon in the great year of Earth’s biosphere. Like many another late September day, it’s becoming uncomfortably hot, and billowing dark clouds stand on the horizon, heralds of an oncoming storm. We human mayflies, with a lifespan averaging maybe half a second, dart here and there, busy with our momentary occupations; a few of us now and then lift our gaze from our own affairs and try to imagine the cold bare fields of early spring, the sultry air of summer evenings, or the rigors of a late autumn none of us will ever see.
With that in mind, let’s put some other dates onto the calendar. While life began on January 1, multicellular life didn’t get started until sometime in the middle of August—for almost two-thirds of the history of life, Earth was a planet of bacteria and blue-green algae, and in terms of total biomass, it arguably still is.  The first primitive plants and invertebrate animals ventured onto the land around August 25; the terrible end-Permian extinction crisis, the worst the planet has yet experienced, hit on September 8; the dinosaurs perished in the small hours of September 22, and the last ice age ended just over a minute ago, having taken place over some twelve and a half minutes.
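The same arithmetic will place any event on the calendar once you know roughly how many years before the present it happened. The sketch below uses the essay’s round figures—life beginning 3.7 billion years ago out of a five-billion-year total—while the event ages themselves are standard approximations I’ve supplied for illustration; with inputs this rough, the printed dates land within a day or two of the ones given above, and the present moment comes out at the very end of September rather than precisely on the 26th.

```python
# A rough sketch, not the essay's own calculation: place events on the
# biosphere's calendar year, assuming life began 3.7 billion years ago and
# the whole run lasts 5 billion years. Event ages (years before present)
# are standard round figures supplied here for illustration.
from datetime import date, timedelta

TOTAL_YEARS = 5_000_000_000
LIFE_BEGAN_YEARS_AGO = 3_700_000_000

def calendar_date(years_before_present):
    elapsed = LIFE_BEGAN_YEARS_AGO - years_before_present  # years since Jan 1
    days = elapsed / TOTAL_YEARS * 365
    return date(2001, 1, 1) + timedelta(days=days)         # any non-leap year

events = {
    "first land plants and invertebrates": 450_000_000,
    "end-Permian extinction":              252_000_000,
    "end-Cretaceous extinction":            66_000_000,
    "end of the last ice age":                  11_700,
    "the present moment":                            0,
}
for name, years_ago in events.items():
    print(f"{name:36s} -> {calendar_date(years_ago):%B %d}")
```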
Now let’s turn and look in the other direction. The last ice age was part of a glacial era that began a little less than two hours ago and can be expected to continue through the morning of the 27th—on our time scale, they happen every two and a half weeks or so, and the intervals between them are warm periods when the Earth is a jungle planet and glaciers don’t exist. Our current idiotic habit of treating the atmosphere as a gaseous sewer will disrupt that cycle for only a very short time; our ability to dump greenhouse gases into the atmosphere will end in less than a second as readily accessible fossil fuel reserves are exhausted, and it will take rather less than a minute thereafter for natural processes to scrub the excess CO2 from the atmosphere and return the planet’s climate to its normal instability.
Certain other consequences of our brief moment of absurd extravagance will last longer.  On our timescale, the process of radioactive decay will take around half an hour (that is to say, a quarter million years or so) to reduce high-level nuclear waste all the way to harmlessness. It will take an interval of something like the same order of magnitude before all the dead satellites in high orbits have succumbed to the complex processes that will send them to a fiery fate in Earth’s atmosphere, and quite possibly longer for the constant rain of small meteorites onto the lunar surface to pound the Apollo landers and other space junk there to unrecognizable fragments. Given a few hours of the biosphere’s great year, though, everything we are and everything we’ve done will be long gone.
Beyond that, the great timekeeper of Earth’s biosphere is the Sun. Stars increase in their output of heat over most of their life cycle, and the Sun is no exception. The single-celled chemosynthetic organisms that crept out of undersea hot springs in February or March of the great year encountered a frozen world, lit by a pale white Sun whose rays gave far less heat than today; one of the most severe ice ages yet known, the Cryogenian glaciation of the late Precambrian period, was apparently cold enough to freeze the oceans solid and wrap most of the planet in ice. By contrast, toward the middle of November in the distant Neozoic Era, the Sun will be warmer and yellower than it is today, and glacial eras will likely involve little more than the appearance of snow on a few high mountains normally covered in jungle.
Thus the Earth will gradually warm through October and November.  Temperatures will cycle up and down with the normal cycles of planetary climate, but each warm period will tend to be a little warmer than the last, and each cold period a little less frigid. Come December, most of a billion years from now, as the heat climbs past one threshold after another, more and more of the Earth’s water will evaporate and, as dissociated oxygen and hydrogen atoms, boil off into space; the Earth will become a desert world, with life clinging to existence at the poles and in fissures deep underground, until finally the last salt-crusted seas run dry and the last living things die out.
And humanity? The average large vertebrate genus lasts something like ten million years—in our scale, something over seventeen hours. As already noted, our genus has only been around for about two hours so far, so it’s statistically likely that we still have a good long run ahead of us. I’ve discussed in these essays several times already the hard physical facts that argue that we aren’t going to go to the stars, or even settle other planets in this solar system, but that’s nothing we have to worry about. Even if we have an improbably long period of human existence ahead of us—say, the fifty million years that bats of the modern type have been around, some three and a half days in our scale, or ten thousand times the length of all recorded human history to date—the Earth will be burgeoning with living things, and perfectly capable of supporting not only intelligent life but rich, complex, unimaginably diverse civilizations, long after we’ve all settled down to our new careers as fossils.
This does not mean, of course, that the Earth will be capable of supporting the kind of civilization we have today. It’s arguably not capable of supporting that kind of civilization now.  Certainly the direct and indirect consequences of trying to maintain the civilization we’ve got, even for the short time we’ve made that attempt so far, are setting off chains of consequences that don’t seem likely to leave much of it standing for long. That doesn’t mean we’re headed back to the caves, or for that matter, back to the Middle Ages—these being the two bogeymen that believers in progress like to use when they’re trying to insist that we have no alternative but to keep on stumbling blindly ahead on our current trajectory, no matter what.
What it means, instead, is that we’re headed toward something that’s different—genuinely, thoroughly, drastically different. It won’t just be different from what we have now; it’ll also be different from the rigidly straight-line extrapolations and deus ex machina fauxpocalypses that people in industrial society like to use to keep from thinking about the future we’re making for ourselves. Off beyond the dreary Star Trek fantasy of metastasizing across the galaxy, and the equally hackneyed Mad Max fantasy of pseudomedieval savagery, lies the astonishing diversity of the future before us: a future potentially many orders of magnitude longer than all of recorded history to date, in which human beings will live their lives and understand the world in ways we can’t even imagine today.
It’s tolerably common, when points like the one I’ve tried to make here get raised at all, for people to insist that paying attention to the ultimate fate of the Earth and of our species is a recipe for suicidal depression or the like. With all due respect, that claim seems silly to me. Each one of us, as we get out of bed in the morning, realizes at some level that the day just beginning will bring us one step closer to old age and death, and yet most of us deal with that reality without too much angst.
In the same way, I’d like to suggest that it’s past time for the inmates of modern industrial civilization to grow up, sprout some gonads—either kind, take your pick—and deal with the simple, necessary, and healthy realization that our species is not going to be around forever. Just as maturity in the individual arrives when it sinks in that human life is finite, collective maturity may just wait for a similar realization concerning the life of the species. That kind of maturity would be a valuable asset just now, not least because it might help us grasp some of the extraordinary possibilities that will open up as industrial civilization finishes its one-way trip down the chute marked “decline and fall” and the deindustrial future ahead of us begins to take shape.

The Myth of the Anthropocene

Wed, 2016-10-05 20:00
To explore the messy future that modern industrial society is making for itself, it’s necessary now and again to stray into some of the odd corners of human thought. Over the decade and a bit that this blog has been engaged in that exploration, accordingly, my readers and I have gone roaming through quite an assortment of topics—politics, religion, magic, many different areas of history, at least as many sciences, and the list goes on. This week, it’s time to ramble through geology, for reasons that go back to some of the basic presuppositions of our culture, and reach forward from there to the far future.
Over the last few years, a certain number of scientists, climate activists, and talking heads in the media have been claiming that the Earth has passed out of its previous geological epoch, the Holocene, into a new epoch, the Anthropocene. Their argument is straightforward: human beings have become a major force shaping geology, and that unprecedented reality requires a new moniker. Last I heard, the scholarly body that authorizes formal changes to that end of scientific terminology hasn’t yet approved the new term for official use, but it’s seeing increasing use in less formal settings.
I’d like to suggest that the proposed change is a mistake, and that the label “Anthropocene” should go into whatever circular file holds phlogiston, the luminiferous ether, and other scientific terms that didn’t turn out to represent realities. That’s not because I doubt that human beings are having a major impact on geology just now, far from it.  My reasons are somewhat complex, and will require a glance back over part of the history of geology—specifically, the evolution of the labels we use to talk about portions of the past. It’s going to be a bit of a long journey, but bear with me; it matters.
Back in the seventeenth century, when the modern study of geology first got under way, the Book of Genesis was considered to be an accurate account of the Earth’s early history, and so geologists looked for evidence of the flood that plopped Noah’s ark on Mount Ararat. They found it, too, or that’s what people believed at the time. By and large, anywhere you go in western Europe, you’ll be standing on one of three things; the first is rock, the second is an assortment of gravels and compact tills, and the third is soil. With vanishingly few exceptions, where they overlap, the rock is on the bottom, the gravels and tills are in the middle, and the soil is on top. Noting that some of the gravels and tills look like huge versions of the sandbars and other features shaped by moving water, the early geologists decided the middle layer had been left by the Flood—that’s diluvium in Latin—and so the three layers were named Antediluvian (“before the flood”), Diluvian, and Postdiluvian (“after the flood”).
So far, so good—except then they started looking at the Antediluvian layer, and found an assortment of evidence that seemed to imply that really vast amounts of time had passed between different layers of rock. During the early eighteenth century, as this sank in, the Book of Genesis lost its status as a geology textbook, and geologists came up with a new set of four labels: Primary, Secondary, Tertiary, and Quaternary. (These are fancy ways of saying “First, Second, Third, and Fourth,” in case you were wondering.) The Quaternary layer consisted of the former Diluvian and Postdiluvian gravels, tills, and soil; the Tertiary consisted of rocks and fossils that were found under those; the Secondary was the rocks and fossils below that, and the Primary was at the bottom.
It was a good scheme for the time; on the surface of the Earth, if you happen to live in western Europe and walk around a lot, you’ll see very roughly equal amounts of all four layers. What’s more, they  always occur in the order just given.  Where they overlap, the Primary is always under the Secondary, and so on; you never find Secondary rocks under Primary ones, except when the rock layers have obviously been folded by later geological forces. So geologists assigned them to four different periods of time, named after the layers—the Primary Era, the Secondary Era, and so on.
It took quite a bit of further work for geologists to get a handle on how much time was involved in each of these eras, and as the results of that line of research started to become clear, there was a collective gulp loud enough to echo off the Moon. Outside of India and a few Native American civilizations, nobody anywhere had imagined that the history of the Earth might involve not thousands of years, but billions of them. As this sank in, the geologists also realized that their four eras were of absurdly different lengths. The Quaternary was only two million years long; the Tertiary, around sixty-three million years; the Secondary, around one hundred eighty-six million years; and the Primary, from there back to the Earth’s origin, or better than four billion years.
So a new scheme was worked out. The Quaternary era became the Quaternary period, and it’s still the Quaternary today, even though it’s not the fourth of anything any more. The Tertiary also became a period—it later got broken up into the Paleogene and Neogene periods—and the Tertiary (or Paleogene and Neogene) and Quaternary between them made up the Cenozoic (Greek for “recent life”) era. The former Secondary era became the Mesozoic (“middle life”) era, and was divided into three periods; starting with the most recent, these are the Cretaceous, Jurassic, and Triassic. The former Primary era became the Paleozoic (“old life”) era, and was divided into six periods; again, starting with the most recent, these are the Permian, Carboniferous, Devonian, Silurian, Ordovician, and Cambrian. The Cambrian started around 542 million years ago, and everything before then—all three billion years and change—was tossed into the vast dark basement of the Precambrian.
It was a pretty good system, and one of the things that was pretty good about it is that the periods were of very roughly equal length. Thus the Paleozoic had twice as many periods as the Mesozoic, and it lasted around twice as long. The Mesozoic, in turn, had three times as many complete periods as the Cenozoic did (in pre-Paleogene and Neogene days)—the Quaternary has just gotten started, remember—and it’s around three times as long. I don’t know how many of my readers, as children, delighted in the fact that the whole Cenozoic era—the Age of Mammals, as it was often called—could be dropped into the Cretaceous period with room to spare on either end, but I did. I decorated one of my school notebooks with a crisp little drawing of a scoreboard that read DINOSAURS 3, MAMMALS 1. No, nobody else got the joke.
In recent decades, things have been reshuffled a bit more.  The Precambrian basement has been explored in quite some detail, and what used to be deliciously named the Cryptozoic eon has now sadly been broken up into Proterozoic and Archean eons, and divided into periods to boot. We can let that pass, though, because it’s the other end of the time scale that concerns us. Since Cenozoic rock makes up so much of the surface—being the most recently laid down, after all—geologists soon broke up the Tertiary and Quaternary periods into six shorter units, called epochs: from first to last, Eocene, Oligocene, Miocene, Pliocene, Pleistocene, and Holocene. (These are Greek again, and mean “dawn recent, few recent, some recent, many recent, most recent,” and “entirely recent”—the reference is to how many living things in each epoch look like the ones running around today.) Later, the Eocene got chopped in two to yield the Paleocene (“old recent”) and Eocene. Yes, that “-cene” ending—also the first syllable in Cenozoic—is the second half of the label “Anthropocene,” the human-recent.
The thing to keep in mind is that an epoch is a big chunk of time. The six of them that are definitely over with at this point lasted an average of almost eleven million years apiece. (For purposes of comparison, eleven million years is around 2200 times the length of all recorded human history.) The exception is the Holocene, which is only 11,700 years old at present, or only about 0.1% of the average length of an epoch. It makes sense to call the Holocene an epoch, in other words, only if it’s just beginning and still has millions of years to run.
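Readers who want to check that comparison can do so with the standard boundary dates for the Cenozoic epochs. The figures below are the currently accepted ones, in millions of years before present; they’re supplied here for illustration rather than drawn from the text, and the result comes out, as advertised, at just about eleven million years per completed epoch.

```python
# A quick check, using the standard epoch boundary dates (millions of years
# before present); these figures are supplied for illustration, not taken
# from the essay.
epoch_boundaries = {             # epoch: (start, end) in millions of years ago
    "Paleocene":   (66.0, 56.0),
    "Eocene":      (56.0, 33.9),
    "Oligocene":   (33.9, 23.0),
    "Miocene":     (23.0, 5.3),
    "Pliocene":    (5.3, 2.6),
    "Pleistocene": (2.6, 0.0117),
}
lengths_my = [start - end for start, end in epoch_boundaries.values()]
average_my = sum(lengths_my) / len(lengths_my)    # ~11 million years
holocene_years = 11_700

print(f"average completed Cenozoic epoch: {average_my:.1f} million years")
print(f"Holocene so far: {holocene_years / (average_my * 1_000_000):.2%} of that average")
```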
If in fact the Holocene is over and the Anthropocene is under way, though, the Holocene isn’t an epoch at all in any meaningful sense. It’s the tag-end of the Pleistocene, or a transition between the Pleistocene and whichever epoch comes next, whether that be labeled Anthropocene or something else. You can find such transitions between every epoch and the next, every period and the next, and every era and the next. They’re usually quite distinctive, because these different geological divisions aren’t mere abstractions; the change from one to another is right there in the rock strata, usually well marked by sharp changes in a range of markers, including fossils. Some long-vanished species trickle out in the middle of an epoch, to be sure, but one of the things that often marks the end of an epoch, a period, or an era is that a whole mess of extinctions all happen in the transition from one unit of time to the next.
Let’s look at a few examples to sharpen that last point. The Pleistocene epoch was short as epochs go, only a little more than two and a half million years; it was a period of severe global cooling, which is why it’s better known as the ice age; and a number of its typical animals—mammoths, sabertooth tigers, and mastodons in North America, giant ground sloths and glyptodons in South America, cave bears and woolly rhinoceroses in Europe, and so on—went extinct all at once during the short transition period at its end, when the climate warmed abruptly and a wave of invasive generalist predators (i.e., your ancestors and mine) spread through ecosystems that were already in extreme turmoil. That’s a typical end-of-epoch mess.
Periods are bigger than epochs, and the end of a period is accordingly a bigger deal. Let’s take the end of the Triassic as a good example. Back in the day, the whole Mesozoic era routinely got called “the Age of Reptiles,” but until the Triassic ended it was anybody’s guess whether the dinosaurs or the therapsid almost-mammals would end up at the top of the ecological heap. The end-Triassic extinction crisis put an end to the struggle by putting an end to most of the therapsids, along with a lot of other living things. The biggest of the early dinosaurs died off as well, but the smaller ones thrived, and their descendants went on to become the huge and remarkably successful critters of the Jurassic and Cretaceous. That’s a typical end-of-period mess.
Eras are bigger than periods, and they always end with whopping crises. The most recent example, of course, is the end of the Mesozoic era 65 million years ago. Forty per cent of the animal families on the planet, including species that had been around for hundreds of millions of years, died pretty much all at once. (The current theory, well backed up by the data, is that a good-sized comet slammed into what’s now the Yucatan peninsula, and the bulk of the dieoff was over in just a few years.) Was that the worst extinction crisis ever? Not a chance; the end of the Paleozoic 251 million years ago was slower but far more ghastly, with around ninety-five per cent of all species on the casualty list. Some paleontologists, without undue exaggeration, describe the end-Paleozoic crisis as the time Earth nearly died.
So the landscape of time revealed to us by geology shows intervals of relative stability—epochs, periods, and eras—broken up by short transition periods. If you go for a walk in country where the rock formations have been exposed, you can literally see the divisions in front of you: here’s a layer of one kind of rock a foot or two thick, laid down as sediment over millions of years and then compressed into stone over millions more; here’s a thin boundary layer, or simply an abrupt line of change, and above it there’s a different kind of rock, consisting of sediment laid down under different climatic and environmental conditions.
If you’ve got a decent geological laboratory handy and apply the usual tests to a couple of rock samples, one from the middle of an epoch and the other from a boundary layer, the differences are hard to miss. The boundary layer made when the Mesozoic ended and the Cenozoic began is a good example. The Cretaceous-Paleogene boundary layer is spiked with iridium, from space dust brought to earth by the comet; it’s full of carbon from fires that were kindled by the impact over many millions of square miles; and the one trace of life you’ll find is a great many fungal spores—dust blown into the upper atmosphere choked out the sun and left most plants on Earth dead and rotting, with results that rolled right up the food chain to the tyrannosaurs and their kin. You won’t find such anomalies clustering in the rock sample from the middle of the epoch; what you’ll find in nearly every case is evidence of gradual change and ordinary geological processes at work.
Now ask yourself this, dear reader: which of these most resembles the trace that human industrial civilization is in the process of leaving for the rock formations of the far future?
It’s crucial to remember that the drastic geological impacts that have inspired some scientists to make use of the term “Anthropocene” are self-terminating in at least two senses. On the one hand, those impacts are possible because, and only because, our species is busily burning through stores of fossil carbon that took half a billion years for natural processes to stash in the rocks, and ripping through equally finite stores of other nonrenewable resources, some of which took even longer to find their way into the deposits we mine so greedily. On the other hand, by destabilizing the climate and sending cascading disturbances in motion through a good-sized collection of other natural cycles, those impacts are in the process of wrecking the infrastructure that industrial society needs to go its merry way.
Confronted with the tightening vise between accelerating resource depletion and accelerating biosphere disruption, the vast majority of people in the industrial world seem content to insist that they can have their planet and eat it too. The conventional wisdom holds that someone, somewhere, will think of something that will allow us to replace Earth’s rapidly emptying fuel tanks and resource stocks, on the one hand, and stabilize its increasingly violent climatic and ecological cycles, on the other.  That blind faith remains welded in place even as decade after decade slips past, one supposed solution after another fails, and the stark warnings of forty years ago have become the front page news stories of today. Nothing is changing, except that the news just keeps getting worse.
That’s the simple reality of the predicament in which we find ourselves today. Our way of life, here in the world’s industrial nations, guarantees that in the fairly near future, no one anywhere on the planet will be able to live the way we do. As resources run out, alternatives fail, and the destructive impacts of climate change pile up, our ability to influence geological processes will go away, and leave us once more on the receiving end of natural cycles we can do little to change.
A hundred million years from now, as a result, if another intelligent species happens to be around on Earth at that time and takes an interest in geology, its members won’t find a nice thick stratum of rock marked with the signs of human activity, corresponding to an Anthropocene epoch. They’ll find a thin boundary layer, laid down over a few hundred years, and laced with exotic markers: decay products of radioactive isotopes splashed into the atmosphere by twentieth-century nuclear bomb testing and nuclear reactor meltdowns; chemical markers showing a steep upward jolt in atmospheric carbon dioxide; and scattered freely through the layer, micron-thick streaks of odd carbon compounds that are all that’s left of our vast production of plastic trash. That’s our geological legacy: a slightly odd transition layer a quarter of an inch thick, with the usual discontinuity between the species in the rock just below, many of whom vanish at the transition, and the species in the rock just above, who proliferate into empty ecological niches and evolve into new forms.
In place of the misleading label “Anthropocene,” then, I’d like to propose that we call the geological interval we’re now in the Pleistocene-Neocene transition. Neocene? That’s Greek for “new recent,” representing the “new normal” that will emerge when our idiotic maltreatment of the planet that keeps us all alive brings the “old normal” crashing down around our ears. We don’t call the first epoch after the comet impact 65 million years ago the “Cometocene,” so there’s no valid reason to use a label like “Anthropocene” for the epoch that will dawn when the current transition winds down. Industrial civilization’s giddy rise and impending fall are the trigger for the transition, and nothing more; the shape of the Neocene epoch will be determined not by us, but by the ordinary processes of planetary change and evolution.
Those processes have been responding to the end of the so-called Holocene—let’s rename it the Late Pleistocene, given how extremely short it turned out to be—in the usual manner.  Around the world, ice caps are melting, climate belts are shifting, acid-intolerant species in the ocean are being replaced by acid-tolerant ones, and generalist species of animals such as cats, coyotes, and feral pigs are spreading rapidly through increasingly chaotic ecosystems, occupying vacant ecological niches or elbowing less flexible competitors out of the way. By the time the transition winds down a few centuries from now, the species that have been able to adapt to new conditions and spread into new environments will be ready for evolutionary radiation; another half a million years or so, and the Neocene will be stocked with the first preliminary draft of its typical flora and fauna.
It’s entertaining, at least to me, to speculate about what critters will roam the desert sands of Kansas and Nebraska or stalk their prey in the forests of postglacial Greenland. To many of my readers, though, I suspect a more pressing question is whether a certain primate called Homo sapiens will be among the common fauna of the Neocene. I suspect so, though of course none of us can be sure—but giving up on the fantasy that’s embodied in the label “Anthropocene,” the delusion that what our civilization is doing just now is going to keep on long enough to fill a geological epoch, is a good step in the direction of our survival.