Wednesday, January 11, 2017

The Embarrassments of Chronocentrism

It's a curious thing, this attempt of mine to make sense of the future by understanding what’s happened in the past. One of the most curious things about it, at least to me, is the passion with which so many people insist that this isn’t an option at all. In any other context, “Well, what happened the last time someone tried that?” is one of the first and most obviously necessary questions to ask and answer—but heaven help you if you try to raise so straightforward a question about the political, economic, and social phenomena of the present day.

In previous posts here we’ve talked about thoughtstoppers of the “But it’s different this time!” variety, and some of the other means people these days use to protect themselves against the risk of learning anything useful from the hard-earned lessons of the past. This week I want to explore another, subtler method of doing the same thing. As far as I’ve been able to tell, it’s mostly an issue here in the United States, but here it’s played a remarkably pervasive role in convincing people that the only way to open a door marked PULL is to push on it long and hard enough.

It’s going to take a bit of a roundabout journey to make sense of the phenomenon I have in mind, so I’ll have to ask my readers’ forbearance for what will seem at first like several sudden changes of subject.

One of the questions I field tolerably often, when I discuss the societies that will rise after modern industrial civilization finishes its trajectory into history’s compost heap, is whether I think that consciousness evolves. I admit that until fairly recently, I was pretty much at a loss to know how to respond. It rarely took long to find out that the questioner wasn’t thinking about the intriguing theory Julian Jaynes raised in The Origin of Consciousness in the Breakdown of the Bicameral Mind, the Jungian conception Erich Neumann proposed in The Origins and History of Consciousness, or anything of the same kind. Nor, it turned out, was the question usually based on the really rather weird reinterpretations of evolution common in today’s pop-spirituality scene. Rather, it was political.

It took me a certain amount of research, and some puzzled emails to friends more familiar with current left-wing political jargon than I am, to figure out what was behind these questions. Among a good-sized fraction of American leftist circles these days, it turns out it’s become a standard credo that what drives the kind of social changes supported by the left—the abolition of slavery and segregation, the extension of equal (or more than equal) rights to an assortment of disadvantaged groups, and so on—is an ongoing evolution of consciousness, in which people wake up to the fact that things they’ve considered normal and harmless are actually intolerable injustices, and so decide to stop.

Those of my readers who followed the late US presidential election may remember Hillary Clinton’s furious response to a heckler at one of her few speaking gigs:  “We aren’t going back. We’re going forward.” Underlying that outburst is the belief system I’ve just sketched out: the claim that history has a direction, that it moves in a linear fashion from worse to better, and that any given political choice—for example, which of the two most detested people in American public life is going to become the nominal head of a nation in freefall ten days from now—not only can but must be flattened out into a rigidly binary decision between “forward” and “back.”

There’s no shortage of hard questions that could be brought to bear on that way of thinking about history, and we’ll get to a few of them a little later on, but let’s start with the simplest one: does history actually show any such linear movement in terms of social change?

It so happens that I’ve recently finished a round of research bearing on exactly that question, though I wasn’t thinking of politics or the evolution of consciousness when I launched into it. Over the last few years I’ve been working on a sprawling fiction project, a seven-volume epic fantasy titled The Weird of Hali, which takes the horror fantasy of H.P. Lovecraft and stands it on its head, embracing the point of view of the tentacled horrors and multiracial cultists Lovecraft liked to use as images of dread. (The first volume, Innsmouth, is in print in a fine edition and will be out in trade paper this spring, and the second, Kingsport, is available for preorder and will be published later this year.)

One of Lovecraft’s few memorable human characters, the intrepid dream-explorer Randolph Carter, has an important role in the fourth book of my series. According to Lovecraft, Carter was a Boston writer and esthete of the 1920s from a well-to-do family, who had no interest in women but a whole series of intimate (and sometimes live-in) friendships with other men, and decidedly outré tastes in interior decoration—well, I could go on. The short version is that he’s very nearly the perfect archetype of an upper-class gay man of his generation. (Whether Lovecraft intended this is a very interesting question that his biographers don’t really answer.) With an eye toward getting a good working sense of Carter’s background, I talked to a couple of gay friends, who pointed me to some friends of theirs, and that was how I ended up reading George Chauncey’s magisterial Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890-1940.

What Chauncey documents, in great detail and with a wealth of citations from contemporary sources, is that gay men in America had substantially more freedom during the first three decades of the twentieth century than they did for a very long time thereafter. While homosexuality was illegal, the laws against it had more or less the same impact on people’s behavior that the laws against smoking marijuana had in the last few decades of the twentieth century—lots of people did it, that is, and now and then a few of them got busted. Between the beginning of the century and the coming of the Great Depression, in fact, most large American cities had a substantial gay community with its own bars, restaurants, periodicals, entertainment venues, and social events, right out there in public.

Nor did the gay male culture of early twentieth century America conform to current ideas about sexual identity, or the relationship between gay culture and social class, or—well, pretty much anything else, really. A very large number of men who had sex with other men didn’t see that as central to their identity—there were indeed men who embraced what we’d now call a gay identity, but that wasn’t the only game in town by a long shot. What’s more, sex between men was by and large more widely accepted in the working classes than it was further up the social ladder. In turn-of-the-century New York, it was the working class gay men who flaunted the camp mannerisms and the gaudy clothing; upper- and middle-class gay men such as Randolph Carter had to be much more discreet.

So what happened? Did some kind of vast right-wing conspiracy shove the ebullient gay male culture of the early twentieth century into the closet? No, and that’s one of the more elegant ironies of this entire chapter of American cultural history. The crusade against the “lavender menace” (I’m not making that phrase up, by the way) was one of the pet causes of the same Progressive movement responsible for winning women the right to vote and breaking up the fabulously corrupt machine politics of late nineteenth century America. Unpalatable as that fact is in today’s political terms, gay men and lesbians weren’t forced into the closet in the 1930s by the right.  They were driven there by the left.

This is the same Progressive movement, remember, that made Prohibition a central goal of its political agenda, and responded to the total failure of the Prohibition project by refusing to learn the lessons of failure and redirecting its attentions toward banning less popular drugs such as marijuana. That movement was also, by the way, heavily intertwined with what we now call Christian fundamentalism. Some of my readers may have heard of William Jennings Bryan, the supreme orator of the radical left in late nineteenth century America, the man whose “Cross of Gold” speech became the great rallying cry of opposition to the Republican corporate elite in the decades before the First World War.  He was also the prosecuting attorney in the equally famous Scopes Monkey Trial, responsible for pressing charges against a schoolteacher who had dared to affirm in public Darwin’s theory of evolution.

The usual response of people on today’s left to such historical details—well, other than denying or erasing them, which is of course quite common—is to insist that this proves that Bryan et al. were really right-wingers. Not so; again, we’re talking about people who put their political careers on the line to give women the vote and weaken (however temporarily) the grip of corporate money on the US political system. The politics of the Progressive era didn’t assign the same issues to the categories “left” and “right” that today’s politics do, and so all sides in the sprawling political free-for-all of that time embraced some issues that currently belong to the left, others that belong to the right, and still others that have dropped entirely out of the political conversation since then.

I could go on, but let’s veer off in another direction instead. Here’s a question for those of my readers who think they’re well acquainted with American history. The Fifteenth Amendment, which granted the right to vote to all adult men in the United States irrespective of race, was ratified in 1870. Before then, did black men have the right to vote anywhere in the US?

Most people assume as a matter of course that the answer must be no—and they’re wrong. Until the passage of the Fifteenth Amendment, the question of who did and didn’t have voting rights was a matter for each state to decide for itself. Fourteen states either allowed free African-American men to vote in Colonial times or granted them that right when first organized. Later on, ten of them—Delaware in 1792, Kentucky in 1799, Maryland in 1801, New Jersey in 1807, Connecticut in 1814, New York in 1821, Rhode Island in 1822, Tennessee in 1834, North Carolina in 1835, and Pennsylvania in 1838—either denied free black men the vote or raised legal barriers that effectively kept them from voting. Four other states—Massachusetts, Vermont, New Hampshire, and Maine—gave free black men the right to vote in Colonial times and maintained that right until the Fifteenth Amendment made the whole issue moot. Those readers interested in the details can find them in The African American Electorate: A Statistical History by Hanes Walton Jr. et al., which devotes chapter 7 to the subject.

So what happened? Was there a vast right-wing conspiracy to deprive black men of the right to vote? No, and once again we’re deep in irony. The political movements that stripped free American men of African descent of their right to vote were the two great pushes for popular democracy in the early United States, the Democratic-Republican party under Thomas Jefferson and the Democratic party under Andrew Jackson. Read any detailed history of the nineteenth century United States and you’ll learn that before these two movements went to work, each state set a certain minimum level of personal wealth that citizens had to have in order to vote. Both movements forced through reforms in the voting laws, one state at a time, to remove these property requirements and give the right to vote to every adult white man in the state. What you won’t learn, unless you do some serious research, is that in many states these same reforms also stripped adult black men of their right to vote.

Try to explain this to most people on the leftward end of today’s American political spectrum, and you’ll likely end up with a world-class meltdown, because the Jeffersonian Democratic-Republicans and the Jacksonian Democrats, like the Progressive movement, embraced some causes that today’s leftists consider progressive, and others that they consider regressive. The notion that social change is driven by some sort of linear evolution of consciousness, in which people necessarily become “more conscious” (that is to say, conform more closely to the ideology of the contemporary American left) over time, has no room for gay-bashing Progressives and Jacksonian Democrats whose concept of democracy included a strict color bar. The difficulty, of course, is that history is full of Progressives, Jacksonian Democrats, and countless other political movements that can’t be shoehorned into the Procrustean bed of today’s political ideologies.

I could add other examples—how many people remember, for example, that environmental protection was a cause of the far right until the 1960s?—but I think the point has been made. People in the past didn’t divide up the political causes of their time into the same categories left-wing activists like to use today. It’s practically de rigueur for left-wing activists these days to insist that people in the past ought to have seen things in today’s terms rather than the terms of their own time, but that insistence just displays a bad case of chronocentrism.

Chronocentrism? Why, yes.  Most people nowadays are familiar with ethnocentrism, the insistence by members of one ethnic group that the social customs, esthetic notions, moral standards, and so on of that ethnic group are universally applicable, and that anybody who departs from those things is just plain wrong. Chronocentrism is the parallel insistence, on the part of people living in one historical period, that the social customs, esthetic notions, moral standards, and so on of that period are universally applicable, and that people in any other historical period who had different social customs, esthetic notions, moral standards, and so on should have known better.

Chronocentrism is pandemic in our time. Historians have a concept called “Whig history”; it got that moniker from a long line of English historians who belonged to the Whig (that is, Liberal) Party, and who wrote as though all of human history was to be judged according to how well it measured up to the current Liberal Party platform. Such exercises aren’t limited to politics, though; my first exposure to the concept of Whig history came via university courses in the history of science. When I took those courses—this was twenty-five years ago, mind you—historians of science were sharply divided between a majority that judged every scientific activity in every past society on the basis of how well it conformed to our ideas of science, and a minority that tried to point out just how difficult this habit made the already challenging task of understanding the ideas of past thinkers.

To my mind, the minority view in those debates was correct, but at least some of its defenders missed a crucial point. Whig history doesn’t exist to foster understanding of the past.  It exists to justify and support an ideological stance of the present. If the entire history of science is rewritten so that it’s all about how the currently accepted set of scientific theories about the universe rose to their present privileged status, that act of revision makes currently accepted theories look like the inevitable outcome of centuries of progress, rather than jerry-rigged temporary compromises kluged together to cover a mass of recalcitrant data—which, science being what it is, is normally a more accurate description.

In exactly the same sense, the claim that a certain set of social changes in the United States and other industrial countries in recent years results from the “evolution of consciousness,” unfolding on a one-way street from the ignorance of the past to a supposedly enlightened future, doesn’t help make sense of the complicated history of social change. It was never supposed to do that. Rather, it’s an attempt to backstop the legitimacy of a specific set of political agendas here and now by making them look like the inevitable outcome of the march of history. The erasure of the bits of inconvenient history I cited earlier in this essay is part and parcel of that attempt; like all linear schemes of historical change, it falsifies the past and glorifies the future in order to prop up an agenda in the present.

It needs to be remembered in this context that the word “evolution” does not mean “progress.” Evolution is adaptation to changing circumstances, and that’s all it is. When people throw around the phrases “more evolved” and “less evolved,” they’re talking nonsense, or at best engaging in a pseudoscientific way of saying “I like this” and “I don’t like that.” In biology, every organism—you, me, koalas, humpback whales, giant sequoias, pond scum, and all the rest—is equally the product of a few billion years of adaptation to the wildly changing conditions of an unstable planet, with genetic variation shoveling in diversity from one side and natural selection picking and choosing on the other. The habit of using the word “evolution” to mean “progress” is pervasive, and it’s pushed hard by the faith in progress that serves as an ersatz religion in our time, but it’s still wrong.

It’s entirely possible, in fact, to talk about the evolution of political opinion (which is of course what “consciousness” amounts to here) in strictly Darwinian terms. In every society, at every point in its history, groups of people are striving to improve the conditions of their lives by some combination of competition and cooperation with other groups. The causes, issues, and rallying cries that each group uses will vary from time to time as conditions change, and so will the relationships between groups—thus it was to the advantage of affluent liberals of the Progressive era to destroy the thriving gay culture of urban America, just as it was to the advantage of affluent liberals of the late twentieth century to turn around and support the struggle of gay people for civil rights. That’s the way evolution works in the real world, after all.

This sort of thinking doesn’t offer the kind of ideological support that activists of various kinds are used to extracting from various linear schemes of history. On the other hand, that difficulty is more than balanced by a significant benefit, which is that linear predictions inevitably fail, and so by and large do movements based on them. The people who agreed enthusiastically with Hillary Clinton’s insistence that “we aren’t going back, we’re going forward” are still trying to cope with the hard fact that their political agenda will be wandering in the wilderness for at least the next four years. Those who convince themselves that their cause is the wave of the future are constantly being surprised by the embarrassing discovery that waves inevitably break and roll back out to sea. It’s those who remember that history plays no favorites who have a chance of accomplishing their goals.

Wednesday, January 04, 2017

How Not To Write Like An Archdruid

Among the occasional amusements I get from writing these weekly essays are earnest comments from people who want to correct my writing style. I field one of them every month or so, and the latest example came in over the electronic transom in response to last week’s post. Like most of its predecessors, it insisted that there’s only one correct way to write for the internet, trotted out a set of canned rules that supposedly encapsulate this one correct way, and assumed as a matter of course that the only reason I didn’t follow those rules is that I’d somehow managed not to hear about them yet.

The latter point is the one I find most amusing, and also most curious. Maybe I’m naive, but it’s always seemed to me that if I ran across someone who was writing in a style I found unusual, the first thing I’d want to do would be to ask the author why he or she had chosen that stylistic option—because, you know, any writer who knows the first thing about his or her craft chooses the style he or she finds appropriate for any given writing project. I field such questions once in a blue moon, and I’m happy to answer them, because I do indeed have reasons for writing these essays in the style I’ve chosen for them. Yet it’s much more common to get the sort of style policing I’ve referenced above—and when that happens, you can bet your bottom dollar that what’s being pushed is the kind of stilted, choppy, dumbed-down journalistic prose that I’ve deliberately chosen not to write.

I’m going to devote a post to all this, partly because I write what I want to write about, the way I want to write about it, for the benefit of those who enjoy reading it, and those who don’t are encouraged to remember that there are thousands of other blogs out there that they’re welcome to read instead. Partly, though, the occasional thudding of what Giordano Bruno called “the battering rams of infants, the catapults of error, the bombards of the inept, and the lightning flashes, thunder, and great tempests of the ignorant”—now there was a man who could write!—raises issues that are central to the occasional series of essays on education I’ve been posting here.

Accepting other people’s advice on writing is a risky business—and yes, that applies to this blog post as well as any other source of such advice. It’s by no means always true that “those who can, do; those who can’t, teach,” but when we’re talking about unsolicited writing advice on the internet, that’s the way to bet.  Thus it’s not enough for some wannabe instructor to tell you “I’ve taught lots of people” (taught them what?) or “I’ve helped lots of people” (to do what?)—the question you need to ask is what the instructor himself or herself has written and where it’s been published.

The second of those matters as much as the first. It so happens, for example, that a great many of the professors who offer writing courses at American universities publish almost exclusively in the sort of little literary quarterlies that have a circulation in three figures and pay contributors in spare copies. (It’s not coincidental that these days, most of the little literary quarterlies in question are published by university English departments.) There’s nothing at all wrong with that, if you dream of writing the sort of stories, essays, and poetry that populate little literary quarterlies.

If you want to write something else, though, it’s worth knowing that these little quarterlies have their own idiosyncratic literary culture. There was a time when the little magazines were one of the standard stepping stones to a successful writing career, but that time went whistling down the wind decades ago. Nowadays, the little magazines have gone one way, the rest of the publishing world has gone another, and many of the habits the little magazines encourage (or even require) in their writers will guarantee prompt and emphatic rejection slips from most other writing venues.

Different kinds of writing, in other words, have their own literary cultures and stylistic customs. In some cases, those can be roughly systematized in the form of rules. That being the case, is there actually some set of rules that are followed by everything good on the internet?

Er, that would be no. I’m by no means a fan of the internet, all things considered—I publish my essays here because most of the older venues I’d prefer no longer exist—but it does have its virtues, and one of them is the remarkable diversity of style to be found there. If you like stilted, choppy, dumbed-down journalistic prose of the sort my commenter wanted to push on me, why, yes, you can find plenty of it online. You can also find lengthy, well-argued essays written in complex and ornate prose, stream-of-consciousness pieces that out-beat the Beat generation, experimental writing of any number of kinds, and more. Sturgeon’s Law (“90% of everything is crap”) applies here as it does to every other human creation, but there are gems to be found online that range across the spectrum of literary forms and styles. No one set of rules applies.

Thus we can dismiss the antics of the style police out of hand. Let’s go deeper, though. If there’s no one set of rules that internet writing ought to follow, are there different rules for each kind of writing? Or are rules themselves the problem? This is where things get interesting.

One of the consistent mental hiccups of American popular culture is the notion that every spectrum consists solely of its two extremes, with no middle ground permitted, and that bit of paralogic gets applied to writing at least as often as to anything else. Thus you have, on the one hand, the claim that the only way to write well is to figure out what the rules are and follow them with maniacal rigidity; on the other, the claim that the only way to write well is to throw all rules into the trash can and let your inner genius, should you happen to have one of those on hand, spew forth the contents of your consciousness all anyhow onto the page. Partisans of those two viewpoints snipe at one another from behind rhetorical sandbags, and neither one of them ever manages more than a partial victory, because neither approach is particularly useful when it comes to the actual practice of writing.

By and large, when people write according to a rigidly applied set of rules—any rigidly applied set of rules—the result is predictable, formulaic, and trite, and therefore boring. By and large, when people write without paying any attention to rules at all, the result is vague, shapeless, and maundering, and therefore boring. Is there a third option? You bet, and it starts by taking the abandoned middle ground: in this case, learning an appropriate set of rules, and using them as a starting point, but departing from them wherever doing so will improve the piece you’re writing.

The set of rules I recommend, by the way, isn’t meant to produce the sort of flat PowerPoint verbiage my commenter insists on. It’s meant to produce good readable English prose, and the source of guidance I recommend to those who are interested in such things is Strunk and White’s deservedly famous The Elements of Style. Those of my readers who haven’t worked with it, who want to improve their writing, and who’ve glanced over what I’ve published and decided that they might be able to learn something useful from me, could do worse than to read it and apply it to their prose.

A note of some importance belongs here, though. There’s a thing called writer’s block, and it happens when you try to edit while you’re writing. I’ve read, though I’ve misplaced the reference, that neurologists have found that the part of the brain that edits and the part of the brain that creates are not only different, they conflict with one another.  If you try to use both of them at once, your brain freezes up in a fairly close neurological equivalent of the Blue Screen of Death, and you stop being able to write at all. That’s writer’s block. To avoid it, NEVER EDIT WHILE YOU’RE WRITING. 

I mean that quite literally. Don’t even look at the screen if you can’t resist the temptation to second-guess the writing process. If you have to, turn the screen off, so you can’t even see what you’re writing. Eventually, with practice, you’ll learn to move smoothly back and forth between creative mode and editing mode, but if you don’t have a lot of experience writing, leave that for later. For now, just blurt it all out without a second thought, with all its misspellings and garbled grammar intact.

Then, after at least a few hours—or better yet, after a day or so—go back over the mess, cutting, pasting, adding, and deleting as needed, until you’ve turned it into nice clean text that says what you want it to say. Yes, we used to do that back before computers; the process is called “cut and paste” because it was done back then with a pair of scissors and a pot of paste, the kind with a little spatula mounted on the inside of the lid to help you spread the stuff; you’d cut out the good slices of raw prose and stick them onto a convenient sheet of paper, interspersed with handwritten or freshly typed additions. Then you sat down and typed your clean copy from the pasted-up mess thus produced. Now you know how to do it when the internet finally dries up and blows away. (You’re welcome.)

In the same way, you don’t try to write while looking up rules in Strunk & White. Write your piece, set it aside for a while, and then go over it with your well-worn copy of Strunk & White in hand, noting every place you broke one of the rules of style the book suggests you should follow. The first few times, as a learning exercise, you might consider rewriting the whole thing in accordance with those rules—but only the first few times. After that, make your own judgment call: is this a place where you should follow the rules, or is this a place where they need to be bent, broken, or trampled into the dust? Only you, dear reader-turned-writer, can decide.

A second important note deserves to be inserted at this point, though. The contemporary US public school system can be described without too much inaccuracy as a vast mechanism for convincing children that they can’t write. Rigid rules imposed for the convenience of educators rather than the good of the students, part of the industrial mass-production ethos that pervades public schools in this country, leave a great many graduates so bullied, beaten, and bewildered by bad pedagogy that the thought of writing something for anybody else to read makes them turn gray with fear. It’s almost as bad as the terror of public speaking the public schools also go out of their way to inflict, and it plays a comparable role in crippling people’s capacity to communicate outside their narrow circles of friends.

If you suffer from that sort of educational hangover, dear reader, draw a deep breath and relax. The bad grades and nasty little comments in red ink you got from Mrs. Melba McNitpick, your high school English teacher, are no reflection of your actual capacities as a writer. If you can talk, you can write—it’s the same language, after all. For that matter, even if you can’t talk, you may be able to write—there’s a fair number of people out there who are nonverbal for one reason or another, and can still make a keyboard dance.

The reason I mention this here is that the thought of making an independent judgment about when to follow the rules and when to break them fills a great many survivors of American public schools with dread. In far too many cases, students are either expected to follow the rules with mindless obedience and given bad grades if they fail to do so, or given no rules at all and then expected to conform to unstated expectations they have no way to figure out, and either of these forms of bad pedagogy leaves scars. Again, readers who are in this situation should draw a deep breath and relax; having left Mrs. McNitpick’s class, you’re not subject to her opinions any longer, and should ignore them utterly.

So how do you decide where to follow the rules and where to fold, spindle, and mutilate them? That’s where we walk through the walls and into the fire, because what guides you in your decisions regarding the rules of English prose is the factor of literary taste.

Rules can be taught, but taste can only be learned. Does that sound like a paradox? Au contraire, it simply makes the point that only you can learn, refine, and ripen your literary taste—nobody else can do it for you, or even help you to any significant extent—and your sense of taste is therefore going to be irreducibly personal. When it comes to taste, you aren’t answerable to Mrs. McNitpick, to me, to random prose trolls on the internet, or to anyone else. What’s more, you develop your taste for prose the same way you develop your taste for food: by trying lots of different things, figuring out what you like, and paying close attention to what you like, why you like it, and what differentiates it from the things you don’t like as much.

This is applicable, by the way, to every kind of writing, including those kinds at which the snobs among us turn up their well-sharpened noses. I don’t happen to be a fan of the kind of satirical gay pornography that Chuck Tingle has made famous, for example, but friends of mine who are tell me that in that genre, as in all others, there are books that are well written, books that are tolerable, and books that trip over certain overelongated portions of their anatomy and land face first in—well, let’s not go there, shall we? In the same way, if your idea of a good read is nineteenth-century French comedies of manners, you can find a similar spectrum extending from brilliance to bathos.

Every inveterate reader takes in a certain amount of what I call popcorn reading—the sort of thing that’s read once, mildly enjoyed, and then returned to the library, the paperback exchange, or whatever electronic Elysium e-books enter when you hit the delete button. That’s as inevitable as it is harmless. The texts that matter in developing your personal taste, though, are the ones you read more than once, and especially the ones you read over and over again. As you read these for the third or the thirty-third time, step back now and then from the flow of the story or the development of the argument, and notice how the writer uses language. Learn to notice the really well-turned phrases, the figures of speech that are so apt and unexpected that they seize your attention, the moments of humor, the plays on words, the  passages that match tone and pacing to the subject perfectly.

If you’ve got a particular genre in mind—no, let’s stop for a moment and talk about genre, shall we? Those of my readers who endured a normal public school education here in the US probably don’t know that this is pronounced ZHON-ruh (it’s a French word) and it simply means a category of writing. Satirical gay pornography is a genre. The comedy of manners is a genre. The serious contemporary literary novel is a genre.  So are mysteries, romance, science fiction, fantasy, and the list goes on. There are also nonfiction genres—for example, future-oriented social criticism, the genre in which nine of my books from The Long Descent to Dark Age America have their place. Each genre is an answer to the question, “I just read this and I liked it—where can I find something else more or less like it?”

Every genre has its own habits and taboos, and if you want to write for publication, you need to know what those are. That doesn’t mean you have to follow those habits and taboos with the kind of rigid obedience critiqued above—quite the contrary—but you need to know about them, so that when you break the rules you do it deliberately and skillfully, to get the results you want, rather than clumsily, because you didn’t know any better. It also helps to read the classics of the genre—the books that established those habits and taboos—and then go back and read books in the genre written before the classics, to get a sense of what possibilities got misplaced when the classics established the frame through which all later works in that genre would be read.

If you want to write epic fantasy, for example, don’t you dare stop with Tolkien—it’s because so many people stopped with Tolkien that we’ve got so many dreary rehashes of something that was brilliantly innovative in 1949, complete with carbon-copy Dark Lords cackling in chorus and the inevitable and unendearing quest to do something with the Magic McGuffin that alone can save blah blah blah. Read the stuff that influenced Tolkien—William Morris, E.R. Eddison, the Norse sagas, the Kalevala, Beowulf.  Then read something in the way of heroic epic that he probably didn’t get around to reading—the Ramayana, the Heike Monogatari, the Popol Vuh, or what have you—and think through what those have to say about the broader genre of heroic wonder tale in which epic fantasy has its place.

The point of this, by the way, isn’t to copy any of these things. It’s to develop your own sense of taste so that you can shape your own prose accordingly. Your goal, if you’re at all serious about writing, isn’t to write like Mrs. McNitpick, like your favorite author of satirical gay pornography or nineteenth-century French comedies of manners, or like me, but to write like yourself.

And that, to extend the same point more broadly, is the goal of any education worth the name. The word “education” itself comes from the Latin word educatio, from ex-ducere, “to lead out or bring out;” it’s about leading or bringing out the undeveloped potentials that exist inside the student, not shoving some indigestible bolus of canned information or technique down the student’s throat. In writing as in all other things that can be learned, that process of bringing out those undeveloped potentials requires the support of rules and examples, but those are means to an end, not ends in themselves—and it’s in the space between the rules and their inevitable exceptions, between the extremes of rigid formalism and shapeless vagueness, that the work of creation takes place.

That’s also true of politics, by the way—and the conventional wisdom of our time fills the same role there that the rules for bad internet prose do for writing. Before we can explore that, though, it’s going to be necessary to take on one of the more pervasive bad habits of contemporary thinking about the relationship between the present and the past. We’ll tackle that next week.

********************
In not wholly unrelated news, I’m pleased to announce that Merigan Tales, the anthology of short stories written by Archdruid Report readers set in the world of my novel Star’s Reach, is now in print and available for purchase from Founders House. Those of my readers who enjoyed Star’s Reach and the After Oil anthologies won’t want to miss it.

Wednesday, December 28, 2016

A Leap in the Dark

A few days from now, 2016 will have passed into the history books. I know a fair number of people who won’t mourn its departure, but it’s pretty much a given that the New Year celebrations here in the United States, at least, will demonstrate a marked shortage of enthusiasm for the arrival of 2017.

There’s good reason for that, and not just for the bedraggled supporters of Hillary Clinton’s failed and feckless presidential ambitions. None of the pressures that made 2016 a cratered landscape of failed hopes and realized nightmares have gone away. Indeed, many of them are accelerating, as the attempt to maintain a failed model of business as usual in the teeth of political, economic, and environmental realities piles blowback upon blowback onto the loading dock of the new year.

Before we get into that, though, I want to continue the annual Archdruid Report tradition and review the New Year’s predictions that I made at the beginning of 2016. Those of my readers who want to review the original post will find it here. Here’s the gist.

“Thus my core prediction for 2016 is that all the things that got worse in 2015 will keep on getting worse over the year to come. The ongoing depletion of fossil fuels and other nonrenewable resources will keep squeezing the global economy, as the real (i.e., nonfinancial) costs of resource extraction eat up more and more of the world’s total economic output, and this will drive drastic swings in the price of energy and commodities—currently those are still headed down, but they’ll soar again in a few years as demand destruction completes its work. The empty words in Paris a few weeks ago will do nothing to slow the rate at which greenhouse gases are dumped into the atmosphere, raising the economic and human cost of climate-related disasters above 2015’s ghastly totals—and once again, the hard fact that leaving carbon in the ground means giving up the lifestyles that depend on digging it up and burning it is not something that more than a few people will be willing to face.

“Meanwhile, the US economy will continue to sputter and stumble as politicians and financiers try to make up for ongoing declines in real (i.e., nonfinancial) wealth by manufacturing paper wealth at an even more preposterous pace than before, and frantic jerryrigging will keep the stock market from reflecting the actual, increasingly dismal state of the economy.  We’re already in a steep economic downturn, and it’s going to get worse over the year to come, but you won’t find out about that from the mainstream media, which will be full of the usual fact-free cheerleading; you’ll have to watch the rates at which the people you know are being laid off and businesses are shutting their doors instead.” 

It’s almost superfluous to point out that I called it. It’s been noted with much irritation by other bloggers in what’s left of the peak oil blogosphere that it takes no great talent to notice what’s going wrong, and point out that it’s just going to keep on heading the same direction. This I cheerfully admit—but it’s also relevant to note that this method produces accurate predictions. Meanwhile, the world-saving energy breakthroughs, global changes in consciousness, sudden total economic collapses, and other events that get predicted elsewhere year after weary year have been notable by their absence.

I quite understand why it’s still popular to predict these things: after all, they allow people to pretend that they can expect some future other than the one they’re making day after day by their own actions. Nonetheless, the old saying remains true—“if you always do what you’ve always done, you’ll always get what you’ve always gotten”—and I wonder how many of the people who spend each year daydreaming about the energy breakthroughs, changes in consciousness, economic collapses, et al, rather than coming to grips with the rising spiral of crises facing industrial civilization, really want to deal with the future that they’re storing up for themselves by indulging in this habit.

Let’s go on, though.  At the beginning of 2016, I also made four specific predictions, which I admitted at the time were long shots. One of those, specific prediction #3, was that the most likely outcome of the 2016 presidential election would be the inauguration of Donald Trump as President in January 2017. I don’t think I need to say much about that, as it’s already been discussed here at length.  The only thing I’d like to point out here is that much of the Democratic party seems to be fixated on finding someone or something to blame for the debacle, other than the stark incompetence of the Clinton campaign and the failure of Democrats generally to pay attention to anything outside the self-referential echo chambers of affluent liberal opinion. If they keep it up, it’s pretty much a given that Trump will win reelection in 2020.

The other three specific long-shot predictions didn’t pan out, at least not in the way that I anticipated, and it’s only fair—and may be helpful, as we head further into the unknown territory we call 2017—to talk about what didn’t happen, and why.

Specific prediction #1 was that the next tech bust would be under way by the end of 2016.  That’s happening, but not in the way I expected. Back in January I was looking at the maniacally overinflated stock prices of tech companies that have never made a cent in profit and have no meaningful plans to do so, and I expected a repeat of the “tech wreck” of 2000. The difficulty was simply that I didn’t take into account the most important economic shift between 2000 and 2016—the de facto policy of negative interest rates being pursued by the Federal Reserve and certain other central banks.

That policy’s going to get a post of its own one of these days, because it marks the arrival of a basic transformation in economic realities that’s as incomprehensible to neoliberal economists as it will be challenging to most of the rest of us. The point I want to discuss here, though, is a much simpler one. Whenever real interest rates are below zero, those elite borrowers who can get access to money on those terms are being paid to borrow.  Among many other things, this makes it a lot easier to stretch out the downward arc of a failing industry. Cheaper-than-free money is one of the main things that kept the fracking industry from crashing and burning from its own unprofitability once the price of oil plunged in 2014; there’s been a steady string of bankruptcies in the fracking industry and the production of oil from fracked wells has dropped steadily, but it wasn’t the crash many of us expected.
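
To put a rough number on what “being paid to borrow” means (a back-of-envelope illustration with made-up figures, not anything drawn from actual Federal Reserve data), the real interest rate is approximately the nominal rate minus the inflation rate:

$$ r_{\text{real}} \approx r_{\text{nominal}} - \pi $$

Borrow $100 at a nominal 1% while inflation runs at 2%, and the $101 you repay a year later is worth only about $99 in today’s purchasing power; in real terms, the lender has paid you roughly a dollar for the privilege of lending to you.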

The same thing is happening, in equally slow motion, with the current tech bubble. Real estate prices in San Francisco and other tech hotspots are sliding, overpaid tech employees are being systematically replaced by underpaid foreign workers, the numbers are looking uglier by the week, but the sudden flight of investment money that made the “tech wreck” so colorful sixteen years ago isn’t happening, because tech firms can draw on oceans of relatively cheap funding to turn the sudden popping of the tech bubble into the slow hiss of escaping air. That doesn’t mean that the boom-and-bust cycle has been cancelled—far from it—but it does mean that shoveling bad money after good has just become a lot easier. Exactly how that will impact the economy is a very interesting question that nobody just now knows how to answer.

Let’s move on.  Specific prediction #2 was that the marketing of what would inevitably be called “the PV revolution” would get going in a big way in 2016. Those of my readers who’ve been watching the peak oil scene for more than a few years know that ever since the concept of peak oil clawed its way back out of its long exile in the wilderness of the modern imagination, one energy source after another has been trotted out as the reason du jour why the absurdly extravagant lifestyles of today’s privileged classes can roll unhindered into the future.  I figured, based on the way that people in the mainstream environmentalist movement were closing ranks around renewables, that photovoltaic solar energy would be the next beneficiary of that process, and would take off in a big way as the year proceeded.

That this didn’t happen is not the fault of the solar PV industry or its cheerleaders in the green media. Naomi Oreskes’ strident insistence a while back that raising questions about the economic viability of renewable energy is just another form of climate denialism seems to have become the party line throughout the privileged end of the green left, and the industrialists are following suit. Elon Musk, whose entire industrial empire has been built on lavish federal subsidies, is back at the feed trough again, announcing a grandiose new plan to manufacture photovoltaic roof shingles; he’s far and away the most colorful of the would-be renewable-energy magnates, but others are elbowing their way toward the trough as well, seeking their own share of the spoils.

The difficulty here is twofold. First, the self-referential cluelessness of the Democratic party since the 2008 election has had the inevitable blowback—something like 1000 state and federal elective offices held by Democrats after that election are held by Republicans today—and the GOP’s traditional hostility toward renewable energy has put a lid on the increased subsidies that would have been needed to kick a solar PV feeding frenzy into the same kind of overdrive we’ve already seen with ethanol and wind. Solar photovoltaic power, like ethanol from corn, has a disastrously low energy return on energy invested—as Pedro Prieto and Charles Hall showed in their 2015 study of real-world data from Spain’s solar PV program, the EROEI on large-scale grid photovoltaic power works out in practice to less than 2.5—and so, like nuclear power, it’s only economically viable if it’s propped up by massive and continuing subsidies. Lacking those, the “PV revolution” is dead in the water.
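
For readers who want to see why an EROEI below 2.5 counts as disastrously low, a quick worked equation helps; the arithmetic below is my own back-of-envelope illustration, not a figure taken from Prieto and Hall. The fraction of gross output left over for society, once the energy spent building and running the system has been paid back, is

$$ \text{net fraction} = \frac{E_{\text{out}} - E_{\text{in}}}{E_{\text{out}}} = 1 - \frac{1}{\text{EROEI}} $$

At an EROEI of 2.5 that works out to 1 - 1/2.5 = 0.6, meaning fully 40% of everything the panels ever generate goes merely to repaying their own energy cost before a single watt does anything useful for the wider economy; a source with an EROEI of 25 would carry an overhead of only 4%.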

The second point, though, is the more damaging.  The “recovery” after the 2008-2009 real estate crash was little more than an artifact of statistical manipulation, and even negative interest rates haven’t been able to get a heartbeat going in the economy’s prostrate body. As most economic measurements not subject to fiddling by the enthusiastic accountants of the federal government slide steadily downhill, the economic surplus needed to support any kind of renewables buildout at all is rapidly trickling away. Demand destruction is in the driver’s seat, and the one way of decreasing fossil fuel consumption that affluent environmentalists don’t want to talk about—conservation—is the only viable option just now.

Specific prediction #4 was that the Saudi regime in Arabia would collapse by the end of 2016. As I noted at the time, the replacement of the Saudi monarchy with some other form of government is for all practical purposes a done deal. Of the factors I cited then—the impending bankruptcy of a regime that survives only by buying off dissent with oil money, the military quagmires in Yemen, Syria, and Iraq that have the Saudi military and its foreign mercenaries bogged down inextricably, and the rest of it—none have gone away. Nor has the underlying cause, the ongoing depletion of the once-immense oil reserves that have propped up the Saudi state so far.

That said, as I noted back in January, it’s anyone’s guess what cascade of events will send the Saudi royal family fleeing to refuges overseas while mobs rampage through their abandoned palaces in Riyadh, and some combination of mid-level military officers and Muslim clerics piece together a provisional government in their absence. I thought that it was entirely possible that this would happen in 2016, and of course it didn’t. It’s possible at this point that the price of oil could rise fast enough to give the Saudi regime another lease on life, however brief. That said, the winds are changing across the Middle East; the Russian-Iranian alliance is in the ascendant, and the Saudis have very few options left. It will be interesting, in the sense of the apocryphal Chinese curse, to see how long they survive.

So that’s where we stand, as 2016 stumbles down the ramp into time’s slaughterhouse and 2017 prepares to take its place in the ragged pastures of history. What can we expect in the year ahead?

To some extent, I’ve already answered that question—but only to some extent. Most of the factors that drove events in 2016 are still in place, still pressing in the same direction, and “more of the same” is a fair description of the consequences. Day after day, the remaining fossil fuel reserves of a finite planet are being drawn down to maintain the extravagant and unsustainable lifestyles of the industrial world’s more privileged inmates. Those remaining reserves are increasingly dirty, increasingly costly to extract and process, increasingly laden with a witch’s brew of social, economic, and environmental costs that nobody anywhere is willing to make the fossil fuel industry cover, and those costs don’t go away just because they’re being ignored—they pile up in society, the economy, and the biosphere, producing the rising tide of systemic dysfunction that plays so large and unmentioned a role in daily life today.

Thus we can expect still more social turmoil, more economic instability, and more environmental blowback in 2017. The ferocious populist backlash against the economic status quo that stunned the affluent in Britain and America with the Brexit vote and Trump’s presidential victory, respectively, isn’t going away until and unless the valid grievances of the working classes get heard and addressed by political establishments around the industrial world; to judge by examples so far, that’s unlikely to happen any time soon. At the same time, the mismatch between the lifestyles we can afford and the lifestyles that too many of us want to preserve remains immense, and until that changes, the global economy is going to keep on lurching from one crisis to another. Meanwhile the biosphere is responding to the many perturbations imposed on it by human stupidity in the way that systems theory predicts—with ponderous but implacable shifts toward new conditions, many of which don’t augur well for the survival of industrial society.

There are wild cards in the deck, though, and one of them is being played right now over the North Pole. As I write this, air temperatures over the Arctic ice cap are 50°F warmer than usual for this time of year. A destabilized jet stream is sucking masses of warm air north into the Arctic skies, while pushing masses of Arctic air down into the temperate zone. As a result, winter ice formation on the surface of the Arctic Ocean has dropped to levels that were apparently last seen before our species got around to evolving—and a real possibility exists, though it’s by no means a certainty yet, that next summer could see most of the Arctic Ocean free of ice.

Nobody knows what that will do to the global climate. The climatologists who’ve been trying to model the diabolically complex series of cascading feedback loops we call “global climate” have no clue—they have theories and computer models, but so far their ability to predict the rate and consequences of anthropogenic climate change has not exactly been impressive. (For what it’s worth, by the way, most of their computer models have turned out to be far too conservative in their predictions.) Nobody knows yet whether the soaring temperatures over the North Pole this winter are a fluke, a transitory phenomenon driven by the unruly transition between one climate regime and another, or the beginning of a recurring pattern that will restore the north coast of Canada to the conditions it had during the Miocene, when crocodiles sunned themselves on the warm beaches of northern Greenland. We simply don’t know.

In the same way, the populist backlash mentioned above is a wild card whose effects nobody can predict just now. The neoliberal economics that have been welded into place in the industrial world for the last thirty years have failed comprehensively, that’s clear enough.  The abolition of barriers to the flow of goods, capital, and population did not bring the global prosperity that neoliberal economists promised, and now the bill is coming due. The question is what the unraveling of the neoliberal system means for national economies in the years ahead.

There are people—granted, these are mostly neoliberal economists and those who’ve drunk rather too freely of the neoliberal koolaid—who insist that the abandonment of the neoliberal project will inevitably mean economic stagnation and contraction. There are those who insist that the abandonment of the neoliberal project will inevitably mean a return to relative prosperity here in the US, as offshored jobs are forced back stateside by tax policies that penalize imports, and the US balance of trade reverts to something a little closer to parity. The fact of the matter is that nobody knows what the results will be. Here as in Britain, voters faced with a choice between the perpetuation of an intolerable status quo and a leap in the dark chose the latter, and the consequences of that leap can’t be known in advance.

Other examples abound. The US president-elect has claimed repeatedly that the US under his lead will get out of the regime-change business and pursue a less monomaniacally militaristic foreign policy than the one it’s pursued under Bush and Obama, and would have pursued under Clinton. The end of the US neoconservative consensus is a huge change that will send shockwaves through the global political system. Another change, at least as huge, is the rise of Russia as a major player in the Middle East. Another? The remilitarization of Japan and its increasingly forceful pursuit of political and military alliances in East and South Asia. There are others. The familiar order of global politics is changing fast. What will the outcome be? Nobody knows.

As 2017 dawns, in a great many ways, modern industrial civilization has flung itself forward into a darkness where no stars offer guidance and no echoes tell what lies ahead. I suspect that when we look back at the end of this year, the predictable unfolding of ongoing trends will have to be weighed against sudden discontinuities that nobody anywhere saw coming.  We’re not discussing the end of the world, of course; we’re talking events like those that can be found repeated many times in the histories of other failing civilizations.  That said, my guess is that some of those discontinuities are going to be harsh ones.  Those who brace themselves for serious trouble and reduce their vulnerabilities to a brittle and dysfunctional system will be more likely to come through in one piece.

Those who are about to celebrate the end of 2016, in other words, might want to moderate their cheering when it’s over. It’s entirely possible that 2017 will turn out to be rather worse—despite which I hope that the readers of this blog, and the people they care about, will manage to have a happy New Year anyway.

Wednesday, December 21, 2016

A Season of Consequences

One of the many advantages of being a Druid is that you get to open your holiday presents four days early. The winter solstice—Alban Arthuan, to use one term for it in the old-fashioned Druid Revival traditions I practice—is one of the four main holy days of the Druid year. Though the actual moment of solstice wobbles across a narrow wedge of the calendar, the celebration traditionally takes place on December 21.  Yes, Druids give each other presents, hang up decorations, and enjoy as sumptuous a meal as resources permit, to celebrate the rekindling of light and hope in the season of darkness.

Come to think of it, I’m far from sure why more people who don’t practice the Christian faith still celebrate Christmas, rather than the solstice. It’s by no means necessary to believe in the Druid gods and goddesses to find the solstice relevant; a simple faith in orbital inclination is sufficient reason for the season, after all—and since a good many Christians in America these days are less than happy about what’s been done to their holy day, it seems to me that it would be polite to leave Christmas to them, have our celebrations four days earlier, and cover their shifts at work on December 25th in exchange for their covering ours on the 21st. (Back before my writing career got going, when I worked in nursing homes to pay the bills, my Christian coworkers and I did this as a matter of course; we also swapped shifts around Easter and the spring equinox. Religious pluralism has its benefits.)

Those of my readers who don’t happen to be Druids, but who are tempted by the prospect just sketched out, will want to be aware of a couple of details. For one thing, you won’t catch Druids killing a tree in order to stick it in their living room for a few weeks as a portable ornament stand and fire hazard. Druids think there should be more trees in the world, not fewer! A live tree or, if you must, an artificial one, would be a workable option, but a lot of Druids simply skip the tree altogether and hang ornaments on the mantel, or what have you.

Oh, and most of us don’t do Santa Claus. I’m not sure why Santa Claus is popular among Christians, for that matter, or among anyone else who isn’t a devout believer in the ersatz religion of Consumerism—which admittedly has no shortage of devotees just now. There was a time when Santa hadn’t yet been turned into a poorly paid marketing consultant to the toy industry; go back several centuries, and he was the Christian figure of St. Nicholas; and before then he may have been something considerably stranger. To those who know their way around the traditions of Siberian shamanism, certainly, the conjunction of flying reindeer and an outfit colored like the famous and perilous hallucinogenic mushroom Amanita muscaria is at least suggestive.

Still, whether he takes the form of salesman, saint, or magic mushroom, Druids tend to give the guy in the red outfit a pass. Solstice symbolism varies from one tradition of Druidry to another—like almost everything else among Druids—but in the tradition I practice, each of the Alban Gates (the solstices and equinoxes) has its own sacred animal, and the animal that corresponds to Alban Arthuan is the bear. If by some bizarre concatenation of circumstances Druidry ever became a large enough faith in America to attract the attention of the crazed marketing minions of consumerdom, you’d doubtless see Hallmark solstice cards for sale with sappy-looking cartoon bears on them, bear-themed decorations in windows, bear ornaments to hang from the mantel, and the like.

While I could do without the sappy-looking cartoons, I definitely see the point of bears as an emblem of the winter solstice, because there’s something about them that too often gets left out of the symbolism of Christmas and the like—though it used to be there, and relatively important, too. Bears are cute, no question; they’re warm and furry and cuddlesome, too; but they’re also, ahem, carnivores, and every so often, when people get sufficiently stupid in the vicinity of bears, the bears kill and eat them.

That is to say, bears remind us that actions have consequences.

I’m old enough that I still remember the days when the folk mythology surrounding Santa Claus had not quite shed the last traces of a similar reminder. According to the accounts of Santa I learned as a child, naughty little children ran a serious risk of waking up Christmas morning to find no presents at all, and a sorry little lump of coal in their stockings in place of the goodies they expected. I don’t recall any of my playmates having that happen to them, and it never happened to me—though I arguably deserved it rather more than once—but every child I knew took it seriously, and tried to moderate their misbehavior at least a little during the period after Thanksgiving. That detail of the legend may still survive here and there, for all I know, but you wouldn’t know it from the way the big guy in red is retailed by the media these days.

For that matter, the version I learned was a pale shadow of a far more unnerving original. In many parts of Europe, when St. Nicholas does the rounds, he’s accompanied by a frightening figure with various names and forms. In parts of Germany, Switzerland, and Austria, it’s Krampus—a hairy devil with goat’s horns and a long lolling tongue, who prances around with a birch switch in his hand and a wicker basket on his back. While the saint hands out presents to good children, Krampus is there for the benefit of the others; small-time junior malefactors can expect a thrashing with the birch switch, while the legend has it that the shrieking, spoiled little horrors at the far end of the naughty-child spectrum get popped into the wicker basket and taken away, and nobody ever hears from them again.

Yes, I know, that sort of thing’s unthinkable in today’s America, and I have no idea whether anyone still takes it with any degree of seriousness over in Europe. Those of my readers who find the entire concept intolerable, though, may want to stop for a moment and think about the context in which that bit of folk tradition emerged. Before fossil fuels gave the world’s industrial nations the temporary spate of abundance that they now enjoy, the coming of winter in the northern temperate zone was a serious matter. The other three seasons had to be full of hard work and careful husbandry, if you were going to have any particular likelihood of seeing spring before you starved or froze to death.

By the time the solstice came around, you had a tolerably good idea just how tight things were going to be by the time spring arrived and the first wild edibles showed up to pad out the larder a bit. The first pale gleam of dawn after the long solstice night was a welcome reminder that spring was indeed on its way, and so you took whatever stored food you could spare, if you could spare any at all, and turned it into a high-calorie, high-nutrient feast, to provide warm memories and a little additional nourishment for the bleak months immediately ahead.

In those days, remember, children who refused to carry their share of the household economy might indeed expect to be taken away and never be heard from again, though the taking away would normally be done by some combination of hunger, cold, and sickness, rather than a horned and hairy devil with a lolling tongue. Of course a great many children died anyway.  A failed harvest, a longer than usual winter, an epidemic, or the ordinary hazards of life in a nonindustrial society quite regularly put a burst of small graves in the nearest churchyard. It was nonetheless true that good children, meaning here those who paid attention, learned fast, worked hard, and did their best to help keep the household running smoothly, really did have a better shot at survival.

One of the most destructive consequences of the age of temporary abundance that fossil fuels gave to the world’s industrial nations, in turn, is the widespread conviction that consequences don’t matter—that it’s unreasonable, even unfair, to expect anyone to have to deal with the blowback from their own choices. That’s a pervasive notion these days, and its effects show up in an astonishing array of contexts throughout contemporary culture, but yes, it’s particularly apparent when it comes to the way children get raised in the United States these days.

The interesting thing here is that the children aren’t necessarily happy about that. If you’ve ever watched a child systematically misbehave in an attempt to get a parent to react, you already know that kids by and large want to know where the limits are. It’s the adults who want to give tests and then demand that nobody be allowed to fail them, who insist that everybody has to get an equal share of the goodies no matter how much or little they’ve done to earn them, and so on through the whole litany of attempts to erase the reality that actions have consequences.

That erasure goes very deep. Have you noticed, for example, that year after year, at least here in the United States, the Halloween monsters on public display get less and less frightening? These days, far more often than not, the ghosts and witches, vampires and Frankenstein’s monsters splashed over Hallmark cards and window displays in the late October monster ghetto have big goofy grins and big soft eyes. The wholesome primal terrors that made each of these things iconic in the first place—the presence of the unquiet dead, the threat of wicked magic, the ghastly vision of walking corpses, whether risen from the grave to drink your blood or reassembled and reanimated by science run amok—are denied to children, and saccharine simulacra are propped up in their places.

Here again, children aren’t necessarily happy about that. The bizarre modern recrudescence of the Victorian notion that children are innocent little angels tells me, if nothing else, that most adults must go very far out of their way to forget their own childhoods. Children aren’t innocent little angels; they’re fierce little animals, which is of course exactly what they should be, and they need roughly the same blend of gentleness and discipline that wolves use on their pups to teach them to moderate their fierceness and live in relative amity with the other members of the pack.  Being fierce, they like to be scared a little from time to time; that’s why they like to tell each other ghost stories, the more ghoulish the better, and why they run with lolling tongues toward anything that promises them a little vicarious blood and gore. The early twentieth century humorist Ogden Nash nailed it when he titled one of his poems “Don’t Cry, Darling, It’s Blood All Right.”

Traditional fairy tales delighted countless generations of children for three good and sufficient reasons. First of all, they’re packed full of wonderful events. Second, they’re positively dripping with gore, which as already noted is an instant attraction to any self-respecting child. Third, they’ve got a moral—which means, again, that they are about consequences. The selfish, cruel, and stupid characters don’t get patted on the head, given the same prize as everyone else, and shielded from the results of their selfishness, cruelty, and stupidity; instead, they get gobbled up by monsters, turned to stone by witches’ curses, or subjected to some other suitably grisly doom. It’s the characters who are honest, brave, and kind who go on to become King or Queen of Everywhere.

Such things are utterly unacceptable, according to the approved child-rearing notions of our day.  Ask why this should be the case and you can count on being told that expecting a child to have to deal with the consequences of its actions decreases its self-esteem. No doubt that’s true, but this is another of those many cases where people in our society manage not to notice that the opposite of one bad thing is usually another bad thing. Is there such a thing as too little self-esteem? Of course—but there is also such a thing as too much self-esteem. In fact, we have a common and convenient English word for somebody who has too much self-esteem. That word is “jerk.”

The cult of self-esteem in contemporary pop psychology has thus produced a bumper crop of jerks in today’s America. I’m thinking here, among many other examples, of the woman who made the news a little while back by strolling right past the boarding desk at an airport, going down the ramp, and taking her seat on the airplane ahead of all the other passengers, just because she felt she was entitled to do so. When the cabin crew asked her to leave and wait her turn like everyone else, she ignored them; security was called, and she ignored them, too. They finally had to drag her down the aisle and up the ramp like a sack of potatoes, and hand her over to the police. I’m pleased to say she’s up on charges now.

That woman had tremendous self-esteem. She esteemed herself so highly that she was convinced that the rules that applied to everyone else surely couldn’t apply to her—and that’s normally the kind of attitude you can count on from someone whose self-esteem has gone up into the toxic-overdose range. Yet the touchstone of excessive self-esteem, the gold standard of jerkdom, is the complete unwillingness to acknowledge the possibility that actions have consequences and you might have to deal with those, whether you want to or not.

That sort of thing is stunningly common in today’s society. It was that kind of overinflated self-esteem that convinced affluent liberals in the United States and Europe that they could spend thirty years backing policies that pandered to their interests while slamming working people face first into the gravel, without ever having to deal with the kind of blowback that arrived so dramatically in the year just past. Now Britain is on its way out of the European Union, Donald Trump is mailing invitations to his inaugural ball, and the blowback’s not finished yet. Try to point this out to the people whose choices made that blowback inevitable, though, and if my experience is anything to go by, you’ll be ignored if you’re not shouted down.

On an even greater scale, of course, there’s the conviction on the part of an astonishing number of people that we can keep on treating this planet as a combination cookie jar to raid and garbage bin to dump wastes in, and never have to deal with the consequences of that appallingly shortsighted set of policies. That’s as true in large swathes of the allegedly green end of things, by the way, as it is among the loudest proponents of smokestacks and strip mines. I’ve long since lost track of the number of people I’ve met who insist loudly on how much they love the Earth and how urgent it is that “we” protect the environment, but who aren’t willing to make a single meaningful change in their own personal consumption of resources and production of pollutants to help that happen.

Consequences don’t go away just because we don’t want to deal with them. That lesson is being taught right now on low-lying seacoasts around the world, where streets that used to be well above the high tide line reliably flood with seawater when a high tide meets an onshore wind; it’s being taught on the ice sheets of Greenland and West Antarctica, which are moving with a decidedly un-glacial rapidity through a trajectory of collapse that hasn’t been seen since the end of the last ice age; it’s being taught in a hundred half-noticed corners of an increasingly dysfunctional global economy, as the externalized costs of technological progress pile up unnoticed and drag economic activity to a halt; and of course it’s being taught, as already noted, in the capitals of the industrial world, where the neoliberal orthodoxy of the last thirty years is reeling under the blows of a furious populist backlash.

It didn’t have to be learned that way. We could have learned it from Krampus or the old Santa Claus, the one who was entirely willing to leave a badly behaved child’s stocking empty on Christmas morning except for that single eloquent lump of coal; we could have learned it from the fairy tales that taught generations of children that consequences matter; we could have learned it from any number of other sources, given a little less single-minded a fixation on maximizing self-esteem right past the red line on the meter—but enough of us didn’t learn it that way, and so here we are.

I’d therefore like to encourage those of my readers who have young children in their lives to consider going out and picking up a good old-fashioned collection of fairy tales, by Charles Perrault or the Brothers Grimm, and use those in place of the latest mass-marketed consequence-free pap when it comes to storytelling time. The children will thank you for it, and so will everyone who has to deal with them in their adult lives. Come to think of it, those of my readers who don’t happen to have young children in their lives might consider doing the same thing for their own benefit, restocking their imaginations with cannibal giants and the other distinctly unmodern conveniences thereof, and benefiting accordingly.

And if, dear reader, you are ever tempted to climb into the lap of the universe and demand that it fork over a long list of goodies, and you glance up expecting to see the jolly and long-suffering face of Santa Claus beaming down at you, don’t be too surprised if you end up staring in horror at the leering yellow eyes and lolling tongue of Krampus instead, as he ponders whether you’ve earned a thrashing with the birch switch or a ride in the wicker basket—or perhaps the great furry face of the Solstice bear, the beast of Alban Arthuan, as she blinks myopically at you for a moment before she either shoves you from her lap with one powerful paw, or tears your arm off and gnaws on it meditatively while you bleed to death on the cold, cold ground.

Because the universe doesn’t care what you think you deserve. It really doesn’t—and, by the way, the willingness of your fellow human beings to take your wants and needs into account will by and large be precisely measured by your willingness to do the same for them.

And on that utterly seasonal note, I wish all my fellow Druids a wonderful solstice; all my Christian friends and readers, a very merry Christmas; and all my readers, whatever their faith or lack thereof, a rekindling of light, hope, and sanity in a dark and troubled time.

Wednesday, December 14, 2016

Why the Peak Oil Movement Failed

As I glance back across the trajectory of this blog over the last ten and a half years, one change stands out. When I began blogging in May of 2006, peak oil—the imminent peaking of global production of conventional petroleum, to unpack that gnomic phrase a little—was the central theme of a large, vocal, and tolerably well organized movement. It had its own visible advocacy organizations, it had national and international conferences, it had a small but noticeable presence in the political sphere, and it showed every sign of making its presence felt in the broader conversation of our time.

Today none of that is true. Of the three major peak oil organizations in the US, ASPO-USA—that’s the US branch of the Association for the Study of Peak Oil and Gas, for those who don’t happen to be fluent in acronym—is apparently moribund; Post Carbon Institute, while it still plays a helpful role from time to time as a platform for veteran peak oil researcher Richard Heinberg, has otherwise largely abandoned its former peak oil focus in favor of generic liberal environmentalism; and the US branch of the Transition organization, formerly the Transition Town movement, is spinning its wheels in a rut laid down years back. The conferences ASPO-USA once hosted in Washington DC, with congresscritters in attendance, stopped years ago, and an attempt to host a national conference in southern Pennsylvania fizzled after three years and will apparently not be restarted.

Ten years ago, for that matter, opinion blogs and news aggregators with a peak oil theme were all over the internet. Today that’s no longer the case, either. The fate of the two most influential peak oil sites, The Oil Drum and Energy Bulletin, is indicative. The Oil Drum simply folded, leaving its existing pages up as a legacy of a departed era.  Energy Bulletin, for its part, was taken over by Post Carbon Institute and given a new name and theme as Resilience.org. It then followed PCI in its drift toward the already overcrowded environmental mainstream, replacing the detailed assessment of energy futures that was the staple fare of Energy Bulletin with the sort of uncritical enthusiasm for an assortment of vaguely green causes more typical of the pages of Yes! Magazine.

There are still some peak oil sites soldiering away—notably Peak Oil Barrel, under the direction of former Oil Drum regular Ron Patterson.  There are also a handful of public figures still trying to keep the concept in circulation, with the aforementioned Richard Heinberg arguably first among them. Aside from those few, though, what was once a significant movement is for all practical purposes dead. The question that deserves asking is simple enough: what happened?

One obvious answer is that the peak oil movement was the victim of its own failed predictions. It’s true, to be sure, that failed predictions were a commonplace of the peak oil scene. It wasn’t just the overenthusiastic promoters of alternative energy technologies, who year after year insisted that the next twelve months would see their pet technology leap out of its current obscurity to make petroleum a fading memory; it wasn’t just their exact equivalents, the overenthusiastic promoters of apocalyptic predictions, who year after year insisted that the next twelve months would see the collapse of the global economy, the outbreak of World War III, the imposition of a genocidal police state, or whatever other sudden cataclysm happened to have seized their fancy.

No, the problem with failed predictions ran straight through the movement, even—or especially—in its more serious manifestations. The standard model of the future accepted through most of the peak oil scene started from a set of inescapable facts and an unexamined assumption, and the combination of those things produced consistently false predictions. The inescapable facts were that the Earth is finite, that it contains a finite supply of petroleum, and that various lines of evidence showed conclusively that global production of conventional petroleum was approaching its peak for hard geological reasons, and could no longer keep increasing thereafter.

The unexamined assumption was that geological realities rather than economic forces would govern how fast the remaining reserves of conventional petroleum would be extracted. On that basis, most people in the peak oil movement assumed that as production peaked and began to decline, the price of petroleum would rise rapidly, placing an increasingly obvious burden on the global economy. The optimists in the movement argued that this, in turn, would force nations around the world to recognize what was going on and make the transition to other energy sources, and to the massive conservation programs that would be needed to deal with the gap between the cheap abundant energy that petroleum used to provide and the more expensive and less abundant energy available from other sources. The pessimists, for their part, argued that it was already too late for such a transition, and that industrial civilization would come apart at the seams.

As it turned out, though, the unexamined assumption was wrong. Geological realities imposed, and continue to impose, upper limits on global petroleum production, but economic forces have determined how much less than those upper limits would actually be produced. What happened, as a result, is that when oil prices spiked in 2007 and 2008, and then stayed high from 2011 through the middle of 2014, consumers cut back on their use of petroleum products, while producers hurried to bring marginal petroleum sources such as tar sands and oil shales into production to take advantage of the high prices. Both those steps drove prices back down. Low prices, in turn, encouraged consumers to use more petroleum products, and forced producers to shut down marginal sources that couldn’t turn a profit when oil was less than $80 a barrel; both these steps, in turn, sent prices back up.
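
For readers who like to see a mechanism spelled out, here is a minimal sketch in Python of the loop just described. Every number in it is invented for the sake of illustration, from the demand response to the eighty-dollar threshold for marginal production to the starting price; the point is simply that lagged responses to price produce repeated swings rather than the single relentless climb the standard model expected.

def simulate(years=20):
    # Toy model of the price feedback loop described above, with made-up
    # numbers: high prices trim demand and pull marginal supply online,
    # low prices do the opposite, and the lag between signal and response
    # yields boom-and-bust swings rather than one smooth climb.
    price, demand, marginal = 100.0, 100.0, 0.0  # hypothetical starting point
    for year in range(years):
        # Consumers respond to price: above roughly $70 they conserve, below it they use more.
        demand *= 1.0 - 0.1 * (price - 70.0) / 70.0
        # Producers respond too: marginal sources need roughly $80 oil to stay in business.
        marginal = marginal + 3.0 if price > 80.0 else max(0.0, marginal - 3.0)
        supply = 96.0 + marginal
        # Price moves with the gap between demand and supply.
        price = max(20.0, price + 3.0 * (demand - supply))
        yield year, price, demand, supply

for year, price, demand, supply in simulate():
    print(f"year {year:2d}: price ${price:5.1f}  demand {demand:5.1f}  supply {supply:5.1f}")

It’s a cartoon, not a forecast, but it shows why “peak oil means ever-rising prices” was never a safe bet.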

That doesn’t mean that peak oil has gone away. As oilmen like to say, depletion never sleeps; each time the world passes through the cycle just described, the global economy takes another body blow, and the marginal petroleum sources cost much more to extract and process than the light sweet crude on which the oil industry used to rely. The result, though, is that instead of a sudden upward zoom in prices that couldn’t be ignored, we’ve gotten wild swings in commodity prices, political and social turmoil, and a global economy stuck in creeping dysfunction that stubbornly refuses to behave the way it did when petroleum was still cheap and abundant. The peak oil movement wasn’t prepared for that future.

Granting all this, failed predictions aren’t enough by themselves to stop a movement in its tracks. Here in the United States, especially, we’ve got an astonishing tolerance for predictive idiocy. The economists who insisted that neoliberal policies would surely bring prosperity, for example, haven’t been laughed into obscurity by the mere fact that they were dead wrong; au contraire, they’re still drawing their paychecks and being taken seriously by politicians and the media. The pundits who insisted at the top of their lungs that Britain wouldn’t vote for Brexit and that Donald Trump couldn’t possibly win the US presidency are still being taken seriously, too. Nor, to move closer to the activist fringes, has the climate change movement been badly hurt by the embarrassingly linear models of imminent doom it used to deploy with such abandon; the climate change movement is in deep trouble, granted, but its failure has other causes.

It was the indirect impacts of those failed predictions, rather, that helped run the peak oil movement into the ground. The most important of these, to my mind, was the way that those predictions encouraged people in the movement to put their faith in the notion that sometime very soon, governments and businesses would have to take peak oil seriously. That’s what inspired ASPO-USA, for example, to set up a lobbying office in Washington DC with a paid executive director, when the long-term funding for such a project hadn’t yet been secured. On another plane, that’s what undergirded the entire strategy of the Transition Town movement in its original incarnation: get plans drawn up and officially accepted by as many town governments as possible, so that once the arrival of peak oil becomes impossible to ignore, the plan for what to do about it would already be in place.

Of course the difficulty in both cases was that the glorious day of public recognition never arrived. The movement assumed that events would prove its case in the eyes of the general public and the political system alike, and so made no realistic plans about what to do if that didn’t happen. When it didn’t happen, in turn, the movement was left twisting in the wind.

The conviction that politicians, pundits, and the public would be forced by events to acknowledge the truth about peak oil had other consequences that helped hamstring the movement. Outreach to the vast majority that wasn’t yet on board the peak oil bandwagon, for example, got far too little attention or funding. Early on in the movement, several books meant for general audiences—James Howard Kunstler’s The Long Emergency and Richard Heinberg’s The Party’s Over are arguably the best examples—helped lay the foundations for a more effective outreach program, but the organized followup that might have built on those foundations never really happened. Waiting on events took the place of shaping events, and that’s almost always a guarantee of failure.

One particular form of waiting on events that took a particularly steep toll on the movement was its attempts to get funding from wealthy donors. I’ve been told that Post Carbon Institute got itself funded in this way, while as far as I know, ASPO-USA never did. Win or lose, though, begging for scraps at the tables of the rich is a sucker’s game.  In social change as in every other aspect of life, who pays the piper calls the tune, and the rich—who benefit more than anyone else from business as usual—can be counted on to defend their interest by funding only those activities that don’t seriously threaten the continuation of business as usual. Successful movements for social change start by taking effective action with the resources they can muster by themselves, and build their own funding base by attracting people who believe in their mission strongly enough to help pay for it.

There were other reasons why the peak oil movement failed, of course. To its credit, it managed to avoid two of the factors that ran the climate change movement into the ground, as detailed in the essay linked above—it never became a partisan issue, mostly because no political party in the US was willing to touch it with a ten-foot pole, and the purity politics that insists that supporters of one cause are only acceptable in its ranks if they also subscribe to a laundry list of other causes never really got a foothold outside of certain limited circles. Piggybacking—the flipside of purity politics, which demands that no movement be allowed to solve one problem without solving every other problem as well—was more of a problem, and so, in a big way, was pandering to the privileged: I long ago lost track of the number of times I heard people in the peak oil scene insist that this or that high-end technology, which was only affordable by the well-to-do, was a meaningful response to the coming of peak oil.

There are doubtless other reasons as well; it’s a feature of all things human that failure is usually overdetermined. At this point, though, I’d like to set that aside for a moment and consider two other points. The first is that the movement didn’t have to fail the way it did. The second is that it could still be revived and gotten back on a more productive track.

To begin with, not everyone in the peak oil scene bought into the unexamined assumption I’ve critiqued above. Well before the movement started running itself into the ground, some of us pointed out that economic factors were going to have a massive impact on the rates of petroleum production and consumption—my first essay on that theme appeared here in April of 2007, and I was far from the first person to notice it. The movement by that time was so invested in its own predictions, with their apparent promise of public recognition and funding, that those concerns didn’t have an impact at the time. Even when the stratospheric oil price spike of 2008 was followed by a bust, though, peak oil organizations by and large don’t seem to have reconsidered their strategies. A mid-course correction at that point, wrenching though it might have been, could have kept the movement alive.

There were also plenty of good examples of effective movements for social change from which useful lessons could have been drawn. One difficulty is that you won’t find such examples in today’s liberal environmental mainstream, which for all practical purposes hasn’t won a battle since Richard Nixon signed the Clean Air Act. The struggle for the right to same-sex marriage, as I’ve noted before, is quite another matter—a grassroots movement that, despite sparse funding and strenuous opposition, played a long game extremely well and achieved its goal. There are other such examples, on both sides of today’s partisan divide, from which useful lessons can be drawn. Pay attention to how movements for change succeed and how they fail, and it’s not hard to figure out how to play the game effectively. That could have been done at any point in the history of the peak oil movement. It could still be done now.

Like same-sex marriage, after all, peak oil isn’t inherently a partisan issue. Like same-sex marriage, it offers plenty of room for compromise and coalition-building. Like same-sex marriage, it’s a single issue, not a fossilized total worldview like those that play so large and dysfunctional a role in today’s political nonconversations. A peak oil movement that placed itself squarely in the abandoned center of contemporary politics, played both sides against each other, and kept its eyes squarely on the prize—educating politicians and the public about the reality of finite fossil fuel reserves, and pushing for projects that will mitigate the cascading environmental and economic impacts of peak oil—could do a great deal to reshape our collective narrative about energy and, in the process, accomplish quite a bit to make the long road down from peak oil less brutal than it will otherwise be.

I’m sorry to say that the phrase “peak oil,” familiar and convenient as it is, probably has to go.  The failures of the movement that coalesced around that phrase were serious and visible enough that some new moniker will be needed for the time being, to avoid being tarred with a well-used brush. The crucial concept of net energy—the energy a given resource provides once you subtract the energy needed to extract, process, and use it—would have to be central to the first rounds of education and publicity; since it’s precisely equivalent to profit, a concept most people grasp quickly enough, that’s not necessarily a hard thing to accomplish, but it has to be done, because it’s when the concept of net energy is solidly understood that such absurdities as commercial fusion power appear in their true light.
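
Since so much hangs on that concept, a worked example may be useful. The figures below are entirely hypothetical, chosen only to make the arithmetic visible: net energy is what’s left after the energy costs of extraction, processing, and delivery are subtracted, and the ratio of energy out to energy in, usually written EROEI (energy returned on energy invested), is the energy equivalent of a profit margin.

def net_energy(energy_out, energy_in):
    # Net energy is gross energy delivered minus the energy spent getting it,
    # exactly as profit is revenue minus costs; EROEI is the ratio of the two.
    return energy_out - energy_in, energy_out / energy_in

# Purely hypothetical figures, in arbitrary units, for the sake of the arithmetic.
sources = {
    "light sweet crude (hypothetical)": (100.0, 5.0),
    "tar sands (hypothetical)": (100.0, 25.0),
    "corn ethanol (hypothetical)": (100.0, 80.0),
}

for name, (gross, invested) in sources.items():
    net, eroei = net_energy(gross, invested)
    print(f"{name:34s} net energy {net:5.1f}  EROEI {eroei:4.1f}:1")

An energy source with an EROEI close to 1:1 is the energetic equivalent of a business that spends a dollar to earn a dollar; it can keep itself busy, but it can’t pay for much of anything else.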

It probably has to be said up front that no such project will keep the end of the industrial age from being an ugly mess. That’s already baked into the cake at this point; what were once problems to be solved have become predicaments that we can, at best, only mitigate. Nor could a project of the sort I’ve very roughly sketched out here expect any kind of overnight success. It would have to play a long game in an era when time is running decidedly short. Challenging? You bet—but I think it’s a possibility worth serious consideration.

***********************
In other news, I’m delighted to announce the appearance of two books that will be of interest to readers of this blog. The first is Dmitry Orlov’s latest, Shrinking the Technosphere: Getting a Grip on the Technologies that Limit Our Autonomy, Self-Sufficiency, and Freedom. It’s a trenchant and thoughtful analysis of the gap between the fantasies of human betterment through technological progress and the antihuman mess that’s resulted from the pursuit of those fantasies, and belongs on the same shelf as Theodore Roszak’s Where the Wasteland Ends: Politics and Transcendence in Postindustrial Society and my After Progress: Religion and Reason in the Twilight of the Industrial Age. Copies hot off the press can be ordered from New Society here.

Meanwhile, Space Bats fans will want to know that the anthology of short stories and novellas set in the world of my novel Star’s Reach is now available for preorder from Founders House here. Merigan Tales is a stellar collection, as good as any of the After Oil anthologies, and fans of Star’s Reach won’t want to miss it.