A Schoolboy's Guide To War

Anthony Seldon and David Walsh: Public Schools and the Great War - The Generation Lost

The New Criterion, October 1, 2014

Richard Ropner (Harrow), Royal Machine Gun Corps

On August 3, 1914, twenty-two of England’s best public school cricketers gathered for the annual schools’ representative match. The game ended the following evening. Britain’s ultimatum to Germany expired a few hours later. Seven of those twenty-two would be dead before the war was over. Anthony Seldon and David Walsh’s fine new history of the public schools and the First World War bears the subtitle “The Generation Lost” for good reason.

Britain neither wanted nor was prepared for a continental war. Its armed forces were mainly naval or colonial. The regular army that underwrote that ultimatum was, in the words of Niall Ferguson, “a dwarf force” with “just seven divisions (including one of cavalry), compared with Germany’s ninety-eight and a half.”

Britain’s more liberal political traditions, so distinct then—and now—from those of its European neighbors, had rendered peacetime conscription out of the question, but manpower shortages during the Boer War and growing anxiety over the vulnerability of the mother country itself led to a series of military reforms designed to toughen up domestic defenses. These included the consolidation of ancient yeomanry and militias into a Territorial Force and Special Reserve. The old public-school “rifle corps,” meanwhile, were absorbed into an Officers’ Training Corps and put under direct War Office control.

Most public schools signed up for this, and by 1914 most had made “the corps” compulsory. Some took it seriously. Quite a few did not. Stuart Mais, the author of A Public School in Wartime (1916), wrote that Sherborne’s prewar OTC was seen as “a piffling waste of time . . . playing at soldiers” that got in the way of cricket. Two decades later, Adolf Hitler cited the OTC to a surprised Anthony Eden (then Britain’s foreign secretary) as evidence of the militarization of Britain’s youth. It “hardly deserved such renown,” drily recalled Eden (in his unexpectedly evocative Another World 1897–1917), “even though the light grey uniforms with their pale blue facings did give our school contingent a superficially Germanic look.”

And yet this not-very-military nation saw an astonishing response to the call for volunteers to join the fight. By the end of September 1914 over 750,000 men had enlisted. But where were the officers to come from? A number of retired officers returned to the colors, and the Territorials boasted some men with useful experience, but these came nowhere close to the numbers required. The army turned to public-school alumni to fill the gap. In theory, this was because these men had enjoyed the benefit of some degree of military training, however inadequate, with the OTC, but in truth it was based on the belief of those in charge, themselves almost always former public schoolboys, that these chaps would know what to do. Looked at one way, this was nothing more than crude class prejudice; looked at another, it made a great deal of sense. In the later years of the war, many officers (“temporary gentlemen” in the condescending expression of the day) of humbler origins rose through the ranks, but in its earlier stages the conflict was too young to have taught the army how best to judge who would lead well. In the meantime, Old Harrovians, Old Etonians, and all those other Olds would have to do.

To agree that this was not unreasonable implies a level of acceptance of the public school system at its zenith utterly at odds with some of the deepest prejudices festering in Britain today. British politics remain obsessed with class in a manner that owes more to ancient resentments than any contemporary reality. A recent incident, in which a columnist for the far left Socialist Worker made fun of the fatal mauling of an Eton schoolboy by a polar bear (“another reason to save the polar bears”), is an outlier in its cruelty, but it’s a rare week that goes by in which a public school education is not used to whip a Tory cur, as David Cameron (Eton) knows only too well.

Under the circumstances it takes courage to combine, as Seldon and Walsh do, a not-unfriendly portrait of the early twentieth-century public schools (it should be noted that both men are, or have been, public schoolmasters) with a broader analysis that implies little sympathy for the sentimental clichés that dominate current British feeling—and it is felt, deeply so—about the Great War: Wilfred Owen and all that. It is not necessary to be an admirer of the decision to enter the war or indeed of how it was fought (I am neither) to regret how Britain’s understanding of those four terrible years has been so severely distorted over the past decades. Brilliantly deceptive leftist agitprop intended to influence modern political debate has come to be confused with history.

Oh! What a Lovely War smeared the British establishment of the 1960s with the filth of Passchendaele and the Somme. Similarly, the caricature of the war contained in television productions such as Blackadder Goes Forth (1989) and The Monocled Mutineer (1986) can at least partly be read as an angry response to Mrs. Thatcher’s long ascendancy. Coincidentally or not, the late 1980s also saw the appearance of The Old Lie: The Great War and the Public-School Ethos by Peter Parker. For a caustic, literary, and intriguing—if slanted—dissection of these schools’ darker sides, Parker’s book is the place to go.

Seldon and Walsh offer a more detailed and distinctly more nuanced description of how these schools operated, handily knocking down a few clichés on the way: There were flannelled fools aplenty, but there was also the badly wounded Harold Macmillan (Eton), “intermittently” reading Aeschylus (in Greek) as he lay for days awaiting rescue in a shell hole. Aeschylus was not for all, but a glance at the letters officers wrote from the front is usually enough to shatter the myth of the ubiquitous philistine oaf. There was much more to the public schools than, to quote Harrow’s most famous song, “the tramp of the twenty-two men.”

But however harsh a critic he may be (“that men died for an ethos does not mean that the ethos was worth dying for”), Parker is too honest a writer not to acknowledge the good, sometimes heroic, qualities of these hopelessly ill-trained young officers and the bond they regularly forged across an often immense class divide with the troops that they led. “I got to know the men,” wrote my maternal grandfather Richard Ropner (Harrow, Machine Gun Corps) in an unpublished memoir half a century later, “I hope they got to know me.” In many such cases they did. It is tempting to speculate that such bonds (easier to claim, perhaps, de haut than en bas) may have been more real in the eyes of the commanders than of the commanded, but there is strong evidence to suggest that there was nothing imaginary about them. Men died for their officers. Officers died for their men.

My grandfather owned a set of memorial volumes published by Harrow in 1919. Each of the school’s war dead is commemorated with a photograph and an obituary. It is striking to see how frequently the affection in which these officers—and almost all of them were officers—were held by their men is cited. Writing about Lieutenant Robert Boyd (killed at the Somme, July 14, 1916, aged twenty-three), his company commander noted that Boyd’s “men both loved him and knew he was a good officer—two entirely different things.” This subtle point reflects the way that the public school ethos both fitted in with and smoothed the tough paternalism of the regular army into something more suited to a citizen army that now included recruits socially, temperamentally, and intellectually very different from that rough caste apart, Kipling’s “single men in barricks.”

The public schools relied heavily on older boys to maintain a regime that had come a long way from Tom Brown’s bleak start. This taught them both command and, in theory (Flashman had his successors), the obligations that came with it. The expectation that officers would both lead and care for their men was thus one for which they had already been prepared by an education designed, however haphazardly, to mold future generations of the ruling class. Contrary to what Parker might argue, these schools had not set out to groom their pupils for war. But the qualities these institutions taught—pluck, dutifulness, patriotism, athleticism (both as a good in itself and as a shaper of character), conformism, stoicism, group loyalty, and a curious mix of self-assurance and self-effacement—were to prove invaluable in the trenches, as was familiarity with a disciplined, austere, all-male lifestyle.

There was something else: The fact that many of these men had boarded away from home, often from the age of eight, and sometimes even earlier, meant that they had learned how to put on a performance for the benefit of those who watched them. A display of weakness risked transforming boarding school life into one’s own version of Lord of the Flies. That particular training stayed with them on the Western Front: “I do not hold life cheap at all,” wrote Edward Brittain (Uppingham), “and it is hard to be sufficiently brave, yet I have hardly ever felt really afraid. One has to keep up appearances at all costs even if one is.” It was all, as Macmillan put it, part of “the show.”

There are countless examples of how stiff that upper lip could be, but when Seldon and Walsh cite the example of Captain Francis Townend (Dulwich), even those accustomed to such stories have to pause and ask who these men were: “Both legs blown off by a shell and balancing himself on his stumps, [Townend] told his rescuer to tend to the men first and said that he would be all right, though he might have to give up rugby next year. He then died.”

Pastoral care was all very well, but the soldiers also knew that, unlike the much-resented staff officers, their officers took the same, or greater, risks as they did. This was primarily due to the army’s traditional suspicion that the lower orders—not to speak of the raw, half-trained recruits who appeared in the trenches after 1914—could not be trusted with anything resembling responsibility, but it also reflected the officers’ own view of what their job should be. And so, subalterns (a British army term for officers below the rank of captain), captains, majors, and even colonels led from the front, often fulfilling, particularly in the case of subalterns, a role that in other armies would be delegated to NCOs. The consequences were lethal. Making matters worse, the inequality between the classes was such that officers were on average five inches taller than their men, and, until the rules were changed in 1916, they always wore different uniforms too. The Germans knew who to shoot. The longer-term implications of this cull of the nation’s elite may have been exaggerated by Britons anxious to explain away their country’s subsequent decline, but the numbers have not: some 35,000 former public schoolboys died in the war, a large slice of a small stratum of society.

Roughly eleven percent of those who fought in the British army were killed, but, as Seldon and Walsh show, the death rate among former public schoolboys (most of whom were officers) ran at some eighteen percent. For those who left school in the years leading up to 1914 (and were thus the most likely to have served as junior officers) the toll was higher still. Nearly forty percent of the Harrow intake of the summer of 1910 (my grandfather arrived at the school the following year) were not to survive the war. Six Weeks: The Short and Gallant Life of the British Officer in the First World War (2010) by John Lewis-Stempel is an elegiac, moving, and vivid account of what awaited them. Lewis-Stempel explains his title thus: “The average time a British Army junior officer survived during the Western Front’s bloodiest phases was six weeks.”

To Lewis-Stempel, a fierce critic of those who see the war as a pointless tragedy, the bravery and determination of these young officers made them “the single most important factor in Britain’s victory on the Western Front,” a stretch, but not an altogether unreasonable one, and he is not alone in thinking this way. The British army weathered the conflict far better—and far more cohesively—than did those of the other original combatants, and effective officering played no small part in that.

Lewis-Stempel attributes much of that achievement to the “martial and patriotic spirit” of the public schools, a view of those establishments with which Parker would, ironically, agree, but that is to muddle consequence with cause. Patriotic, yes, the schools were that, as was the nation—being top dog will have that effect. But, like the rest of the country, they were considerably less “martial” than Britain’s mastery of so much of the globe would suggest. A public school education may have provided a good preparation for the trenches, but it did not pave the way to them. That so many alumni came to the defense of their country in what was seen as its hour of need ought not to form any part of any serious indictment against the schools from which they came. That they sometimes did so with an insouciance and enthusiasm that seems remarkable today was a sign not of misplaced jingoism, but of a lack of awareness that, a savage century later, it’s difficult not to envy.

And when that awareness came, they still stuck it out, determined to see the job done. Seldon and Walsh write that “It was the ability . . . to endure which underpinned the former public schoolboys’ leadership of the army and the nation.” Perhaps it would have been better if they had been less willing to endure and more willing to question, but that’s a different debate. To be sure, there was plenty of talk of the nobility of sacrifice—and of combat—but, for the most part, that was evidence not of a death wish or any sort of bloodlust, but of the all too human need to put what they were doing, and what they had lost, into finer words and grander context.

And that they clung so closely to memories of the old school—to an extent that seems extraordinary today—should come as no surprise. These were often very young men, often barely out of their teens. School, especially for those who had boarded, had been a major part of their lives, psychologically as well as chronologically. “School,” wrote Robert Graves (Charterhouse), “became the reality, and home life the illusion.” And now its memory became something to cherish amid the mad landscape of war.

They wrote to their schoolmasters and their schoolmasters wrote to them. They returned to school on leave and they devoured their school magazines. They fought alongside those who had been to the same schools and they gave their billets familiar school names. They met up for sometimes astoundingly lavish old boys’ dinners behind the lines, including one attended by seventy Wykehamists to discuss the proposed Winchester war memorial. The names of three of the subalterns present would, Seldon and Walsh note, eventually be recorded on it.

So far as is possible given what they are describing, these two authors tell this story dispassionately. Theirs is a calm, thoroughly researched work, lacking the emotional excesses that are such a recurring feature of the continuing British argument over the Great War. That said, this book’s largely uninterrupted sequence of understandably admiring tales could have done with just a bit more counterbalance. For that, try reading the recently published diaries written in a Casualty Clearing Station by the Earl of Crawford (Private Lord Crawford’s Great War Diaries: From Medical Orderly to Cabinet Minister), with their grumbling about “ignorant and childish” young officers arousing “panic among the men [with] their wild and dangerous notions.”

Doubtless the decision by Captain Billy Neville (Dover College) to arrange for his platoons to go over the top on the first day of the Somme kicking soccer balls is something that Crawford would have included amongst the “puerile and fantastic nonsense” he associated with such officers. Seldon and Walsh, by contrast, see this—and plausibly so—not as an example of Henry Newbolt’s instruction to “Play up! play up! and play the game!” being followed to a lunatic degree, but rather as an astute attempt by Neville to give his soldiers some psychological support. “His aim was to make his men, who he knew would be afraid, more comfortable.” Better to think of those soccer balls than the enemy machine guns waiting just ahead. Nineteen thousand British troops were killed that day, including Neville. He was twenty-one.

Crawford was a hard-headed, acerbic, and clever Conservative, but occasionally his inner curmudgeon overwhelmed subtler understanding, as, maybe, did his location behind the lines, fifteen miles from where these officers shone. Nevertheless, one running theme of his diaries, the luxuries that some of them allowed themselves (“yesterday a smart young officer in a lofty dogcart drove a spanking pair of polo ponies tandem past our gate”) touches on a broader topic—the stark difference in the ways that officers and men were treated—that deserves more attention than it gets in Public Schools and the Great War. Even the most junior officers were allocated a “batman” (a servant). They were given more leave, were paid a great deal more generously, and, when possible, were fed far better and housed much more comfortably than their men. Even in a more deferential age, this must have rankled. Perhaps unsurprisingly, Parker dwells on this issue in more detail than Seldon and Walsh, but, fair-minded again, agrees that what truly counted with the troops was the fact that “when it came to battle [the young officers’] circumstances were very much the same as their own.”

They died together. And they are buried together, too, not far from where they fell. As the founder of the Imperial War Graves Commission explained, “in ninety-nine cases out of a hundred, the officers will tell you that, if they are killed, they would wish to be among their men.”

A century later, that’s where they still are.

Richard Ropner - Random Recollections (unpublished - 1965)

The Kremlin Mountaineer

Paul Johnson: Stalin - The Kremlin Mountaineer

The Wall Street Journal, May 23, 2014

In the months leading up to the Bolshevik Revolution, Joseph Stalin was, recalled one fellow revolutionary, no more than a “gray blur.” The quiet inscrutability of this controlled, taciturn figure eventually helped ease his path to some murky place in the West’s understanding of the past, a place where memory of the horror he unleashed was quick to fade. Pete Seeger sang for Stalin? Was that so bad?

This bothers Paul Johnson, the British writer, historian, and journalist. Hitler, he notes with dry understatement, is “frequently in the mass media.” Mao’s memory “is kept alive by the continuing rise . . . of the communist state he created.” But “Stalin has receded into the shadows.” Mr. Johnson worries that “among the young [Stalin] is insufficiently known”; he might have added that a good number of the middle-aged and even the old don’t have much of a clue as to who, and what, Stalin was either.

Mr. Johnson’s “Stalin: The Kremlin Mountaineer” is intended to put that right. In this short book he neatly sets out the arc of a career that took Soso Dzhugashvili from poverty in the Caucasus to mastery of an empire. We see the young Stalin as an emerging revolutionary, appreciated by Lenin for his smarts, organizational skills and willingness to resort to violence. Stalin, gushed Lenin, was a “man of action” rather than a “tea-drinker.” Hard-working and effective, he was made party general secretary a few years after the revolution, a job that contained within it (as Mr. Johnson points out) the path to a personal dictatorship. After Lenin’s 1924 death, Stalin maneuvered his way over the careers and corpses of rivals to a dominance that he was never to lose, buttressed by a cult of personality detached from anything approaching reason.

Mr. Johnson does not stint on the personal details (Stalin’s charm, when he wanted it, for example, and his dark humor), but the usual historical episodes make their appearance: collectivization, famine, Gulag, purges, the Great Terror, the pact with Hitler, war with Hitler, the enslavement of Eastern Europe, Cold War, the paranoid twilight planning of fresh nightmares, and a death toll that “cannot be less than twenty million.” That estimate may, appallingly, be on the conservative side.

Amid this hideous chronicle are unexpected insights. Lenin’s late breach with Stalin, Mr. Johnson observes, was as much over manners as anything else: “a rebuke from a member of the gentry to a proletarian lout.” And sometimes there is the extra piece of information that throws light into the terrible darkness. Recounting the 1940 massacre of Polish officers at Katyn, Mr. Johnson names the man responsible for organizing the shootings—V.M. Blokhin. He probably committed “more individual killings than any other man in history,” reckons Mr. Johnson. Ask yourself if you have even heard his name before.

To be sure, Mr. Johnson’s “Stalin” will not offer much that is new to anyone already familiar with its subject’s grim record. It is a very slender volume—a monograph really. Inevitably in a book this small on a subject this large, the author paints with broad strokes, sacrificing some accuracy along the way. Despite that, this book makes a fine “Stalin for Beginners.”

As Mr. Johnson’s vivid prose rolls on, the gray blur is replaced by a hard-edged reality. Stalin’s published writings were turgid, and he was no orator, but there was nothing dull about his intellect or cold, meticulous determination. As for his own creed, Mr. Johnson regards him as “a man born to believe,” one of the Marxist faithful, and maybe Stalin, the ex-seminarian, was indeed that: Clever people can find truth in very peculiar places.

But what he was not, contrary to the ludicrous, but persistent, myth of good Bolshevik intentions gone astray, was the betrayer of Lenin’s revolution. As Mr. Johnson explains, Stalinist terror “was merely an extension of Lenin’s.” Shortly before the end of his immensely long life, Stalin’s former foreign minister (and a great deal else besides), Vyacheslav Molotov, reminisced that “compared to Lenin” his old boss “was a mere lamb.” Perhaps even more so than those of Stalin, Lenin’s atrocities remain too little known.

Over to you, Mr. Johnson.

A Vulnerable Equilibrium

Jens Nordvig: The Fall of The Euro

National Review Online, April 29, 2014

He ate poisoned cakes and he drank poisoned wine, and he was shot and bludgeoned just to make sure, but still Rasputin lived on. And that gives me just enough of an excuse to use the mad, almost indestructible monk to begin an article about a mad, possibly indestructible currency. The euro has crushed economies, wrecked lives, toppled governments, broken its own rule book, made a mockery of democracy, defied market economics, and yet it endures, kept alive by the political will of the EU’s elite, fear of the alternative, and the magic of a few words from Mario Draghi, the president of the European Central Bank (ECB) back in July 2012.

Speaking to an investment conference, Draghi said that, “within our mandate” (a salute to watchful Germans), the ECB was “ready to do whatever it takes to preserve the euro.” “Believe me,” he added, “it will be enough.” Those few words, and their implication of dramatic market intervention, did the trick. Financial markets calmed down, and there are now even faint hints of economic recovery in the worst corners of the euro zone’s ER. And all this has happened without the ECB’s actually doing anything. Simply sending a signal sufficed.

The crisis has been declared over by the same Brussels clown posse that always declares the crisis over. They may be right, they may be wrong, but a calm of sorts has descended on the euro zone — not peace exactly, but quiet, punctuated occasionally by tremors that may be aftershocks, but could be omens of fresh chaos ahead.

That makes this a good time to take a look at The Fall of the Euro, a guide to the EU’s vampire currency by Jens Nordvig, global head of currency strategy for the Japanese investment bank Nomura Securities. If you are looking for a quick, clear, accessible account, free from financial mumbo-jumbo, that explains how the euro came to be, why trouble was always headed its way, what was done when the storm broke, and what might happen next, this book (which was published last autumn) is an excellent place to start.

It is written from the point of view of a market practitioner. Nordvig is not too fussed about the deeper European debate. He mainly wants to know what works. Here and there he will nod politely to democratic niceties, but this is a book where worries over lost sovereignty are dismissed as “sentimental.” Overall, Nordvig is a supporter of closer European integration (“a noble ideal,” he maintains — it isn’t, but that’s another story), but one with considerably less time for illusions than most in his camp.

And the euro, he argues, was built — and run — on illusions, the illusion that Germany was Italy, Italy was Portugal, and Portugal was Finland, the illusion that one size would fit all. Its creation was a “reckless gamble.” Politics prevailed over economics. No one made any preparations for the rainy day that could never come. The foundations for catastrophe were laid, and then built on by regulators, policymakers, and financial-market players only too happy to believe that the impossible was possible. Imbalance was piled on imbalance, and a shared currency masked the nightmare developing underneath. Employed by Goldman Sachs at the time, Nordvig saw how markets viewed the euro zone as an indivisible whole. But Greece was still Greece. And Germany was still Germany.

“Policy makers,” writes Nordvig, “can attempt to circumvent the basic laws of economics, but over time, the core economic truths take their revenge.” Unsustainable boom was followed by what has seemed, until recently, like permanent bust.

Nordvig does a fine job of explaining how the euro zone has been kept intact since the storm first broke, but he focuses more on the how than on the implications. Thus he relates how some of what has been done appears to “circumvent” a clear legal prohibition on European Central Bank financing of public-sector deficits, but seems to see that as more of a curiosity than cause for concern. But concern is called for: The EU’s combination of lawlessness at the top (remember how the Lisbon Treaty was used to “circumvent” those French and Dutch referenda) and tight control over everyone else has been a hallmark of tyranny through the ages.

Then again, financial types generally focus, understandably enough, on the financial rather than the political. But when the two look to be at risk of colliding, market attention shifts. Nordvig suspects that the euro zone may be getting closer to one of those moments.

He sees the euro zone as having emerged from its travails into what is now a state of “vulnerable equilibrium.” But to work properly, it needs substantially deeper fiscal and budgetary integration — something resembling the set-up that underpins monetary union in the U.S. He’s right about that, and the fact that he is goes a long way toward explaining why euroskeptics are so opposed to the single currency. A realist, Nordvig concedes that the political support for such a step is simply not there, and he’s right about that too. New Yorkers might grumble about the way that, courtesy of the federal government, they effectively send cash to Mississippi, but they accept that their two states are in the same American boat. Germans look across at the Greeks (and other mendicants) and realize that they have been conned into bailing out a bunch of foreigners. That’s why, when Germany accepted the need for some sort of fiscal union to keep the euro zone in one piece, it insisted (as Nordvig explains) on an arrangement that falls far short of how such a union is usually understood. The Fiscal Compact that ensued is intended to minimize deficit spending in euro-zone member states rather than give Brussels additional spending power, spending power that could have been used to help out the battered periphery. It is no “transfer union.”

All that is left for the euro zone’s weaker performers is yet more austerity (sensibly enough, Nordvig sees the current currency regime as akin to a gold standard, and not in a good way), adding further bite to the deflationary crunch which these countries face. And it’s a crunch made worse by the perception, both fair and unfair, that it is being imposed on them from “abroad.” Greece is not Germany. And nor is France.

With bailouts resented in the euro zone’s more prosperous north, and austerity loathed elsewhere, it’s surprising how passive voters have been. There are plenty of explanations for this, but Nordvig is right to stress fear of the turbulence that abandoning the euro might unleash (a fear reinforced by establishment propaganda and the failure of many of the euro’s critics to articulate a credible alternative). A residual attachment to that “noble ideal” of closer European union has also played a part, as has, Nordvig notes, the determination of the dominant parties of center right and center left to hang onto the single currency. That’s something that has left anti-euro, but otherwise mainstream, voters struggling to find an outlet for their discontent.

That said, the prolonged economic grind is increasingly forcing voters in the direction of less respectable parties (such as France’s Front National) that believe that the euro zone and EU need much more than a mild course correction (the FN would pull France out of the euro). If these parties gain significant ground in May’s elections to the EU parliament (the betting is that they will), the danger (or opportunity) is not that they will overthrow the prevailing consensus in the EU parliament (they have neither the numbers nor the cohesion to do that), but that their success will shove their mainstream opponents in a more euroskeptic direction back home. Credibly enough, Nordvig identifies the possibility of a revolt within the political center (which could take very different forms: The Finns, say, may decline to support another bailout, while the Greeks might eventually turn away from austerity) as another potential block on the road to the closer integration that the single currency needs.

Even if the euro zone’s leadership does manage to fumble its way to agreeing on how closer integration could be secured — a deal that would inevitably involve massive transfers of sovereignty to Brussels — it will be hard to push such a package through without the approval of a referendum or two. On past form, and in the electorate’s present mood, that approval will not be easily won.

But, warns Nordvig, “if further integration is not feasible, some form of breakup is inevitable.” Nordvig may be sympathetic to the European project, but he is too much of a realist to pay too much attention to the Brussels myth that there is no alternative to preserving the euro “as is.” Specifically, he rejects the argument that, just because a “full-blown” breakup would be cataclysmic (as Nordvig convincingly shows, it could well be), all forms of breakup must be too. That’s a claim he heretically and correctly regards as little more than “a convenient tool to bind the euro zone together” and one, moreover, that has been used to stifle any proper analysis of what the costs and benefits of, say, a particular country’s quitting the euro might be. Such a departure, he believes, could be engineered “without intolerable pain.”

To understand what Nordvig means by this, pay attention to his observation that “the cost of exit may be more concentrated around the transition phase, while the cost of sticking with the euro accumulates gradually over time.” Jumping out of a burning building is never easy, but it often beats the alternative.

Nordvig deftly summarizes what the costs and benefits of that jump might be, concluding that quitting the euro would be very tough for Ireland, Greece, Portugal, and Spain, easier than perhaps expected for France and Italy, and easiest (although far from problem-free) for Germany (I’d agree). That’s a position that logically takes him not too far (although he doesn’t quite arrive there) from support for a division of the single currency into northern and southern euros, something that has, in my view, long been the way to go. According to Nordvig, however, the most likely quitter is a country reduced to a state of such excruciating agony (not only in that burning building, but on fire) that exiting the euro finally comes onto the agenda. That is highly unlikely to be Germany, the nation most able to cope, inside the euro and out.

So what happens next? Suitably cautious in the face of such an uncertain environment, Nordvig lays out a number of different scenarios. While accepting, as he should, that political turmoil could upend everything, Nordvig appears, on balance, to conclude that the German austerity model will prevail, that a transfer union will be avoided, and that the euro zone’s laggards will trudge their way to an excruciatingly slow recovery. My own suspicion is that this assumes too much patience on the part of the periphery. Pushed both by common sense and fear of an increasingly unruly electorate, its governments will start a slow-motion revolt against what remains of the hard-money ECB that the Germans were once promised. Still in thrall to the cult of “ever closer union,” and terrified of the alternatives, Germany’s leadership will acquiesce. In fact there are clear signs that this process may be well underway.

This will lead to another of the scenarios sketched out by Nordvig. Loose money will try to fill some of the gap left by the transfer union that never was, and will do so just well enough to enable the euro to survive, but as a currency that is more lira than deutsche mark. That will be yet another betrayal of taxpayers in Europe’s north, while leaving the continent’s south still trapped in a system that does not fit.

And for what?

Hunger For Truth

Ray Gamache: Gareth Jones - Eyewitness to the Holodomor

The Weekly Standard, March 24, 2014

For decades, the notebooks of Gareth Jones (1905-35), a brilliant young Welshman murdered in Japanese-occupied Manchuria, were stashed away in his family’s house in South Wales, only to be retrieved by his niece, Siriol Colley, in the early 1990s. By that time, Jones, once a highly promising journalist and an aide to a rather better-known Welshman, David Lloyd George, had largely vanished from history.  But two books that appeared around then, Robert Conquest’s The Harvest of Sorrow (1986) and Sally J. Taylor’s Stalin’s Apologist (1990), gave a hint of what was to come.

In the first, a groundbreaking account of the manufactured famine that devastated Soviet Ukraine in 1932-33, Conquest told how Jones had gotten off a Kharkov-bound train, tramped through the broken Ukrainian countryside, and, on his return to the West, sounded the alarm about what Ukrainians now call the Holodomor (literally, to “kill by hunger”). Conquest explained how Jones’s “honorable and honest reporting” was trashed not only by Soviet officialdom, but also by Western journalists in the Soviet capital, a squalid episode discussed in more depth in Stalin’s Apologist, a biography of Walter Duranty, the New York Times’s Pulitzer Prize-winning correspondent in Moscow.

Duranty, whose relationship with the Stalin regime fueled a very well-paid career, took the lead in discrediting Jones. Claims of famine were “exaggeration” or, worse, “malignant propaganda.” Jones hit back, but to little avail. With just two years of life remaining to him, the path for his descent into historical oblivion was set. As for those three, four, five, maybe more, million deaths—well, so far as the West was concerned, nothing on that scale had happened. Sure, something bad had taken place, but to borrow Duranty’s term, there’s no omelet without breaking eggs; that’s how it goes.

It says something about the extent to which the Ukrainian genocide had been erased from Western memory that when Colley went through her uncle’s notebooks—the scribbled source material for the best English-language eyewitness reports of the famine—what caught her eye most (admittedly it had long been the source of family speculation) were later sections relating to what would ultimately be his murder in Manchuria. That was the topic that became the subject of Colley’s first book, Gareth Jones—A Manchukuo Incident (2001), a privately published volume in which only a page or two was reserved for Ukraine.

Times change. The reappearance of Gareth Jones was accelerated by the determination of many Ukrainians—free at last from imposed Soviet silence—to understand their own history. The investigation of a family tragedy broadened into an effort, helped by supportive members of the Ukrainian diaspora, to rediscover a journalist whose long-forgotten writing could be used to shape this newly independent nation’s sense of self and, more specifically, to help pull it away from Russia’s grip. It is no coincidence that Gareth Jones was posthumously awarded Ukraine’s Order of Merit at a time when Viktor Yushchenko, the most pro-Western of Ukraine’s presidents up until now, was in charge. 

By then, Siriol Colley had written More Than a Grain of Truth (2005), a biography (again self-published) of her uncle, offering a fuller portrait of a man who was a blend of Zelig—on a plane with Adolf Hitler, at San Simeon with William Randolph Hearst, you name it—and Cassandra, warning of nightmares to come. Meanwhile, a website (Garethjones.org) developed by Colley’s son Nigel had evolved into an invaluable online resource for anyone wanting to know more. Interest in Jones has continued to grow. A steady flow of stories in the British press, a documentary for the BBC, an exhibition at his old Cambridge college, and much else besides, were evidence that Jones was re-entering history beyond the frontiers of Ukraine—history that (as related in the West) finally had room for the Holodomor. This shift boosted interest in Jones, but was also, in a virtuous circle, partly the product of the rediscovery of his account of that hidden genocide, an account written in accessible English rather than a Slavic tongue.

But the reemergence of Jones does not diminish the darkness that accompanied his original eclipse, a darkness that runs through Gareth Jones: Eyewitness to the Holodomor. Ray Gamache’s work does not pretend to be a comprehensive biographical study, although it features enough helpful detail to act as a reasonable introduction to Jones’s extraordinary life. And it handily knocks down a few myths along the way. To name but two, the notion of a plot by the Moscow correspondents (such as it was) should not be overstated. And Jones did not sneak onto that train to Kharkov (his journey had official approval); it was where he got off—in the middle of nowhere, into the middle of hell—that was unauthorized.

That said, this fine book’s central focus is something more specific, a perceptive, methodical, and diligently forensic examination of the articles that Gareth Jones wrote about the Soviet Union, the circumstances in which they were written, the message they were designed to deliver, and, critically, their overall reliability. The reader is left in no doubt that this courageous, intensely moral man, an exemplar of the Welsh Nonconformist conscience at its best, saw the horrors he so meticulously chronicled in his notebooks and to which he then bore witness in his journalism: “This ruin I saw in its grim reality. . . . I saw children with swollen bellies.”

This is an academic book and thus not entirely free of jargon (“journalism texts are linguistic representations of reality”) or the contemplation of topics, such as the journalistic ethics of Jones’s giving food to the starving, likely to be of scant interest off-campus. That said, Gamache’s shrewd, careful work gives an excellent sense of Jones’s powerful analytical skills and the layers of meaning contained in his plain, unvarnished prose.

Above all, this book forcefully conveys Jones’s foreboding that something wicked was headed towards the peasantry. A leftish liberal in that early-20th-century way, he had had a degree of sympathy with the professed ideals of the Bolshevik Revolution; but then, as he wrote later, “I went to Russia.” And while he found things to admire in the Soviet Union, the underlying structure of its society appalled him. He saw a ruthless Communist party astride a hierarchy of which the peasantry—relics of the past who were of use, mainly, to feed the industrial proletariat—were at the bottom. With the dislocation, the fanaticism, and the failures of the first Five Year Plan becoming increasingly obvious, Jones knew who would pay the price. References to the danger of famine began to surface in his reporting, and in October 1932 he wrote two pieces for Cardiff’s Western Mail under the headline “Will there be soup?” In March 1933, Jones returned to the Soviet Union to find out. The rest is history.

That it took so long to be recognized as such, however, was due to more than Soviet disinformation and Walter Duranty’s lies. For as dishonest and influential as that campaign by Duranty was, some of it, even on its face, did not ring quite true—not least the tortured circumlocutions with which he buttressed his denials of famine. Writing in the New York Times, Duranty conceded that, yes, there had been an increase in the death rate, but “not so much from actual starvation as from manifold disease due to lowered resistance.”

Phraseology like that was enough to fool only those who wanted to be fooled, and there were plenty in the West ready to give the Soviet Union the benefit of the doubt. Many more simply did not care. The broad outline of what was happening, if not its details, was there for anyone prepared to look. To take just a few examples, there was the reporting of Jones and a handful of others (including Malcolm Muggeridge, whose role vis-à-vis Jones was, as Gamache reminds us, a complex one); there were the stories filtering out through the diaspora; there was the relief effort being attempted by Austria’s Cardinal Innitzer. But few took much interest. After all, said Duranty later, the dead were “only Russians,” a faraway, alien people who didn’t, apparently, count for a great deal.

And there was something else. Gamache records how the Foreign Office, which had access to good information of its own about the famine, deliberately kept quiet, worried about some British engineers then being held by the Soviets—by July 1933, all had been released—and, more broadly, about damaging Britain’s relations with the USSR, a concern sharpened, Gamache suggests (perhaps too charitably), by Hitler’s arrival in power earlier that year.

Looking across the Atlantic, Gamache notes, it has been argued that plans by the Roosevelt administration to extend diplomatic recognition to the Soviet Union may well have led Washington to downplay the famine. In any event, the United States and the Soviet Union agreed to establish formal diplomatic relations in November 1933, an event fêted with a lavish dinner at the Waldorf-Astoria, where Walter Duranty was a guest of honor. In a nod to the cuisine of the Soviet homeland, borscht, a traditional Ukrainian dish as it happens, was on the menu. That evening, at least, there was soup.

Through A Glass, Very Darkly

Mark Schrad: Vodka Politics - Alcohol, Autocracy, and the Secret History of the Russian State

National Review, March 24, 2014

If you cannot face going to Russia to see the real thing — in a dank Moscow underpass perhaps, or a broken attempt at a village — the best introduction to that nation’s drinking culture is to meet up with Venya, the narrator of Venedikt Erofeev’s Moscow to the End of the Line, a strange, bleakly comic, forbidden masterpiece of the early Brezhnev era. In the course of its first page, he drinks four vodkas, two beers, port “straight from the bottle,” and then, more vaguely, “something else.” It’s downhill from there.

In meandering, chaotic prose, Erofeev describes a drink-sodden, phantasmagoric train journey, punctuated by depictions of decay, echoes of Russia’s past, and recipes for cocktails that would make Appalachia blanch. With luck, no Russians ever drank “The Spirit of Geneva” (White Lilac, athlete’s foot remedy, Zhiguli beer, and alcohol varnish), but, as Mark Schrad, an assistant professor at Villanova University, notes in his absorbing, no less drink-sodden, not much less meandering, and even more horrifying Vodka Politics, they came close, not least during the time when Mikhail Gorbachev was cracking down on alcohol production:

“The most hard-up drinkers turned to alcohol surrogates: from mouthwash, eau de cologne, and perfume to gasoline, cockroach poison, brake fluid, medical adhesives, and even shoe polish on a slice of bread [a recipe that requires some additional preparation]. In the city of Volgodonsk, five died from drinking ethylene glycol, which is used in antifreeze. In the military, some set their thirsty sights on the Soviet MiG-25, which — due to the large quantities of alcohol in its hydraulic systems and fuel stores — was affectionately dubbed the ‘flying restaurant.’”

It’s difficult not to smile at that, but then thoughts turn to those five dead in Volgodonsk, just a tiny fraction of a death toll from alcohol poisoning that ran into the tens of thousands, casualties of the burgeoning zapoi (a binge that lasts days or weeks) that finally lurched out of control during the economic implosion that followed the Soviet collapse. Post-Soviet Russia had little realistic alternative to the principle of shock therapy (how it was carried out is a different matter), but Schrad is right to stress the depths to which the country sank. The bottle, often filled with dubious black-market hooch, was one of the few sources of solace left. This, rather literally, added further fuel to the fire already raging through Russia’s demographics: “Average life expectancy for men — 65 at the height of [Gorbachev’s] anti-alcohol campaign in 1987 — plummeted to only 62 in 1992. Two years later, it dipped below 58.”

According to Schrad, “the best estimates are that in the 1990s, Russians quaffed some 15 to 16 liters of pure alcohol annually,” a figure that, tellingly, does not appear to be so different today, and is well above the “eight-liter maximum the World Health Organization deems safe.” These are per capita data, kindly averages that mask the extent to which it is mainly men who are drinking to excess, a fact that helps explain why Ivan can expect to live some ten years less than Natasha.

All those liters might alarm even skeptics legitimately suspicious of the WHO’s nannyish side, but the results for other nations offer context, if not reassurance. According to the WHO, Russia’s alcohol consumption in 2011 was near the top of the international  range, but it was far from the only country to cross the eight-liter threshold (the U.S. clocked in at 9.44 liters, the U.K. at 13.37). Beyond obvious differences in standards of health care, Russia’s catastrophe was clearly due to something subtler than the overall volume of alcohol consumed. What may have mattered more is that so much (6.88 liters) of the Russian tally was accounted for by spirits (the U.K., no stranger to the binge, came in at 2.41). That suggests that what is drunk (and, more specifically, how it is drunk) counts. Schrad quotes another Erofeev, the contemporary writer Viktor: “The result, not the process, is what’s important. You might as well inject vodka into your bloodstream as drink it.”

A disaster of this magnitude — on some estimates as many as half a million Russians each year are dying as a consequence, directly or indirectly, of alcohol abuse — was not the product of economic implosion alone. Other hard-drinking countries in the former Soviet space went through comparable traumas in the 1990s. But none, with the possible exception of Ukraine, a land long exposed to the worst pathologies of Russian rule, tippled quite so far over the edge. There was something that singled Russia out, and it predated the collapse of Communism by a very long time. As early as 1967, the rapid growth of alcohol consumption in the post-war years — something helped along by growing prosperity and a state that had replaced the anti-alcohol militancy of the earlier Soviet period with a sharp appreciation for the revenues that vodka brought in — had taken annual per capita consumption of pure alcohol (exclusive of bootleg samogon) to 9.1 liters. Drinking was an accessible pleasure for a society that had cash, but — in a still-austere Soviet Union — not much to spend it on. There was also something else: Vodka may have been used to soothe the pain associated with the collapse of Communism, but it had also been a way of anaesthetizing people through the dreary decades that preceded that long-overdue change, decades in which aspiration was stifled, life was hard, and futility was the norm. Under the circumstances, why not drink up?

But boozing one’s way through Brezhnev was also a reversion to older patterns of behavior that the early Bolsheviks — in some respects a puritanical bunch — believed they had swept away for good. In 1913, the wicked old empire’s per capita consumption stood at just under that perilous eight liters, and that was less of a gulp than the swigging that had preceded it a few years before. Vodka was not only a familiar presence in the Russian troika, it had also become one of its drivers, a wild, erratic, and destructive driver, to be sure, but one so powerful that attempts to unseat it contributed to the fall of both Gorbachev and (Schrad makes the case well) quite possibly the last czar too.

How demon drink grabbed the reins is the question that lies at the heart of Vodka Politics, which comes with the subtitle “Alcohol, Autocracy, and the Secret History of the Russian State.” That “secret” is something of an overstatement (as Schrad acknowledges, this is far from being the first work on this topic), but it is a claim not inconsistent with his occasionally excitable style (“While the cold wind howled beyond the Kremlin walls . . . ”). As told by a chatty and engaging author, this is Russia’s past seen, one might say, through the bottom of a glass, a perspective that is certainly skewed (sometimes too much so — the liberal Decembrist rebels of 1825 were rather more than an “inebriated Petersburg mob”) but is undeniably fascinating and often enlightening.

Schrad’s central thesis is simple enough: It is the tale of vodka as a “dramatic technological leap” (like a cannon, it has been said, compared with the “bows and arrows” of wine, beer, and more traditional drinks) that was adopted by Russia’s rulers (and how — if it’s epics of alcoholic excess that you are after, this is the book for you) and then ruthlessly exploited by them, a story that, with brief interruptions, has continued essentially unchanged for more than half a millennium.

Ivan III (reigned 1462–1505) was the first to establish a state monopoly on distillation, but Schrad prefers to credit his grandson, a rather more terrible Ivan, with “being perhaps the first to realize the tremendous potential of the liquor trade.” In between debauches and atrocities (if you are on the hunt for Grand Guignol, chronicled with faintly unseemly relish, this is also the book for you), he outlawed privately held taverns and replaced them with state-run kabaks. This was the next stage in, as Schrad describes it, the evolution of a system of “macabre beauty” under which the state built itself up by using mechanisms designed to increase the dependence of its subjects on a product — cheap to produce, profitable to sell, potent to consume — that gave them the illusion of release only to enslave them still further.

The catch was that the state itself became dependent on this dependency. At the height of the Russian empire, vodka funded a third of what became known as the state’s “drunken budget.” Toward the end of Soviet rule, vodka’s contribution was roughly a quarter. And vodka was a pleasure too tempting to be confined to those at the bottom of the heap. It seeped through all social classes, high and low, at immense cost to the country’s progress then and now, its spread facilitated by the unwillingness of the czarist and Soviet regimes to allow room for a civil society strong enough to push back, not to speak of their failure to nurture a nation in which the bottle would not seem like quite such an attractive escape.

So what now? With Russia’s economy in somewhat better shape and (thanks, primarily, to higher oil prices) vodka’s percentage contribution to the state’s income having shrunk to comparatively modest mid single digits, the chances — one might think — ought to be good that something serious could be done to address a public-health cataclysm that has not gone away. After all, the ostentatiously sober Vladimir Putin is at the helm. Some measures have indeed been taken, but this is still Russia, a top-down place where the people come last, where vodka profits accrue to the state and to the well-connected, a country where outside, maybe, some metropolitan centers, hope remains in short supply.

Moscow to the End of the Line draws to a close with the drunken Venya missing his stop and returning to the point at which he began.

And then things get worse.

Looking On The Bright Side

Josef Joffe: The Myth of America’s Decline - Politics, Economics, and a Half Century of False Prophecies

National Review, November 26, 2013 (December 16, 2013 Issue)

Statue of Liberty, June 2009 © Andrew Stuttaford

There is usually a moment in the course of a typical English picnic of drizzle, hard-boiled eggs, and chill when someone looks up at the gray, unyielding sky and brightly announces that the weather is “clearing up.” If Josef Joffe attended English picnics, he would be that someone.

In this cheery take on America’s prospects, Joffe, the editor of Die Zeit, looks around and ahead and decides that, for all its problems, the U.S. will do just fine. He reminds us that pundits and politicians have been awaiting the end of America since its beginning. In itself, of course, this proves nothing: Time passes, facts change; what once was set in stone ends up slithering on sand. Joffe takes care to say that the failure of previous prophecies of America’s decline to come true “does not mean that [one] never will,” but, given the broader themes of this book, those words — and a handful of others like them — are the equivalent of the quick-fire muttering that accompanies some car commercials, caveats that no one is meant to notice.

Joffe, a shrewd and subtle analyst, is on firmer ground when he turns his attention to the nature, origins, and history of “declinism.” Predictions of an American tumble, he argues, frequently owe more to the dreams, fears, or ambitions of those who made them than to any reasonable calculation of what the future might hold. There have always been those, abroad, who have taken comfort in the thought that this over-mighty giant — and dangerous inspiration — might be faltering. Here at home, however, prophecies of doom are often intended to be self-defeating, designed to change behavior — enough already with the twerking, enough already with the neglect of missile defense — that would otherwise lead to catastrophe.

And declinism is a powerful political tool (fear sells) that has long been used and abused. Joffe relates how insurgent presidential candidates have a habit of basing their campaigns on existential threats that have a way of disappearing by the time, four years later, that the insurgent-turned-incumbent, “first Jeremiah, now redeemer,” is seeking reelection by a country where it is, again, morning. This record of apocalyptic bunkum does not mean that every politician’s prediction of approaching Armageddon can safely be ignored, but skepticism is generally a better response than panic.

Next, Joffe asks if there is any country in a position to topple America from its “towering perch,” a perch that is, he shows, far loftier than widely imagined. By contrast, Britain, even at its imperial peak, was merely first among some fairly grand equals. Joffe again buttresses his argument with the wreckage of earlier predictions — that Japan would overtake America, that Europe (Europe!) would fly by, that the Soviets would bury us — before turning a bracingly cold eye on China. The starting point of his enjoyably iconoclastic take on this latest contender is a blend of math and history — “as the baseline goes higher, as economies mature, growth slows” — but it quickly evolves into a perceptive critique of authoritarian modernization (and particularly its Chinese variant) that would make Thomas Friedman very unhappy indeed. Imagining a Chinese Sorpasso any time soon is, maintains Joffe, an extrapolation too far.

What is true of the economic contest is, broadly, true of the military race too. Joffe acknowledges, as he must in the wake of 9/11, “the power of the weak,” but concludes — too sanguine, perhaps, about the equalizing effects of technology — that America is so far ahead of its rivals “that it plays in a league of its own,” and it does so more cannily (“on top, not in control”) and, if not exactly on the cheap, more frugally (amazing, but true, despite those famous Pentagon toilet seats) than the alpha nations that preceded it. America may one day abdicate (Joffe highlights Obama’s combination of “reticence” abroad with “nation-building” at home), but it is unlikely to be imperial overstretch that brings it down.

A drawback of Joffe’s focus on the competition is that it allows relative strength to obscure absolute decay, an error avoided by Alan Simpson when the former senator compared the fiscal condition of the U.S. with that of some European nations. America was, he said, the “healthiest horse in the glue factory,” an ugly truth not inconsistent with the broader observation by Joffe (who, we should note, also frets about deficits) that “only the United States can bring down the United States.”

But an even more profound menace to this country’s future may come from a transformation that owes little to foreign plotting or domestic excess and quite a bit to free trade, free enterprise, and technological progress, features — rightly applauded by Joffe — of the American system that have done so much to make the country what it is today. That America’s generosity and optimism, in the form of an immigration policy — nuttily cheered on by a Joffe still in thrall to ancient Ellis Island myth — may make things even worse only sharpens the irony still further.

The exceptional nation has undeniably been exceptionally successful. Yes, America is an idea and a dream and all that, but above all, it has worked. As Joffe recounts, there have been busts, panics, and slumps, but overall this has truly been a land of opportunity. The result has been a nation held together in no small part by the shared belief that a better life is there for the taking by those who work hard, a belief fed by the fact that it was true enough for enough people for enough of the time, a belief that may now be becoming a delusion.

Inflation-adjusted household median income has yet to return to its 1999 peak — 14 years ago, in case anyone is counting — and now stands at only a fraction more than the level a decade before that, a stagnation that cannot (despite some wishful thinking to the contrary) be explained away by changes in household size. It is no coincidence that the percentage of Americans in work also peaked around the turn of the century, before going into a decline that the Great Recession has only intensified: Work-force participation is back to levels last seen in the disco era, a regression with ominous ramifications for the sustainability of Social Security, Medicare, and all the rest.

The tentative nature of the current recovery, and its particular shape — hiring at the top and bottom of the wage scale has picked up, in the middle not so much — look a lot like yet more evidence that happy days will not be here again for the American middle class anytime soon. Its labor is simply not as valuable as it was. As technology gets ever smarter, and as workers in lower-cost emerging markets upgrade their skills, opportunities will narrow in the office suite as well as on the factory floor, squeezing cleverer, well-educated Americans of a type who have only rarely been squeezed before. And they won’t like it one bit.

In his fascinating and, in its implications, terrifying new book, Average Is Over, economist Tyler Cowen surveys this scene and predicts the arrival of a “hyper-meritocracy” in which a comparatively small segment (maybe 10 to 15 percent) of the population does extremely well, most people eke their way along, and there are few in the middle: a vision that may be exaggerated, but not by enough to save what’s left of Bedford Falls.

Unlike many apocalypticians, Cowen has room for a little relief (of sorts). He accepts that there will be “some outbursts of trouble” but anticipates a future that is “downright orderly.” The country will be older, and shared pride in America’s leading position in the world (Joffe would not disagree) will throw additional social cement into the mix, while “cheap fun” distracts the potentially restless.

“Revolts,” writes Joffe, “are the hardest part of the soothsaying business.” I’m not so sure. Smashed expectations, a large cohort of well-educated (and often young) underemployed, high numbers of unemployed men looking for work in factories that no longer exist, ethnic and cultural fragmentation (the last apparently not a concern to Joffe or Cowen, immigration enthusiasts both), and the window that the Internet provides into the lives of the rich are a recipe for disorder that it will take more than Grand Theft Auto to head off.

And the increasing emphasis on growing inequality (the inequality is real enough, but it is a symptom, not a cause, of middle-class woes) in today’s political debate — from Occupy to Obama — is characteristic of a society in which the focus has shifted away from growing the pie to slicing it up. That’s a harbinger of a crisis within the American model, and, I suspect, an early taste of an Argentinian future to come.

Joffe dismisses a mid-’90s prediction of a coming automated dystopia as “a stew of Malthus and Marx.” He would be unlikely to be much kinder about Cowen’s Skynet lite. That’s a mistake. The clouds aren’t clearing. They are getting darker.

Himself Alone

Curzio Malaparte: The Skin

The New Criterion, October 1, 2013

Navigating the first half of Italy’s twentieth century took elasticity. There were few more elastic than Curzio Malaparte (1898–1957), author of The Skin (La Pelle, 1949), a novel of which the first English-language “unexpurgated” version is being released by New York Review Books this autumn. Malaparte was a fascist, and then he was not. He flirted with Communism, and then he did not. A Protestant by baptism, an atheist by choice, he converted to Catholicism on his deathbed, but in his will left the house he had built on Capri, the most beautiful in the world some said—his “Casa Come Me”—to the Chinese Communist Party. He was a writer of great, if unreliable, talent. He was a soldier, diplomat, moviemaker, duelist, agitator, provocateur, and dandy. A famously successful Romeo, he truly loved only his dog Febo and, above all, himself.

The Skin is a strange, uneven, and baroque creation, a fabulist memoir, a surrealist fiction based on fact and anything but. It is a terrifying, occasionally hallucinatory and weirdly arch depiction of an Italy devastated by war and moral catastrophe. Added by the Vatican—no mean judge of a good read—to its Index Librorum Prohibitorum, The Skin works well enough on its own merits, but to accept it at face value is to be beguiled by a mask.

To hazard a guess at what lies beneath involves peering into the evasive Malaparte’s bewildering, and frequently rewritten, career, and being prepared to risk being led hopelessly astray. Bear with me: this is going to take a while.

Born Kurt Erich Suckert, to a German father and Italian mother, he changed his name in 1925 to something more in keeping with Mussolini-era Italianizzazione. It was one of the more straightforward maneuvers in a life of transformation and disguise, but it came with a characteristically perverse wrinkle, with Malaparte (“bad side”) a spin on the Bonaparte already taken by another, more illustrious, narcissist.

We catch our first glimpses of him: a precocious schoolboy, some early writings, and what the British once called a “good war,” enlisting with the French in 1914 and then, after Italy signed up for Armageddon, joining his (maternal) home team. Next came stints as a diplomat, at the Versailles Conference and in Warsaw. But it was his first book, La Rivolta dei Santi Maledetti (1921), a sympathetic account of the mutiny that followed the Italian defeat at Caporetto (1917), that, in its rage at the old order, paved the way for what was to come. While Malaparte was—to the extent that any ideological labels apply (“anarcho-fascist” has been one brave shot)—a man of the “right,” it was the drama of revolution that appealed to him more.

And that’s what Mussolini appeared to offer. With the help of some fiery books, energetic journalism, and a spot of more sinister activity, Malaparte worked his way into a leading role among Italy’s fascist intelligentsia. Then the story starts to cloud. Malaparte was drawn to power, but he was too restless, too self-involved to play its games with the discipline that they require. Frustrated by Mussolini’s failure to unleash the social upheaval that had once seemed possible (and making no secret of that frustration), Malaparte drifted away from the regime’s center, but not too far: He was appointed editor-in-chief of La Stampa in 1929, only to lose the job a year or so later, for reasons that remain unclear. But the myth—assiduously promoted by Malaparte in the postwar years—that he was already slipping into outright opposition to fascism is nonsense, brutally debunked by Maurizio Serra, author of the invaluable and sternly forensic Malaparte, Vies et Légendes (2011), the finest biography of the writer to date.

Nor did Malaparte’s 1933 conviction for defaming one of the fascist leadership represent a definitive break with Mussolini. The offending letters were a clumsy power play gone wrong, nothing more. Malaparte’s punishment—five years’ confino on an island just off Sicily—turned out to be rather less than he would subsequently maintain. Thanks to Galeazzo Ciano, Mussolini’s son-in-law, another of his useful friends, the sentence was eased, reduced, and, after some eighteen months, commuted. Throughout it, Malaparte wrote for the Corriere della Sera, albeit under a pseudonym—some proprieties had to be observed. A Yevtushenko (if even that) rather than a Solzhenitsyn, he resumed his career on his release. In addition to journalism, there were a number of books, and in 1937 he founded Prospettive, an initially propagandist publication that evolved into a magazine of the arts offering a modernist, outward-looking reminder that there remained some room in fascist Italy for more than jackboots and bombast. Breton, Eliot, Éluard, Garcia Lorca, Joyce, MacLeish, and Pound were amongst those who made their appearances in pages that, as Serra notes, almost never featured contemporary German writers, one of the many rebellions with which this revealingly incomplete opportunist punctuated his career.

When Europe caught fire, Malaparte was recalled to his old Alpine Division and appointed a war correspondent, in the catbird seat to view the inferno that fueled his best work. Il Sole è Cieco (1947), the first, in terms of the period it covered, of his books on World War II, is an unexpectedly lyrical piece based loosely—of course—on a couple of weeks in the mountains spent with the Alpini as they half-heartedly campaigned against the French in June 1940. Twelve months later, after travels that included a grubby detour in Athens writing articles intended to help prepare the Italian public for the invasion of Greece, Malaparte was tacking along with the Germans, reporting for the Corriere della Sera as the Wehrmacht swept out of Romania and into the Soviet Union.

Presciently—and unfashionably—enough, Malaparte described the Soviets as tough opponents, even in retreat, something that he claimed (later) had led to him being expelled from the war zone in September 1941 “by order of Goebbels” (no less!), a tale that, like so many of his confections, crumbles under closer inspection. After a break in Italy (no, not “house arrest”), he returned to what he thought would be a more congenial sector of the Eastern Front, the stretch controlled by Germany’s Finnish allies. After some rewriting to restore what had (supposedly) been lost to the censor, a selection of his dispatches from the Eastern Front was first published in book form in 1943—after the fall of Mussolini—as The Volga Rises in Europe (Il Volga nasce in Europa), a volume that is, much more so than the far better-known Kaputt (1944), his greatest work.

Vivid, haunting, and elegiac, the book ranges from descriptions of the summer Blitzkrieg pouring into the Ukraine, to the snow and silence of Karelian forests from which isolated Finnish outposts overlook besieged Leningrad, and skirmishes evoke the “primitive ferocity” of ancient war: “wholly physical, wholly instinctive, wholly ruthless.” There are also stirrings here—visiting remnants of the Czarist bourgeoisie clinging onto shreds of the old ways in Moldavia, an excursion to the deserted house in Kuokkala where the Russian painter Ilya Repin had lived—of the lament for a broken European civilization that emerged as a major theme in Kaputt and The Skin, and, difficult as it is to reconcile with his past enthusiasm for an upending of the social order, became another twist in this writer’s labyrinth of contradiction and ambiguity.

Malaparte may—true to form—have spent considerably less time at the front than he implied, but a good part of what makes The Volga Rises in Europe more compelling than Kaputt or The Skin is the debt it owes to the discipline of journalism. The prose is spare, the stories brief, telling snapshots of moments that may once have even been real. By contrast Kaputt and The Skin, bestsellers both, are sprawling, fragmented, astounding epics of hideous accuracy, exaggeration, and deception, “novels” where fact merges with fiction, and where lies tell a truth that Europe was just beginning to grasp. Recounted in the first person by a “Malaparte” who is both fictional and not, they were also designed to distance their author from his fascist past (a pressing necessity by 1944) whilst (in Kaputt) also trumpeting his presence—witty, sardonic, superior—in the center of the Axis’ nightmare world. This was a tricky maneuver, but unavoidable if he was to be able to demonstrate that he—his best, his only hero—mattered: Malaparte had peered into the abyss and found it filled with mirrors.

His “horribly gay and gruesome” Kaputt is—as Malaparte recognized—a far stronger work than The Skin. He spent 1941–43 close enough to the heart of darkness to recognize it for what it was. Based however remotely on his experiences during this time, Kaputt is savage, sensual, and brilliant, decadent, revolting, and beautiful. It is filled with black humor, narcissism, self-conscious erudition, and embarrassing snobbery. There is champagne as well as carnage, Proust as well as Goya, a jarring mix that sometimes reinforces the sense of cataclysm or is sometimes just crass. Horrors are layered upon horrors, but in a way that not infrequently suggests that they are being deployed to showcase his formidable descriptive powers, an aestheticization of barbarity that underlines Malaparte’s icy detachment, a detachment that this most flawed of chameleons does not always bother to conceal.

Asked by a delegation of Jews in the Rumanian city of Jassy (Iasi) if he can intercede with the military to head off the pogrom that they rightly fear is imminent, Malaparte (no anti-Semite in fact or fiction) sets off, well into the next afternoon, on what he believes to be a hopeless trudge to see the relevant officers, only to pause to inspect a statue, and then head in the direction of the local bigwigs’ club to discuss poetry. He never even manages that, but instead takes a turn towards a cemetery for a nap. He wakes at sunset to the sound of a Soviet bombing raid, and heads off to see a sixteen-year-old waitress, Marioara, for whom he feels, well, it’s hard to say. A few hours later the pogrom begins.

In reality, Malaparte arrived in Jassy shortly after the slaughter. That didn’t stop him painting a sickening picture of the pogrom and of an aftermath that pointed to the hecatombs to come. A writer looking to walk away from an Axis-tainted past might have been expected to take the opportunity to present himself in a nobler light. And yet Malaparte does not. It says something too that, while living in France in the, for him, not uncomplicated late 1940s, Malaparte sent a portion of his royalties from Kaputt to Céline, the collaborationist French author, and notorious anti-Semite, then living in uncomfortable exile in Denmark. Whatever one might think of that gesture, it was not the act of the shape-shifter that Malaparte was so often said to be. According to Serra the two had never even met.

Kaputt ends with Malaparte’s arrival in Naples after the fall of Mussolini in July 1943, and his own brief detention by Italy’s new government. The Skin opens some time later, with Malaparte installed (as indeed, in another impressive twist to his resume, eventually he was) as a liaison officer with the U.S. army in that same occupied, liberated, humiliated, degraded, and anarchic city. Most of the book describes “Malaparte’s” stay there before a coda tracking his journey north with the Allies to Rome and beyond.

To its detriment, The Skin is more didactic than Kaputt. Its portrait of Italy’s moral, political, and physical ruin is bloated by a blend of pacifism and high school nihilism that is crude stuff after the detachment and elegant disenchantment of the earlier book. It concludes with the muttered observation that “it is a shameful thing to win a war,” an observation that is as wrong-headed (it rather depends on who is doing the winning) as it is overwrought.

And the latter adjective will do quite well to describe much of The Skin. The bizarre, sometimes surreal, interludes dotted through Kaputt work because they are interludes, whereas in The Skin (in which the freak show includes an outlandish “Uranian” rite, talking fetuses, dead soldiers on parade, and a feast with cannibalistic and mythic elements) they come close to overwhelming the book, and somehow undercut the sense of apocalypse that the British writer Norman Lewis, who was in the same city at the same time, conveyed so calmly and so effectively in Naples ’44. Worse, they fuel the suspicion—this is a book with longueurs unimaginable in Kaputt—that Malaparte was either running out of new things to say or, more cynically, that he had not much interest in doing so. Kaputt’s sales had not been hurt, to put it mildly, by its author’s emphasis on the cruel, the macabre, and the grotesque, so why not repeat the trick, only more so? But too often more turns out to be less, too rococo, too much. That’s not to argue that The Skin is without sequences of remarkable power and extraordinary beauty. It has those, but it is telling that one of its most memorable passages (a characteristic Malaparte set-piece) describes his discovery of a Ukrainian road lined with trees on which Jews have been crucified, an out-of-place digression that, even allowing for the herky-jerky chronology of both books, reads as if it was left over from a draft of Kaputt—an atrocity surplus to requirements.

As with Kaputt, it is what The Skin adds to the understanding of its elusive author that makes for some of its most intriguing moments, whether it be the mocking condescension with which he views African-American GIs, or the peculiar obsession with homosexuality, something found elsewhere in his writing, which may suggest that Malaparte wore at least one mask that he was never prepared to recognize, let alone remove.

Above all, there is a delightful scene—as so often in these books revolving around a meal (well, he was Italian)—in which the trickster plays games with his own reputation. He is eating couscous with a group of French officers just before the final advance on Rome, one of whom teasingly comments that “judging by Kaputt, Malaparte eats nothing but nightingales’ hearts . . . at the tables of Royal Highnesses, duchesses, and ambassadors.” If their “humble camp meal” is to make it into Malaparte’s next book, it will have to be reinvented into an infinitely grander occasion. That leads to a more general discussion as to the truth or otherwise of what is found in Kaputt, to which Malaparte’s American colleague, Jack, eventually responds that “It is of no importance whether what Malaparte relates is true or false . . . the question is whether or not his work is art.” Malaparte then discloses that, unwilling to break up “such a pleasant luncheon,” he had “nibbled” his way through the hand of one of the French goumiers, blown off by an earlier grenade, only to end up, he had discovered, in the couscous. The French are appalled. Malaparte subsequently explains to a delighted Jack how he had arranged some ram’s bones on his plate to look like the remnants of a hand. It was left to readers more observant than me to work out that Kaputt did not appear until several months after the liberation of Rome. Malaparte’s story about his lying could never have been other than a lie. How the ghost of Laurence Sterne must have laughed.

Perhaps it was inevitable that Malaparte’s later peacetime years were, with his war books behind him (Mamma Marcia, arguably a fifth, unfinished, was published after his death), creatively something of an anti-climax, distinguished mainly, and tellingly, by an examination of the aftermath of that conflict, Il Cristo Proibito (1951), a well-received movie that he both wrote and directed. He was, for the most part, rehabilitated, but never entirely trusted, and the perception that his charlatanry extended into his art as well as his politics meant that he was never able to regain quite the prominence he had once enjoyed. In 1956, still, despite everything, tempted by the hard men, he traveled to the China of Chairman Mao, and decided that he liked what he saw, but his Chinese doctors did not like what they saw in him: Malaparte was diagnosed with lung cancer. They did what they could (leaving his house to the Chinese Communist Party was partly Malaparte’s thanks for the care he had received), but there was nothing to be done.

He returned to Italy to choreograph the death-bed drama that was, writes Serra, his last masterpiece, wooed by right, left, and the Vatican alike, each eager to claim his scalp for its own. The Communists sent him a party card, but he neither acknowledged nor repudiated it, preferring instead to reaffirm his membership of the centrist Republican Party. And yes, he did indeed, finally, convert to Roman Catholicism, if I had to guess, a Malapartian hedge, gaming God on the basis of a hint from Pascal, but a dramatic switch nevertheless, the last, critics might jeer, in a long turncoat career.

But that’s too simplistic: The only colors he really wore were his own.

Wrong Place, Wrong Time

Prit Buttar: Between Giants - The Battle for the Baltics in World War II

The Wall Street Journal, August 15, 2013

The finest English-language portrayal of the fate that came calling for the Baltic States in 1939 is William Palmer’s “The Good Republic,” a short novel written on the eve of the breakup of the U.S.S.R. that evokes both the horror that engulfed these nations and the monstrous dilemmas that the war left in its wake. Early in its pages, an aging émigré, back in his homeland after nearly 50 years, ruefully remembers how his (unnamed) Baltic country had, for a while, led “a charmed life . . . between mad giants.” That characterization is recalled in the title of Prit Buttar’s history of what happened when Nazi Germany and Soviet Russia carved up northeastern Europe between them before turning on each other.

The Nazi-Soviet pact of 1939 consigned the Baltic trio of Latvia, Lithuania and Estonia to Moscow’s sphere of influence. Mr. Buttar, a British physician and independent military historian, recounts how these three small countries were first forced to accept Soviet garrisons and then incorporated into the U.S.S.R. in August 1940 after elections that were as bogus as the choreographed “popular” revolutions that preceded them. The arrests, deportations and executions that followed were the standard Stalinist script.

When the Germans invaded the U.S.S.R. in June 1941, they quickly rid the Baltic States of their Soviet occupiers and were initially welcomed as liberators. This was an illusion that the countries’ Jews obviously didn’t share. Though Estonia had only a tiny Jewish minority, about 5% of the Latvian population (some 95,000 people) was of Jewish descent, as was around 9% of Lithuania’s (roughly 250,000). Most of these people were dead at the end of 1941, murdered by the Einsatzgruppen, German mobile killing squads.

The perception that the Jews had collaborated with Soviet rule reinforced older prejudice, and all too frequently Hitler’s butchers had local assistants. Mr. Buttar relates the dismal chronicle of the Baltic’s willing executioners with some skill, if, perhaps, with too little consideration of the way in which the Soviet destruction of the established political, economic and social order had eliminated the elements that might have put some brake on the descent into atrocity.

The danse macabre of ethnicity and ideology didn’t stop there. Had the Germans so chosen, they could have restored a measure of self-determination to the Baltic States and bought some strategically useful loyalty. But Hitler had other plans for the region. In his Teutonic take on manifest destiny, the indigenous populations, even purged of the Jews, offered little more than prospective labor for the greater German good.

As the Red Army pushed back and then west, though, the Reich’s leadership began to view the Baltic nations as a source not just of auxiliaries but of front-line troops. Latvian and Estonian formations were established within the Waffen-SS and fought in battles on the Eastern Front. Some of these recruits were true believers in the Third Reich, and some were simply opportunists. But a good number—knowing what the return of Soviet power would mean—signed up in the belief that they were choosing the lesser of two evils, their countries’ last hope, however remote. Others were the conscripts of any war, young men in the wrong place at the wrong time.

Mr. Buttar neither judges nor whitewashes these soldiers. But after going through his carefully balanced account of the predicament in which Balts found themselves in those years, readers will find it easier to understand why today’s reunions of Baltic Waffen-SS veterans, which include an annual parade through Riga, the Latvian capital, trigger not only outrage but also a degree of local approval.

The Red Army re-invaded the Baltic States in 1944 and in a sequence of brutal autumn battles evicted the Germans from Estonia and Lithuania. Several hundred thousand troops were cut off in Latvia’s “Courland Pocket” and continued fighting until war’s end in May 1945. Mr. Buttar is himself an army veteran, and it is from the military perspective that he relates the savage unraveling of the Baltic world during World War II’s last year. There’s plenty here on weaponry, on tactics and strategy, on the movement of units—and, as so often in volumes of this type, on who won what decorations for what actions. Thus we are told that in January 1945 the soldiers holding out with desperate effectiveness against the Soviets were each “awarded a ‘Kurland’ badge or armband.” But what conditions were truly like in that cutoff redoubt has largely to be guessed from glimpses of exhausted men, references to continuous fighting and laconic details of “increasingly meaningless” battles fought on until the fall of the Reich many months later.

The Soviet “liberation” of the Baltic States, and their postwar reabsorption within the U.S.S.R., restarted the cruel machinery of Stalinist repression on an even more hideous scale than before. Unlike in 1940, however, tens of thousands of Balts took to the forests and staged a lonely epic of defiance often overlooked by historians. To his credit, Mr. Buttar takes his story through the postwar period. Partisan activity peaked in the mid- to late 1940s but was severely hampered by a wave of mass deportations—over 90,000 Balts were sent to Siberia in 1949. Despite this blow to its base, the resistance struggled on, outnumbered and outgunned, well into the next decade. They were hoping for effective Western support. It never turned up.

Our Climate-Change Cathedral

Rupert Darwall: The Age of Global Warming - A History

National Review Online, July 27, 2013

A 19th-century Scottish journalist, songwriter and poet is not an obvious guide to a 21st-century intellectual and political phenomenon, but when it comes to making sense of climate-change zealotry, there are worse choices than Charles Mackay (1812–89), the author of Extraordinary Popular Delusions and the Madness of Crowds (1841), an acerbic, often drily amusing study of the frenzies — from witch mania to the tulip bubble — that regularly possess our supposedly sophisticated species.

“In reading the history of nations,” wrote Mackay, “we find that whole communities suddenly fix their minds upon one object, and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion and run after it.” One recurrent fantasy, he jeered, was that the last trumpet is ready to sound: “An epidemic terror of the end of the world has several times spread.”

This is not — exactly — to categorize alarm over the impact of anthropogenic global warming (AGW) as just another of these prophecies of doom. The notion that a sharp, man-made increase in emissions of carbon dioxide and other greenhouse gases could have a significant effect on the climate is infinitely more soundly based than, say, the dodgy math of a Mayan apocalypse, but that — by itself — is not enough to explain why global warming has so evidently turned out to be the right fear at the right time. To learn more about that, The Age of Global Warming: A History, an intriguing new book (released in the U.K. in March) by the British writer Rupert Darwall (full disclosure: an old friend), is a good place to turn, but read some Mackay first.

To Darwall, “the science [of global warming] is weak, but the idea is strong.” He duly discusses some of the scientific controversies that have arisen, but the underlying objection to today’s scientific consensus on AGW set out in his book is more fundamental. Like Karl Popper, perhaps the last century’s most able philosopher of science, Darwall believes that the essence of a properly scientific theory is that it is falsifiable: “It should be capable of being tested against nature and therefore [potentially] refuted by evidence. . . . The more a theory states that certain things cannot happen, the stronger the theory is.” Put another way: What would it take to persuade believers in AGW or, more important, those concerned by what it could lead to, that they are mistaken? The answer is — let’s be polite — unclear.

If it is not possible to construct a Popper-proof proof of a link between the rise in CO2 (and other greenhouse-gas) emissions and the (now, ahem, paused) increase in the planet’s temperature, then those who believe that there is such a connection are forced to rely on what is effectively a continuous poll of scientific opinion over what the data might mean. It is from this process that the much-cited consensus has emerged. That’s not as unreasonable as Darwall might think, but it is second-best science. And when, as Darwall rightly maintains, it has been tainted by the political importance of maintaining a consensus (and the consequent delegitimization of debate) it ends up as something even less than that.

But even those convinced of the reality of AGW — and the danger it could pose — should find Darwall’s book a fascinating, if uncomfortable, history of climate change as a political and intellectual phenomenon. Those who want to focus on detailed scientific debate would do better to look elsewhere, as would those itching for a rant. There are some clever, occasionally lethal, jibes scattered throughout The Age of Global Warming, but Darwall’s work is no noisy polemic. It is calmly forensic — and deeply disturbing.

Inevitably, Darwall is unable to resist mentioning earlier doomsayers who got it spectacularly wrong. These include old Thomas Malthus, the Nixon era’s Club of Rome, and William Stanley Jevons (1835–82), a genuinely brilliant English economist whose best-selling The Coal Question (1865) warned that Britain was going to run out of the coal on which its economy depended. He predicted that by 1961 it would need to produce a colossal 2.2 billion metric tons a year. By the time that 1961 actually showed up, Britain’s annual coal consumption was running at less than 10 percent of that figure: Somehow the country continued to function. To be sure, the failure of these particular forecasts does not prove that all predictions are nonsense, but it is a vivid demonstration of the need for intellectual humility and, more specifically, of the perils of extrapolation. We cannot know how human ingenuity, chance, or simply the passage of time will change what once seemed so certain. We can, of course, do our best to anticipate what is to come, but in the end, it is only a guess.

The British economist Nicholas Stern, author of the 2006 report that did so much to shackle his unfortunate country to a fundamentalist view of AGW — and what to do about it — took a rather more robust approach. He carried out a cost-benefit analysis of the problem of climate change (something that, outside the U.S., few had bothered to do), but his report’s sometimes controversial methodology had room (as Darwall records) for assumptions that ran up to 800 years in the future, a distance across time that might have made even Nostradamus hesitate. No matter; the U.K.’s establishment found Stern’s work compelling, useful, or both.

Others have been won over by a more atavistic dread. There’s no doubt that one element in the mosaic of AGW panic is a continuation of the ancient anxiety that something — food, say, or water or fuel — will run out, an anxiety created by millennia of human survival at the edge of subsistence, an anxiety that, even now, need not always be unjustified.

Another important ingredient finds its origins in thinking that developed in response to 19th-century industrialization. Romantics fretted that accelerating technological progress was taking man ever further from an imagined Arcadian idyll. Harder-headed sorts worried that the fruits of capitalism were a threat to existing social, financial, political, and religious hierarchies. To read Darwall’s deadpan account of the sometimes lunatic proto-environmentalism of the first half of the 20th century is to be reminded that today’s greenery has profoundly reactionary roots.

The old, Marx-pocked Left traditionally took a very different approach. As Darwall explains, its view of man’s relationship with nature was essentially promethean. The planet was there to be mastered by science and the proletariat. The radiant future would be secured not by the bucolic values of an Eden that never was, but by technological progress. It was only when the failure of the Communist experiment became too obvious to be ignored by its Western sympathizers that the opponents of capitalism looked for another banner around which to rally. Red shaded into green, a shift — boosted by the likes of Herbert Marcuse — that Darwall correctly sees as a key moment in the growth of environmentalism as a political force.

That the evolving environmental narrative fit in so well with currents found running through many spiritual traditions — an aspect of this saga on which Darwall could have focused more attention — also did not hurt. A tale of flawed, fallen, wasteful humanity needing to be led by an enlightened elite (step forward, Al Gore!) back to the austere path of righteousness, wisdom, sacrifice, and restraint has a clear religious resonance, as does the often apocalyptic language of environmentalist discourse and the furious reaction of some of the faithful to any dissent or, to use a more appropriate word, heresy.

And then, of course, there is Charles Mackay’s inconvenient truth: The end of the world has long been good box office.

Mix these elements together and then throw in the warming trend seen in the last quarter of the 20th century and it becomes easier to understand why, once the moment came, AGW won so much acceptance so quickly. Borrowing from an observation made by the British philosopher and mathematician A. N. Whitehead (1861–1947), Darwall argues that an idea “works slowly before mankind suddenly finds it embodied in the world. It builds cathedrals before the workmen have moved a stone. So it [was] with global warming.” Environmentalists were already predisposed to believe the worst about what hydrocarbons could do.

It was not only the intellectual infrastructure that was in place. Darwall shows how a small, curiously influential group of the unelected — including the annoying Canadian Maurice Strong (the “international man of mystery” of an old National Review cover story) and Barbara Ward, a pushy, devoutly Roman Catholic, devoutly left-wing former foreign editor of The Economist — had been working to drive the environment up the international agenda since the 1960s. These were typically cleverer-than-thou command-and-control sorts, sometimes, tellingly, with a touch of the mystic about them (Fritz “Small Is Beautiful” Schumacher included astrology in his large collection of spiritual enthusiasms). They truly trembled for the environment (by the early 1970s, Ward was predicting that we’d be pretty lucky to make it to 2000), but they also saw environmentalism as a gateway through which technocratic controls could pour. Better still, the fact that environmental problems often seep across national borders could be used as an argument for supranational regulation, something that fit in nicely with their vision of a world increasingly run from Turtle Bay, by — pass the Dom Pérignon — people very much like themselves.

Darwall recounts how, starting with a 1972 shindig in Stockholm, U.N. environmental conferences were convened. (He has kind words for the chlorofluorocarbon-bashing 1987 Montreal Protocol.) Above all, the concept of “sustainable development” was turned into a device that could be used to head off objections from Third World nations that Western environmentalism would stand in the way of their own badly needed industrialization. As Darwall describes this convenient “political fiction,” it was based on the thesis that “economic growth was . . . double-edged. When rich countries got richer, it harmed the environment; when poor countries grew, the environment benefitted.” To be fair, that’s marginally — marginally — less absurd than it sounds, but in any event it did the trick. As the 1980s partied on (environmentalism has tended to flourish in prosperous times), grand reports (Brandt, Brundtland) were written and institutional mechanisms — national, supranational, NGO — were put in place to help greenery along.

When AGW — with its blood-curdling new angle on the dire consequences of man’s excess — arrived on the scene, the natural response by many in the environmentalist community was to see it as a fresh stick with which to whip humanity into line. Official concern over AGW finally crystallized in 1988, thanks primarily to the efforts of NASA’s James Hansen and a supporting cast that included, of all people, Margaret Thatcher, filled with hubris and pride in herself as a scientist. All was set for the climate-change circus to hit the road, and it did so at a speed that showed how well the way had been paved. Other politicians jumped on board, joined in due course by big business playing the usual corporatist game. Less than four years later the 1992 Rio Earth Summit had been held, and the U.N. Framework Convention on Climate Change put in place. Darwall notes, albeit with some exaggeration, “After Rio, debating the science of global warming became superfluous. Politics had settled the science.”

The route the circus took from Rio to Kyoto (1997) to Bali (2007) and to Copenhagen (2009) is detailed by Darwall, a meticulous and occasionally caustic chronicler with a sharp eye for the intricate political and diplomatic maneuvering that this journey has involved.

But, as Darwall points out, warnings of climate disaster came with a catch: The helpful idea that economic growth in the Third World was benign could not — for AGW mavens — coexist with the inconvenient reality of surging greenhouse-gas emissions from some emerging economies. The climate-change jamboree held in Copenhagen was designed to resolve this contradiction. The ultimate objective was to extend the Kyoto concept of binding obligations onto the United States and, crucially, growing industrial powers such as China and India. For all practical purposes, it got nowhere.

In what Darwall sees as a reflection of the diminishing clout of the West, New Delhi and Beijing stuck to their chimneys. As a result, the Obama administration declined to agree to a deal. The EU was left humiliated and without the broad, binding treaty its leadership craved. Its only consolation was that there was (just) enough in the mealy-mouthed final Copenhagen Accord to, in Darwall’s words, “keep the whole negotiating process going on indefinitely and provide cover for European governments to continue with their global warming policies.” President Obama has, of course, recently signaled that he still wants to push the U.S. in a similar direction.

And so the jihad against AGW will likely lurch along, regardless of India and China, regardless of the uncertainties that dog the science, and regardless of the obvious stupidity and astonishing expense of some of the policies (we could start with biofuels, but Darwall offers up plenty more to choose from) that it has set in motion. It has become too big to fail.

But even if this effort is one day abandoned, Darwall suspects that the Western mind would fill the gap that it leaves behind by dreaming up yet another environmental crisis that can be avoided only by crippling the modern industrial economy.

The end of the world, it appears, will always be with us.

Landscape After

Marci Shore: The Taste of Ashes - The Afterlife of Totalitarianism in Eastern Europe

National Review, April 3, 2013 (April 22, 2013 Issue)

Warsaw, Poland, May 1998 © Andrew Stuttaford

The new dawn that broke over Eastern Europe in 1989 was bright, but the landscape it illuminated was exhausted, a territory of shadows and regret, where hope was jostled by apprehension and old demons stirred. To read some of the accounts of that time and that place is to confront sadness unexpected after the jubilation on the Wall, the Hungarian border, or Wenceslas Square — a melancholy echoed in Marci Shore’s beautifully written, discursive, and by her own admission “deeply subjective” new memoir, a work of well-told history and perceptive reporting that is both less than its title promises and rather more. Either way, it could have done with an index.

Don’t read The Taste of Ashes expecting a survey that covers all of the old Eastern Europe. With the exception of Romania (does that count?), the Balkans do not really feature, nor do Hungary and the former East Germany. There’s a brief excursion to Lithuania, but the other Baltics and the rest of the Soviet far west are notable only by their omission. This is a volume centered on Poland, the Czech Republic, and Slovakia, but quite a bit of what Shore discovered there could easily have been found elsewhere in the region. Thus, in early 1995, she returns to Prague and visits the house of an apolitical elderly couple, all too typical of a generation throughout Eastern Europe whose lives had been impoverished by Communism, but ruined by its fall:

The Velvet Revolution had brought freedoms they had no use for, and in any case had not the money to enjoy. Their whole adult lives they had worked under the Communist regime, and that regime had promised they would be cared for in their old age. Now the social contract had been broken. For their generation the revolution had come too late. For Pan Prokop and Paní Prokopová, it would have been better had it not come at all.

That, of course, assumes that — had it survived — the crumbling economy of Communist Czechoslovakia would have been in a position to deliver on those undertakings, something that is by no means certain.

Shore is an associate professor of history at Yale, a Generation X intellectual of somewhat progressive hue. Thus it may not be surprising that her description of her time in the Eastern Europe of the “post-Communist moment” — doubtless further skewed, in a form of confirmation bias, by the views and experiences of those with whom she chose to associate — comes with a sigh of disappointment. History failed again. The rise of the philosopher king, Vaclav Havel, was not accompanied by the rise of a philosopher people.

There’s prim tut-tutting about the profusion of pornography, and, more justifiably, about the increase in crime and the persistence of the inertia, passivity, and conformity of the “realm of the not possible” that was so much of the Communist state. There is little about the revival in free enterprise, but plenty on the resurgence in national tensions. Shore spent time “working as an intern for an ethnic-conflict project at an American-funded research institute.” Ex-Yugoslavia was in flames, and other long-suppressed conflicts had reemerged into the space that Moscow had once policed. In Romania, Shore investigated tensions between ethnic Hungarians and ethnic Romanians. Fair enough, interesting enough, but all this risks giving an unbalanced impression of a region where most just wanted their lives to be “normal,” an adjective that Shore happens to hear used by a Romanian politician, but that was voiced often in Eastern Europe in those days. When the old regimes fell, achieving a “normality” defined in largely Western terms was a widespread objective.

This idea is reflected in a clever phrase deployed by Shore to describe 1989’s upheavals: “Time, seemingly halted for so long, suddenly leaped forward.” And if the results of the leap have been uneven, they have still been impressive: “To tens of millions of East Europeans the end of Communism brought countless good things — above all a freedom the vast majority of people never imagined that they would live long enough to see.”

But in one sense, time did count, and it counted very much. Shore trains her historian’s eye on the impact of the Communist years, with a keen focus on the telling detail and defining atrocity. And she takes a longer view than most. The rise of the red flag over what her husband, Yale professor Timothy Snyder, dubbed the “bloodlands” in his magisterial book of the same name cannot, she correctly stresses, be seen in isolation. The subtitle of Shore’s book refers not just to Communism but to totalitarianism. In her view, the different stages in the evolution of Eastern European Communism must be read as links in a chain that stretches back to World War II, Nazi occupation, and the Holocaust and, before that, to the Depression, the rise of Fascism, and even to “the dizzying possibilities of the 1920s.”

It is widely recognized that war and Nazi misrule were critical in clearing a political, military, and (perversely) moral space for Eastern European Stalinism, and the link between the war and the troubled decade that preceded it is hardly a secret. The connection to the “unhinging” 1920s is more novel. Shore uses the microcosm of a group of 20th-century Polish poets as a window into the revolutionary fervor that enveloped large sections of the European intelligentsia in a decade happier, luckier Americans remember for jazz and Al Jolson.

Those poets — many of them of Jewish descent — dominated her book Caviar and Ashes and crop up again in The Taste of Ashes. From the later 1920s onward, they exchanged the (to them) ultimately unbearable uncertainties of nihilism for the messianic determinism of the far left, a faith that they — and a number of their associates — were eventually, and crucially, to put at the disposal of Stalin’s Poland. That’s a timetable that suggests to me that the original sin from which the nightmares Shore describes were to flow was the Bolshevik Revolution of 1917. After all, it was Lenin’s blood-soaked millennial upheaval that frightened many into Fascism as a supposed last bulwark of civilization. And it was Lenin’s revolution that drove its followers worldwide into a Communist cult that became dangerously intertwined with Stalinism years before the alibi that was Auschwitz.

To attribute so much of the blame to Lenin fits awkwardly with the emphasis that Shore’s narrative places on Hitler. The Holocaust is justifiably central to our reading of Eastern Europe’s dark 20th century, but its role as Stalin’s enabler needs more nuance than Shore gives it. Equally, notwithstanding the pantomime anti-Semitism (a gargoyle hounding of phantoms) that still, shamefully, persists in these lands, the horrors of the Shoah are less critical — other than for the hideous absence that it left behind — to our understanding of the region today than Shore appears to suggest. In a book billed as wide-roaming, she devotes perhaps too much space to what is now, tragically, only a tiny, introspective, often conflicted minority of Polish Jews. A minority of a once slightly larger minority (the Communists arranged a final anti-Semitic purge in 1968), they stay put in a country that groups of visiting Jewish teenagers — there to mourn at the death camps — regard (Shore recalls) “as a cemetery.”

Theirs is a disturbing, compelling story, but it crowds out a broader discussion of the encounter with a Communist past in which so many Eastern Europeans were profoundly compromised and then, “in a world where all the rules had changed,” left exposed by the opening of files that were either devastatingly ambiguous or, worse, all too clear.

There could have been more too in this book on the appeal of totalizing ideology to so many intellectuals. It is a topic that obviously interests Shore (it surfaced in Caviar and Ashes), but she gives too little attention to a phenomenon that still endures, if more benignly, even in the attitude of those such as the former dissident and Velvet Revolutionary who opts out of the, yes, normal politics of the new era: “To be engaged” is, in his view, “to forgo clean hands,” an abdication that is itself a declaration of absolutist thinking. Shore notes that “dissidence . . . had often been born of communism”: Once a believer, always a believer — all that changes is in what. That dangerous thrill remains.

Shore concludes the book with a tale of meeting a “bright young” member of the Polish new Left, who thanks her for Caviar and Ashes — a work he regards as rehabilitating those Marxist intellectuals of seven decades ago. Shore contradicts him, explaining that their fate is a “tragedy.” “But I didn’t read it as a tragedy!” says the bright young man. “I read it as a romance.”

We have been warned.