Stumbling Down the Road to Hell

Ian Kershaw: Making Friends with Hitler

The New York Sun, December 2, 2004


Ian Kershaw is best known for "Hitler," his two-volume, definitive account of one of history's monsters. His new book, by contrast, deals with an irritating British nobleman who was at best a footnote, at worst a nonentity. In telling the strange, sad story of the lord who tried to befriend a Führer, Mr. Kershaw highlights the English ineptitude that was to prove so helpful to the German dictator throughout the 1930s. "Making Friends With Hitler" (The Penguin Press, 488 pages, $29.95) also comes with a disturbing contemporary resonance. In part it's a tale of people living in the comfort of Western democracy, but all too ready to excuse totalitarian savagery overseas in the interest of their own ideological obsessions. Those people still exist: Chomsky, Sarandon, Moore, take your pick.

The exhaustingly, and slightly repetitively, named Charles Stewart Henry Vane-Tempest-Stewart, the 7th Marquess of Londonderry, was born into immense wealth and an even larger sense of entitlement. He was also born too late. By the time he became a member of Parliament, the old aristocratic order was beginning to crumble, and by the time he returned home from the trenches of World War I, Britain was only a few years from its first Labour government.

Oblivious or uncaring, this self-important but not very talented aristocrat still felt high office was his right. The viceroyalty of India eluded his grasp, but in the end perseverance, connections, and aggressive entertaining produced their reward: In effect, Londonderry catered his way into the Cabinet, becoming Britain's Air Minister in 1931. As was said, a touch acidly, about one of his earlier, equally dubious, promotions, it was not possible to "use a man's hospitality and not give him a job."

Maybe, but the early 1930s were not the best time to put a mediocrity into such a role. As minister in charge of the air force he had somehow to reconcile Britain's security requirements with increasingly assertive demands from Germany for strategic parity. All this at a time when most Britons were still calling for disarmament and the exchequer was short of spare cash.

It was a task for which Londonderry was neither intellectually nor temperamentally equipped. As Mr. Kershaw explains, "having imbibed the aristocratic values of Victorian and Edwardian England" he was "totally unprepared for the rough, tough, world of the 1930s ... where the mailed fist and political thuggery were what counted."

But if he was unprepared, so was his country, and that parallel, I suspect, was Mr. Kershaw's point in choosing to make this minor figure the focus of such a major study. Mr. Kershaw treats Londonderry as a symbol of the failures of Britain's governing class; the story of his undeserved rise and precipitate fall is used to tell the wider tale of his country's disastrous failure to head off Hitler.

The problem is that Londonderry was not a particularly representative figure. While his story (which Mr. Kershaw, as one would expect, tells well) is of interest, it is as a curiosity more than anything else - "Believe It or Not" rather than "The Gathering Storm." This is a book for readers who enjoy the byways and the detours of history, and the tales of those who can be found there.

Those wanting a general account of British foreign policy in that "low dishonest decade" should thus look elsewhere. They will be frustrated by the amount of time Mr. Kershaw spends with Londonderry, a man who lost what little significance he had when he was fired, somewhat unfairly, from government. He then compounded his unimportance by alienating many of the few who could be bothered to pay him any attention.

Had Londonderry gone quietly into retirement, Mr. Kershaw would not have much to say, but instead the fallen minister began the freelance diplomacy that shattered what was left of his reputation. In the hands of a lesser historian, these efforts, designed to promote a more friendly relationship between the Third Reich and Britain, could have been caricatured as the acts of a Nazi sympathizer, even a potential Quisling. Mr. Kershaw recognizes that Londonderry's motives were patriotic and basically well intentioned.

Friendship between Britain and Germany was, this veteran of the Somme believed, essential if the tragedy of another Great War was to be avoided. This was very different from supporting Hitler, or working to establish some sinister New Order in the sceptr'd isle. Even the photographs that illustrate this book underline the distance between Londonderry and the gangsters he was attempting to cultivate: We see him, Savile Row immaculate, posing with Hitler, being entertained by Göring, alongside his houseguest von Ribbentrop. In each picture, this British aristocrat seems guarded, a little uneasy, a thoroughly decent chap not altogether comfortable with the rough company he is keeping.

Certainly some of Londonderry's effusions about Hitler's "tremendous successes" make for very queasy reading. But, to put this into better context, Mr. Kershaw could have included some discussion of the useful idiots who were, at the same time, busy proclaiming the birth of a new civilization in Stalin's slaughterhouse Soviet Union. By comparison with such apologists, Londonderry was relatively restrained in his praise of the dictator. He shared with them, however, their determination to give evil the benefit of every doubt. And like them he lacked much empathy with those unfortunate enough to live under totalitarianism.

We see this most strikingly in Londonderry's underwhelming response to the plight of Germany's Jews. To be sure, he shared in the clubland anti-Semitism of many of his class, but this was a far cry from sympathy for Nazi cruelty. It appears to have been enough to let him regard Hitler's relentlessly grinding pogrom primarily as bad PR, an unnecessary obstacle to the necessary friendship between Britain and Germany. The idea that such horrors might have been evidence of a regime so pathological it could be no more trusted abroad than at home seems not to have occurred to him until too late.

Fortunately, there were others who did understand - none more so than his cousin, Winston Churchill. Relations between the two became, apparently, a little strained.

Measuring Man

Charles Murray: Human Accomplishment

American Outlook, December 1, 2004


Did Charles Murray have a difficult time in high school? Judging by what he writes, when he writes, and how he writes, he’s someone who would not have enjoyed the conformist, unimaginative world of contemporary American secondary education. A controversialist who never knows when to stop, a math geek who understands what counts, Murray was probably jostled in the school yard, pushed about in the cafeteria, and, in that hallmark of intellectual independence, repeatedly hauled up in front of the principal. “Murray, don’t ever, ever argue with your teachers again.”

His best-known work, 1994’s The Bell Curve (co-written with Richard J. Herrnstein), triggered a spasm of denunciation, condemnation, and self-righteous indignation that an earlier heretic, the luckless Galileo, would have found all too familiar. It’s not necessary to agree with Murray and Herrnstein’s thesis to be struck by the nature of the criticism it generated, a carnival of vituperation where the language used, replete with keening cries of anathema and frenzied declarations of conformist piety, was more reminiscent of the deliberations of the Inquisition than any attempt at scientific discourse. The message? Suggestions that intelligence is an inherited characteristic are perilous and, if in any way associated with “race,” positively lethal.

So what, nine years later, has Murray gone and done? Indefatigable, delightfully tactless, and armored only with a thick cladding of protective statistics, America’s heretic has volunteered once more for the stake, this time as the author of a book that in essence argues that a wildly disproportionate part of mankind’s intellectual and cultural patrimony is the work of those reviled monsters, the “dead white males.” Will the man never learn?

Praising dead white males is bad enough, of course, but even if we put that grave offense to one side, it’s a sad reflection of the current intellectual climate to see that Murray’s belief in the possibility of making objective assessments of human achievement will likely be condemned as lunacy, and, worse still, as unacceptably—and archaically—“judgmental.” Seared by the inquisitorial fire last time, Murray tries to anticipate these objections with statistical method; taken in aggregate, he argues, the data cannot lie. It may be reasonable to disagree with the relative rankings of, to pick two of his greats, Michelangelo and Picasso, but not with the overall conclusion: “Now is a good time to stand back in admiration. What the human species is today it owes in astonishing degree to what was accomplished in just half a dozen centuries by the peoples of one small portion of the northwestern Eurasian land mass.”

But before any living white males are tempted to reach for brown shirts and chilled champagne, it’s important to recognize that Human Accomplishment is far from being a piece of ethnic cheerleading, nor is it any cause for Old World complacency. Always reliably gloomy, Murray warns, “it appears that Europe’s run is over. In another few hundred years, books will probably be exploring the reasons why some completely different part of the world became the locus of great human accomplishment.”

Murray’s method of reaching these conclusions is intriguing. To start with, he confines his examination of “accomplishment” to the sciences and the arts (some of them anyway; omissions include, dismayingly, architecture). That’s a little too narrow, in my judgment. There’s no room for the military, for example. In defense of that omission, Murray maintains that “putting ‘Defeated Hitler’ on the human résumé is too much like putting ‘beat my drug habit’ on a personal one,” but excluding the warriors and the warlords shuts out a Churchill or a Caesar, individuals who certainly ought to be found on any roll call of human genius. Governance and commerce are also eliminated. “Those achievements,” Murray avers, “are akin to paying the rent and putting food on the table, freeing Homo Sapiens to reach the heights within reach of the human mind and spirit—heights that are most visibly attained in the arts and sciences.”

There’s more than a touch of the ivory tower about Murray’s decision to restrict his investigation in this way, but it fits nicely with the aspirational message of Human Accomplishment: the arts and the sciences matter. More cynical folk will note that these areas of activity also lend themselves better than most to Murray’s approach. He writes,

After reviewing histories and chronologies of [commerce and governance], my judgment was that while it was possible to compile inventories of people and events, the compilations were unlikely to have either the face validity or the statistical reliability of the inventories for the arts and sciences. The process whereby commerce and governance have developed is too dissimilar from the process in the arts and sciences.

That’s true enough, and, more importantly, Murray’s relatively narrow focus doesn’t necessarily detract from the case he is trying to build. After all, successes in the arts and sciences are not only worthy aims in themselves: taken together, they represent an excellent proxy for the achievements of a particular society at a particular time.

Good proxy or not, it’s still jarring to read about “the statistical reliability of the inventories for the arts.” “Statistical reliability” is bean-counter speak, hardly the lofty language usually associated with an early Picasso or the glories of a Turner sunset. This helps explain why some readers’ initial reaction to the methodology at the heart of Human Accomplishment will lie somewhere between incredulity, astonishment, and laughter. Mind you, Murray’s methodology is unusual enough to raise an eyebrow or two regardless of any aesthetic considerations. Basically (and this is a gross oversimplification), what he has done is count the footnotes. He has gone through a large number of reference books dedicated to the history of the arts and the sciences, and kept a tally of references to a particular individual or event. After subjecting the data to various statistical adjustments, those accomplishments that feature in the most references are, he asserts, likely to represent the pinnacles of man’s achievement. In “recounting . . . accomplishment in the arts, sciences, and philosophy for the last 2,800 years,” there are, concludes Murray, 3,869 people “without whom the story is incomplete.”

And not 3,870? At first sight this technique appears absurd, little more than the mathematics of the lunatic asylum, but statistics is nothing if not a patient discipline, and Murray carefully explains his logic. As an example, he demonstrates how it works when applied to Western art. He begins with “a staple of undergraduate art courses, Art Through The Ages.” In its sixth edition, “Michelangelo has the highest total of page references and examples of works devoted to him, more than twice the number devoted to either Picasso or Donatello, tied for number two. Then comes a tie among Giotto, Delacroix, and Bernini, followed by a tie among Leonardo, Rembrandt, and Dürer, and then still another tie between van Eyck and Raphael. . . . ”

He then turns to another standard text, H. W. Janson’s History of Art. Many of the names overlap, but Delacroix (somewhat surprisingly highly rated in Art Through The Ages) doesn’t make the top eleven, whereas Titian and Masaccio do. Repeat this exercise enough times with enough sensibly chosen reference books, and the list is likely to end up dominated by the same names again and again, a list, Murray argues, that is a fair measure of artistic greatness. The high correlations are “a natural consequence of the attempt by knowledgeable critics . . . to give the most attention to the most important people. Because different critics are tapping into a common understanding of importance in their field, they make similar choices. Various factors go into the estimate of importance, but they are in turn substantially associated with excellence.”

Of course, there are many potential problems with this method, but although I am no statistician and Human Accomplishment is (casual readers beware) a math-heavy tome, it is impossible not to be impressed by the steps its author has taken to deal with some of the more obvious objections, particularly those involving cultural, geographical, ethnic, and gender bias, let alone the dread offense (and worse word) of epochcentrism. If, at times, the results make uncomfortable reading for the politically correct, those people should not look for much consolation from Murray: “it is important,” he warns, “not to conflate aspirations with history.”

This is not to say that Murray would claim that his method is perfect. His decision to create separate categories for what he sees as the great literary traditions (Arabic, Chinese, Indian, Japanese, and “Western”) is proof enough of that. How does one compare Shakespeare with Basho, or Kalidasa with Du Fu? And then there are those ancient feats of scientific discovery (fire, say, or the wheel) that underpin our society more than any microchip—who gets the credit for those? Murray sidesteps some tricky questions of attribution by beginning his survey at a comparatively late date in human history (and 800 B.C. is a comparatively late date), but even this maneuver doesn’t address those more recent human achievements that have now vanished from memory. If the Iliad hadn’t survived, for example, it would not have been included in Murray’s database, but would it have been any less of an accomplishment? In all likelihood, not enough such works have been lost, or discoveries forgotten, to invalidate Murray’s argument, but it is difficult not to think of these and other such issues when trying to weigh the wisdom of what he is trying to say.

These problems do not, however, undermine the core of his case: the central and defining role of Europe (and its American extension), particularly over the last half-millennium, as the pacesetter of human accomplishment. This ought to be a statement of the obvious. In much the same way as the small plaque in London’s St. Paul’s Cathedral dedicated to its architect simply states, “si monumentum requiris, circumspice,” so it is with Europe’s contribution to civilization. Just look around you.

Sadly, however, we live in an age when such commonsense observations can set off a scandal. Murray laments how

the idea of the Noble Savage . . . has reemerged in our own time. It has become fashionable to decry modern technology. Multiculturalism, as that word is now understood, urges us to accept all cultures as equally praiseworthy. Who is to say that the achievements of Europe, China, India, Japan, or Arabia, are “better” than those of Polynesia, Africa or the Amazon? Embedded in this mindset is hostility to the idea that discriminating judgments are appropriate in assessing art and literature, or that hierarchies of value exist—hostility as well to the idea that objective truth exists.

Of course, there’s no denying that, with all its lists and scatter diagrams, there is a hint of madness in the method that Murray uses to inventory “our species at its best.” Nevertheless, fans of insanity will discover far more to delight them in the posturing of today’s intellectual establishment, with its poisonous mix of self-loathing, political correctness, and frivolity, than in anything to be found in Human Accomplishment.

That said, there’s a danger that Murray’s readers may be left asking themselves exactly what his 668 pages are for. As a miscellany of intriguing information and quirkily intelligent observations, the book is a delight. To take two examples, both the charming description of the twelfth-century Chinese city of Hangzhou and the concept of a “meta-invention” (by which he means “the introduction of a new cognitive tool [such as logic] for dealing with the world around us”) are worth the price of admission alone; but, by themselves, they are commentary, not a theme.

More useful, perhaps, is to see Murray’s ratings of excellence as a valuable antidote to the ethos of an age deeply prejudiced against the notion of genuine achievement. As Murray reminds us, “excellence is not simply a matter of opinion, though judgment enters into its identification. Excellence has attributes that can be identified, evaluated, and compared across works.” Indeed it does. But if Murray is not just to be the highbrow equivalent of the record-store nerds in novelist Nick Hornby’s High Fidelity (“We’re messing around at work, the three of us, getting ready to go home and rubbishing each other’s five best side-one track-ones of all time”), there has to be more to Human Accomplishment than an accumulation of lists, applause, and fascinating facts.

So, is it the shocking science of IQ, genes, gender, and race? Is Human Accomplishment’s tale of dead white male success merely a return to some of the Bell Curve’s most controversial contentions? Somewhat cagily, Murray notes that “almost all of the current evidence regarding the causes of group differences is circumstantial and inconclusive. The debate will not have to depend on circumstantial evidence much longer, however. Within a few decades, we will know a great deal about the genetic differences between groups. Not all of the controversy will go away, but the room for argument will narrow substantially.”

Cagey, perhaps, but fair enough. That said, Murray’s conclusion that “it therefore seems pointless to use historical patterns of accomplishment to try to anticipate what these genetic findings will be” is disingenuous. Although he writes, correctly, that “biological and environmental explanations [for different rates of achievement among different ethnic groups or between the sexes] can both play a role, separately or interacting in such complex ways that the line between the roles of biology and environment blurs,” it is clear that he sees biology as highly important in the equation. His discussion of the extraordinary success of Ashkenazi Jews, for example, leaves little room for doubt that he believes that a good deal of the credit is due to their genes.

And if that could be true for the Ashkenazim, why not for other ethnic or racial groups? It is no surprise, then, to discover that the book contains a favorable reference or two to Francis Galton, one of the most famous (or infamous, depending on your view) of the Darwinian danger men. Yes, of course Murray is entitled, and right, to insert the (handily diplomatic) disclaimer that it is still impossible to come to a precise assessment of the relative contributions of nature and nurture to individual and group differences, but that disclaimer comes at a high price. If he is suggesting that we may be on the verge of scientific discoveries that could transform our understanding of the sources of human accomplishment, logically this must substantially dilute the importance of much of what he is trying to say about that topic now.

That, doubtless, would be a disappointment to Murray. He has more than a touch of the teacher about him, and much of Human Accomplishment is best seen as an instruction manual for our species. It is this, I suppose, that the book is for. Murray being Murray, the controversialist extraordinaire, his advice makes uncomfortable reading for the vapidly sentimental. Money, he explains, makes the world go round—faster. Too much consensus or too much family can hold back achievement. War, amusingly, need not. Despite a somewhat shaky grasp of history and horology, The Third Man’s Harry Lime understood this perfectly: “In Italy for thirty years under the Borgias, they had warfare, terror, murder, and bloodshed. But they produced Michelangelo, Leonardo da Vinci, and the Renaissance. In Switzerland they had brotherly love, and they had five hundred years of democracy and peace. And what did that produce? The cuckoo clock.” Needless to say, democracy fares no better with Murray than with Lime. The record so far (distorted, admittedly, by the fact that democracies were very rare until recently) shows that a political structure permitting individual autonomy has been more valuable than the mere existence of universal suffrage. In this conclusion, Murray is clearly correct.

To Murray, it is, above all, the extent to which individuals use that autonomy to realize their potential that makes accomplishment possible. One of the most refreshing aspects of this book is the critical importance attached to the individual: “one may acknowledge the undoubted role of the cultural context in fostering or inhibiting great art, but still recall that it is not enough that the environment be favorable. Somebody must actually do the deed.” Doing the deed (in the sciences just as much as the arts) and, in the case of the most talented of all, having a shot at joining Murray’s blessed 3,869, involves extraordinary amounts of work (some of that “perspiration” that Thomas Edison was always talking about) and a degree of commitment that can often tip over into monomania. Murray argues that this takes not only talent but also a sense of some higher purpose. This is likely to be grounded in religion (Murray argues, for example, that post-medieval Christianity offered Europe particular competitive advantages). Even if it is not, however, such a sense of purpose will be impossible to reconcile with the “ennui, anomie, [and] alienation” that, Murray suggests, account for the twentieth-century artistic and cultural decay and are, quite clearly, the villains of his fascinating and stimulating book.

It is a beguiling argument, to be sure, but to return tactlessly to an earlier topic, will the issue that Murray has so elegantly tried to dodge reduce what he has to say to irrelevance? The notion that an individual’s future is irrevocably determined, in a Calvinism of the genes, by his or her biological make-up will probably always be the crudest of caricatures, but caricatures can be surprisingly persuasive. After all, Murray tells us,

after Freud [and] Nietzsche . . . it became fashionable . . . to see humans as unwittingly acting out neuroses and subconscious drives. God was mostly dead. Morality became relative. These and allied beliefs substantially undermined the belief of creative elites that their lives had purpose or that their talents could be efficacious.

That is probably quite true, but our increasing understanding of genetic science may mean that a far greater philosophical challenge is lurking just over the horizon. As Murray has said, “all we need is a few decades’ patience.”

Hang onto your hats.

Queen of the Desert

Christopher Buckley: Florence of Arabia

National Review, November 8, 2004


All it takes for evil to prevail, warned Burke, is “for enough good men to do nothing.” True; but that doesn’t mean that the good men cannot occasionally relax with a good laugh or two. It might even help them, especially in a situation of the kind the West faces today: a war with an ideology so dedicated to the destruction of happiness that, in the shape of the Taliban, it made laughing too loud in public a crime. (For women, anyway.)

In Florence of Arabia, his dark, disturbing, and very funny new satire, Christopher Buckley highlights the cruelty of radical Islamism and the contradictions of America’s response to it. He does this against a backdrop not of history at its grimmest or journalism at its most intense, but of jokes, mockery, bouts of wordplay (a State Department bureaucrat is a “desk-limpet,” an Arab potentate has lips that are “oyster-moist from a lifetime’s contact with the greatest delicacies the world [has] to offer”), and puns that teeter on the edge of catastrophe: The repressive Arab kingdom that is—along, naturally, with France—the main villain of this book goes by the name of Wasabia.

Wasabia is a sand-swept nightmare marked by oil wealth, joylessness, corruption, and ritualized cruelty, a tyranny where “offenses that in other religions would earn you a lecture from the rabbi, five Hail Marys from a priest, and, for Episcopalians, a plastic pink flamingo on your front lawn” are punished by “beheading, amputation, flogging, blinding, and having your tongue cut out . . . A Google search using the key phrases ‘Wasabia’ and ‘La Dolce Vita’ results in no matches.” Well, Prince Bandar, does that remind you of anywhere?

Gallows humor? Certainly. But insofar as the jihadists—with their car bombs, suicide bombs, and dreams of dirty bombs and worse—wish to shove you and me into mass graves at the earliest possible moment, a touch of Tyburn does not seem amiss. Of course, there are people who will find some of what Buckley has to say distinctly, you know, insensitive. The caliphs of multiculturalism will twitch a little, and this is not a book that will find many fans in Foggy Bottom (“the State Department’s reflexive response to any American in extremis overseas is to hand them a pamphlet—along with a list of incompetent local lawyers—and say, ‘We told you so’”).

But satire should not make comfortable reading for the subscribers to any orthodoxy. Running through this book is the clear implication that the American approach to the Middle East has not worked out quite as well as might have been hoped. And what, exactly, is the role played in Buckley’s drama by the Waldorf Group, an investment company (named, hmmm, after a New York hotel) that has danced a little too long, a little too closely, and a little too profitably with the despots of Wasabia?

But about Buckley’s heroine Florence, at least, there are no doubts. Forced out of the State Department for her unwanted imagination and initiative, she now has a new assignment: using covert funds to set up a TV station to transmit to the Arab masses. This will not, of course, be another Al-Jazeera, glossily repackaging nationalist resentments and religious prejudice 24/7, but nor will it be a source of ticky-tacky U.S. propaganda, ineffectively boasting about multicultural contentment in midwestern suburbs. Instead it will be something altogether more revolutionary, directed at the most excluded and mistreated of all the Arab masses: women. This will be Lifetime for women who really have no lives, its purpose to promote female emancipation as a counterbalance to militant Islam.

Qatar, the home of Al-Jazeera, being presumably unavailable, Florence’s TV station is hosted by the venal but fairly relaxed emirate of Matar (“pronounced, for reasons unclear, Mutter”), a statelet created by Churchill at one of those colonial conferences that have done so much to make the Middle East the cheery place that it is today. “One might suspect,” writes Buckley, “that its borders had been drawn so as to deprive . . . Wasabia of access to the sea. One would be right.” The result was to leave Matar rich, permanently grateful to old Winston (spotting Matar’s Churchillian place names is one of the book’s many pleasures), and under the control of a royal family that knows how to handle its mullahs: cash, cars, and “an annual six-week paid sabbatical, which most of them chose to take in the South of France, one of Islam’s holiest sites.”

This relatively tolerant country makes an ideal base for Florence and her offbeat and entertaining team: a delightfully cynical PR man, a State Department employee so camp that he could have been pitching tents with T. E. Lawrence, and a CIA Col. Kurtz lite (a seductive—ask Florence—and effective mix of Esquire and Soldier of Fortune).

Throughout, Buckley’s lightly ironic tone only accentuates the savagery that is his main target, making it somehow all the more terrible when, as in this extract, it comes into clear, brutal focus:

The package turned out to contain a videotape. It showed Fatima buried in sand up to her neck, being stoned to death with small rocks. The tape was twenty minutes long. Everyone who watched it wept. Florence brought the tape to Laila. She could not bring herself to view it again so she left the room while Laila viewed it. She waited outside on the terrace, looking out over the Gulf in the moonlight, her skin misted by salty droplets from the fountain that spouted out the royal crest. Laila emerged, pale and shaken. Neither woman spoke. The two of them stood by the balustrade overlooking the gardens, listening to the waves lap the shore and the onshore breeze rustle the fronds of the date palms.

And then, right at the end of this book, cruel, bleak, awful reality finally comes crashing in. There, in the closing acknowledgments, Buckley pays tribute to Fern Holland, “a real-life Florence of Arabia,” who was assassinated in Iraq on March 9, 2004.

She was trying to help, and that would not do.

Other People's Money

Sebastian Mallaby: The World's Banker

The New York Sun, September 30, 2004


If there's anything more guaranteed to set off my inner sans-culotte than pampered, arrogant Teresa Heinz Kerry, it's a gathering of international bureaucrats, the spoiled, sanctimonious, worthless, and annoying aristocrats of our own sadly-yet-to-be ancien régime. Locusts in limousines, they periodically descend on some unfortunate city, clogging the streets with their retinues, the restaurants with their greed, and the newspapers with their self-importance. Seen from this perspective, and judging by its remarkably unflattering cover photograph, "The World's Banker" (The Penguin Press, 462 pages, $29.95), an account by the Washington Post's Sebastian Mallaby of James Wolfensohn and the World Bank he is president of, promised to be a delightful, malicious treat. Mr. Wolfensohn, thin-lipped and narrow-eyed, glares out from the cover, seemingly disdainful of anyone impudent enough to even pick up the book. There is no attempt at a smile: Why bother to ingratiate? The look is the mask of a predator, a big beast to be avoided in boardroom, brawl, or multilateral institution.

Sadly, it's not always right to judge a book by its cover. While the Wolfensohn portrayed by Mr. Mallaby is not an altogether likable character, "The World's Banker" is far from a hatchet job - either of the man or the institution over which he so imperiously presides. Over the years, both have made themselves into tempting targets for a cheap shot or two, but Mr. Mallaby takes the high road, treating them fairly, if sometimes (deservedly) critically. What's more, with a bright, breezy (occasionally too breezy) and assured style that reflects his years at the Economist, the author takes the complex and (let's admit it) potentially excruciating topic of the World Bank and makes it accessible to the general reader.

That said, the high road comes with a toll. This may be my inner Kitty Kelley surfacing, but this book's narrative would have hung together better with a little more Wolfensohn and a little less bank. Certainly the World Bank, like it or loathe it, is an important, some would say essential, institution. But in trying to tell its story through the partial biography of just one man, Mr. Mallaby has, despite a heroic effort, ended up with a slightly, and probably inevitably, unsatisfactory hybrid. His book does full justice neither to Mr. Wolfensohn nor to his bank.

There's another problem. A book that features the drama that is Mr. Wolfensohn had better be about Mr. Wolfensohn, and only about Mr. Wolfensohn. Anything or anybody else will be hopelessly upstaged. Mr. Mallaby has plenty to say about Bolivia, Uganda, and Indonesia, but much of the significance of what he is writing will be lost as even the most earnest readers find themselves impatiently turning the pages in expectation of the next fabulous, appalling Wolfensohn moment. The goat from Mali! The Frenchman's speech! Rostropovich! Harrison Ford! Um, Paul O'Neill! There are titanic rows, great rages, astonishing coups and, oh yes, that ego, well worth a full biography in its own right.

To describe this magnifico as merely protean would be an insult. He is a force of nature whose talents, personality, chutzpah, and remarkable networking skills took him from a comparatively modest upbringing in the Antipodes to Harvard, the Olympics (fencing for Australia!), Carnegie Hall (he's a cellist!), success in the City of London, Wall Street, and, since 1995, his current role.

As Mr. Wolfensohn ponders the chances of a third term at the bank, it is worth asking how much of a success he has made of the first two. Like many of the titans of high finance, his management skills appear - and this is being kind - rudimentary, a mixture of threats, bluster, overbearing ambition, and impatience. It's perhaps significant that Robert Rubin, whose background at Goldman Sachs must have made him very familiar with such types, was one of those in the Clinton administration most resistant to Mr. Wolfensohn's relentless, and typically skillful, lobbying for the World Bank job. It's no surprise to read that the great man's time in Washington has been marked by staff turmoil, mad fads, vast expenditures, grandiose planning, and feuds with shareholders: All the hallmarks, in short, of a Wall Street grandee at work.

All this sound and fury has signified something, however. In weighing Mr. Wolfensohn's career at the bank, Mr. Mallaby concludes that he can boast of some very real achievements - no small matter in a field where progress can mean a better life for large numbers of the desperately poor and, indirectly, for the rest of the planet. In a world that is ever more complicatedly, and sometimes dangerously, interconnected, it is no longer possible for the West to ignore the less prosperous parts of the globe - even if it wanted to. That is something Mr. Wolfensohn understood well, and that Mr. Mallaby makes clear.

If much of the World Bank's progress has seemed to come uncertainly, awkwardly, in fits and starts and after numerous wrong turns, neither Mr. Wolfensohn nor his employees are wholly to blame. When it comes to development, there is no magic bullet, no one answer, not trade alone or aid alone, not free market fundamentalism, not massive infusions of capital, not 'empowerment', not structural reform, not tough dictates suited to the Victorian workhouse, not the sentimentality and soft targets of the 1970s. And certainly not the leftist prescriptions and cultural imperialism of far too many NGOs. The correct approach probably draws on aspects of most of the above strategies and quite a few others besides.

"The World's Banker" gives a useful introduction to many of these issues, but only an introduction. Nevertheless, given the importance of this neglected topic, it is to Mr. Mallaby's credit that his readers, like the developing countries the World Bank was designed to assist, will be left asking for more.

The Fat Police

Kelly Brownell and Katherine Battle Horgen: Food Fight - The Inside Story of the Food Industry, America's Obesity Crisis, and What We Can Do about It

National Review, January 26, 2004

Santa Fe, New Mexico, January 1999   ©  Andrew Stuttaford

It is difficult to single out what is most objectionable about this hectoring, lecturing, and altogether dejecting piece of work, but perhaps it's the moment when its authors credit the rest of us with the IQs of greedy rodents. Quoting a study that shows that, presented with a cornucopia of carbohydrates and wicked fatty treats, laboratory rats will abandon a balanced, healthy diet in favor of dangerous excess, they draw a rather insulting conclusion: Civilization's success in creating so much abundance has come at a terrible price, a "toxic environment" so overflowing with temptation that, like those Rabelaisian rats, humanity will be unable to resist. We will eat ourselves, if not to death, then to diabetes, decrepitude, and stretch pants.

The "obesity epidemic" is becoming a tiresome refrain and Yale professor Kelly Brownell is one of its most tireless advocates. Nevertheless, for those with the stomach for more on the fat threat, Food Fight is worth a look for what it reveals about the motives and objectives of the busybodies pining to police your plate.

But let's start with the "epidemic" itself. With a relish they are unlikely to show at the dinner table, the authors pepper their readers with data purporting to show that roughly two-thirds of Americans are overweight or obese, products of a feeding frenzy that is dangerous medically and drives up health-care costs by tens of billions of dollars. Some of the numbers may need to be taken with a pinch of low-sodium salt, but the trends they represent are a matter of concern. In this at least Food Fight is right.

Over the past couple of decades, Americans have indeed put on some pounds. All too often, heavy isn't healthy. The mere fact of being too fat (calculating what is "too" fat takes more, however, than a wistful glance at the pages of Vogue) can cause problems such as arthritis and a range of other, sometimes serious, diseases. Despite this, corpulence should be seen as a symptom of ill health as much as a cause: Being fat won't necessarily kill you, but the sloth and the gluttony that got you there just might.

To their credit, the authors do cite research showing that fit fatties are at lower risk than unfit string beans. Still, they tend to concentrate on obesity as a problem in its own right - and, ironically, that's something that may be counterproductive. Befuddled by standardized notions of an ideal weight, Americans spend an estimated $40 billion a year in the generally unsuccessful pursuit of one miracle diet or another. The result is yo-yoing weight - something often less healthful than having a few too many pounds - and unjustified self-congratulation for a population that likes to tell itself that it is "doing something" about its health, when, in fact, it is doing anything but.

Highlighting fatness, that soft, billowing symbol of self-indulgence, reflects an agenda that has expanded beyond legitimate health concerns to embrace asceticism for its own sake. There's a hint of this in the way the authors respond to the idea that all foods can find a place in a properly balanced diet. While conceding that such an approach has "some utility" in individual cases, they see the argument that flows from it (that no food is intrinsically "bad") as a distraction. They are wrong. An emphasis on balance is the best chance of persuading this country to eat more healthily - and, importantly, to stick with this decision. To Brownell and Horgen, more comfortable with proscription and self-denial than compromise and cheeseburgers, this is, doubtless, dismayingly lax.

Their language too is a giveaway. There is tut-tutting over the "glorification of candy" and anguish over restaurants "notorious" for their large portions. Under the circumstances, it's no shock that the reliably alarmist "Center for Science in the Public Interest," an organization famous for its efforts to drain away our pleasures, rates frequent and favorable mention.

Asceticism often brings with it a sense of moral superiority and the urge to spread the joys of deprivation amongst the less enlightened masses - by persuasion if possible, by compulsion if necessary, and sometimes by something that falls in between. So Brownell and Horgen lament the lack of "incentive" for recipients of food stamps to purchase "healthy foods." Common sense, apparently, is not enough. Worse, these wretches might even be tempted into "overbuying." Who knew the food-stamp program was so generous?

With tobacco a useful precedent, it's not difficult to see where all this is going. Brimming with tales of carnage, soaring health-care costs, and the threat to "the children," Food Fight follows a familiar script. That's not to say its writers don't make some telling points. The ways, for instance, in which junk food is marketed to America's no-longer-so-tiny tots are troubling, but at its core this book rests on the unpalatable belief that even adults cannot be trusted with a menu. The authors' solutions include regulation, censorship, subsidies, propaganda, public-spending boondoggles, and a faintly totalitarian-sounding "national strategic plan to increase physical activity." Oh, did I mention the "small" taxes on the sale of "unhealthy" food?

Food Fight is a preview of the techniques that will be used to persuade a chubby country to agree to all this. There are scare tactics (death! disease!), a convenient capitalist demon ("big food"), and, best of all, an alibi. It's not our fault that we are fat. Yes, the importance of getting up off that sofa is fully acknowledged in Food Fight, but the book's soothing subtext is that we are all so helpless in the face of advertising and abundance that we can no longer be held fully responsible for what we are eating. Even the ultimate alibi (food might be addictive!) makes a tentative appearance, but whether this theory is true is, readers are informed, not "yet" clear.

The notion that eating too much is somehow involuntary is ludicrous, but it fits in with the view repeated in this book that "overconsumption has replaced malnutrition as the world's top food problem," a repugnant claim that makes sense only if feast is indeed no more of a choice than famine. Anyone who believes that will have no problem in arguing that, as people cannot reasonably be expected to fend off Colonel Sanders by themselves, government should step in. And "if the political process is ineffective" (voters can be inconveniently ornery), Brownell and Horgen would back litigation. Such cases might be tricky, but even the threat of mass lawsuits "regardless of legal merit" could, they note, help "encourage" the food industry to change its ways.

And that thuggish suggestion is more nauseating than anything Ronald McDonald could ever cook up.

Killjoy Was Here

Eric Burns: The Spirits of America

National Review, December 30, 2003

EndofProhibition.jpg

Abraham Lincoln, a wise man and a brave one too (he was speaking to the sober souls gathered at a meeting of a Springfield temperance society), once said that the damage alcohol can do comes not "from the use of a bad thing, but from the abuse of a very good thing." Drunkenness, not drink, was the real demon. Sensible words; yet, in their dealings with the bottle, his countrymen still lurch between wretched excess and excessive wretchedness. Moderation remains elusive. After the binging, there's always the hangover: dreary years of finger-wagging, sermonizing, and really, really dumb laws. Just ask poor Jenna Bush.

Spirits of America, Eric Burns's entertaining history of the impact of an old pleasure on a new world, is rather like a Washington State cabernet sauvignon, unpretentious and thoroughly enjoyable. Burns, the host of Fox News Watch, is not a professional historian. His prose is engaging and relaxed, written in the rhythms of an accomplished raconteur rather than the jargon of the academic. In short, this book is about as dry as a colonial tavern.

To Burns, it's not surprising that the first settlers, as strangers in a strange and not always hospitable land, should have turned to drink: to beer, to whisky, to brandy, to rum, and even to an alarming-sounding series of proto-cocktails. Rattle-skull, anyone? Reading his account, it's easy to conclude that many of these early Americans spent most of the day drunk, proving once again (at least to this Brit) that they cannot have known what they were doing when, after a revolution fomented largely in those same taverns, they broke from the embrace of the mother country.

Needless to say, all this good cheer produced a reaction, and the greater (and most interesting) part of this book is devoted to prohibitionists and their long, far from fine, whine. It's a painfully familiar tale to anyone who has watched the drug war, the excesses of the anti-tobacco movement, or even the gathering fast-food jihad.

The parallels are telling. There's the junk science so shaky that, by comparison, "passive smoking" is as believable as gravity. Dr. Benjamin Rush, "the Hippocrates of [18th-century] Pennsylvania," linked drink to a wide range of health problems including scurvy, stomach rumblings, and, for the truly unlucky, spontaneous combustion. Around a hundred years later—and a century before the nonsense of DARE—the Woman's Christian Temperance Union was distributing an "education" program in schools that included the startling news that alcohol could lead "the coats of the blood vessels to grow thin [making them] liable at any time to cause death by bursting." Boozehounds should also watch out. Children were taught that even a tiny amount of this "colorless liquid poison" would be enough to kill a dog.

Like their successors today, these campaigners understood the uses of propaganda. Even the choice of that soothing word "temperance" (which ought to mean moderation, not abstinence) was, as Burns points out, nothing more than spin before its time. No less disingenuously, the name of the influential Anti-Saloon League camouflaged prohibitionist objectives far broader than an attack on the local den of iniquity, a technique that may ring a bell with those who believe that MADD is now straying beyond its original, praiseworthy, agenda.

Above all, what is striking is how, then as now, the zealots of abstention were unable to resist the temptation of compulsion. Burns is inclined to attribute the best of intentions to the "temperance" campaigners. He's wrong. The fact is that neither persuasion, nor education, nor even psychotic Carry Nation's hatchet was enough to satisfy the urge to control their fellow citizens that played as much a part in the psychology of teetotalitarianism as any genuine desire to improve society. From the Massachusetts law providing that alcohol could not be sold in units of less than fifteen gallons to the grotesque farce of Prohibition, Spirits of America is filled with tales of legislation as absurd as it was presumptuous.

Although he never holds back on a good anecdote (the story of Izzy Einstein, Prohibition Agent and master of disguise, is by itself worth the price of this book), when it comes to the Volstead years themselves, Burns gives a useful and, dare I say it, sober account. Contrary to machine-gun-saturated myth, the mayhem (if not the corruption) was mostly confined to a few centers, and although Prohibition did clog up the justice system, enforcement, mercifully, usually tended to be less than Ness.

Even more surprisingly, while he doesn't come close to endorsing Prohibition, Burns is able to point to data showing that, in certain respects at least, the killjoy carnival was a success: Per capita alcohol consumption fell sharply, as did the incidence of drink-related health problems. But even these achievements may mean less than is thought. Other evidence (not cited by Burns) would suggest that, after an initial collapse, consumption started to rise again as new (illicit) suppliers got themselves organized, with often disastrous consequences for their customers. Winston Churchill, no stranger to the bottle himself, was told that "there is less drinking, but there is worse drinking," a phrase, incidentally, that almost perfectly describes the impact on today's young of the increase in the drinking age to 21. As for the alleged health benefits, the 1920s also saw notable reductions in, for example, deaths from alcoholism and cirrhosis of the liver in Britain, a country that saw no need for prohibition.

What Burns underplays, however, is the fact that this debate should be about more than crudely utilitarian calculations. There's a famous comment (cited by Burns, but, sadly, quite possibly a fake) widely attributed to Lincoln that sums this up nicely. Prohibition, "a species of intemperance in itself . . . makes a crime out of things that are not crimes. [It] strikes a blow at the very principles upon which our Government was founded."

The Bloodstained Rise

Christopher Logue: All Day Permanent Red

National Review, November 9, 2003

helmet.jpg

Christopher Logue has been a dealer in stolen property (briefly), a prisoner in a Crusader castle (16 months), a pornographer (the book Lust), and, probably no less discreditably, an actor, a poet, and a writer of screenplays. As if this weren't enough, for over four decades this versatile Englishman has been engaged in a "reworking" of the Iliad. It is not, he is at pains to stress, a translation (he knows no Greek), but an episodic "account" of the ancient epic that has already taken far longer to produce than Troy took to fall.

And, as you read those words, I can hear you sigh. The prospect of yet another tawdry modernization of a classic that needs none seems like nothing to look forward to. Our age often shows itself too restless, unimaginative, and self-important to attempt a genuine understanding of our culture's past. Hot in the pursuit of some imagined relevance, we are forever reinterpreting and updating, here The Tempest as an allegory of slavery, there a few nipples to spice up that boring old Jane Austen. And if, in the process, the sense of the original is lost, we shrug, and settle for what is left: deracinated pap, bland at best, topically—and inconsequentially—"controversial" at worst. Only later do we bother to wonder where our literature has disappeared to.

But All Day Permanent Red is very different from the usual dross. Logue's previous work on the Iliad has been called a masterpiece (Henry Miller, not always a reliable source, described an early section as better than Homer): a devalued term these days, but, in this case, well deserved. All Day Permanent Red is the latest chapter and it doesn't disappoint. Here is Logue's description of the Greek soldiers rising to face their Trojan opponents:

Think of a raked sky-wide Venetian blind.

Add the receding traction of its slats

Of its slats of its slats as a hand draws it up.

Hear the Greek army getting to its feet.

Then of a stadium when many boards are raised

And many faces change to one vast face.

So, where there were so many masks,

Now one Greek mask glittered from strip to ridge.

In earlier installments—War Music (1981), Kings (1991), and The Husbands (1994)—Logue darted in and out of Homer's chronology, starting with the death of Patroclus and the return of Achilles, then taking his readers back to the early quarrels between Agamemnon and Achilles, and then on to the single combat between wronged Menelaus and spoiled, lethal Paris. In All Day Permanent Red (the title is, wonderfully, borrowed from an advertisement for lipstick), Logue takes a step back—to the very first full day of combat between the two armies.

The language is as ferocious as its subject matter and, in its cinematic intensity, it's easy to see the hand of the former screenwriter:

Sunlight like lamplight.

Brown clouds of dust touch those brown clouds of dust already overhead.

And snuffling through the blood and filth-stained legs

Of those still-standing-thousands goes Nasty, Thersites' little dog.

Now licking this, now tasting that.

But there is more to this saga than a simple recital of slaughter. The savagery on the plains before Troy is echoed in the heavens above. Nowadays we tend to trust in the benign God of the monotheistic imagination or, failing that, in the indifference of a universe that does not actually set out to harm us. The men of Homer's time had no such comfort: "Host must fight host, / And to amuse the Lord our God / Man slaughter man."

The gods of antiquity were capricious - selfish, and vain, playground bullies or the smug members of the smart set in a high-school movie, monsters as often as they were saviors. Pitiless, dangerous Olympus is a recurrent theme that Logue, like Homer, has emphasized throughout his narrative, and this new volume is no exception. Here is Athena's response to a plea for help from Odysseus:

Setting down her topaz saucer heaped with nectarine jelly

Emptying her blood-red mouth set in her ice-white face

Teenaged Athena jumped up and shrieked

"Kill! Kill for me!

Better to die than to live without killing!"

Logue's language, both grand and, at times, oddly conversational ("Only this is certain: when a lull comes—they do— / You hear the whole ridge coughing"), brings immediacy to an ancient epic. His use of deliberately anachronistic wording neither jars (partly because most English-speaking readers, including this one, are not comparing Logue's work against the original Greek) nor does it break that sense of the past that is no small part of the spell of a tale thousands of years old. And, yes, the references to Venetian blinds, plane crashes, and even an aircraft carrier somehow work in this tale of Bronze Age fury. Their very modernity reminds us both of our vast distance from this saga, and of the extraordinary cultural continuity that its survival represents.

And if we want to understand why, beyond an accident of history, the Iliad has been remembered for so long, Logue's extraordinary, compelling poetry gives us a clue. The Iliad has as much to say about the human condition now as it did when Homer began to write, not least the destructive, glorious, inglorious love of battle that will endure until the Armageddon which, one day, it will doubtless bring about:

Your heart beats strong. Your spirit grips.

King Richard calling for another horse (his fifth).

King Marshal Ney shattering his saber on a cannon ball.

King Ivan Kursk, 22.30 hrs, July 4th to 14th '43, 7000 tanks engaged,

"... he clambered up and pushed a stable-bolt Into that Tiger-tank's red-hot machine-gun's mouth

And bent the bastard up. Woweee!"

Where would we be if he had lost?

Achilles? Let him sulk,

A masterpiece? Of course it is.

Horror Show

Joe Bob Briggs: Profoundly Disturbing - Shocking Movies that Changed History

National Review, August 26, 2003

Briggs.jpg

The title is reassuringly lurid and the cover comfortingly nasty, but, on opening this book, anxious readers may worry that Joe Bob has left the drive-in. Now that would be profoundly disturbing. Author, journalist, cable-TV stalwart, and former NR columnist, Briggs overcame fictitious origins and nonexistent competition to become America's finest drive-in-movie critic. He saw Nail Gun Massacre and he watched All Cheerleaders Die. Who else could take on that sort of responsibility?

He is the Zagat of the Z-movie, the one indispensable guide for those who like slaughter, sex, and lethal household tools with their popcorn. He wallows in the movies that other critics flee. Ebert on Shrunken Heads? Silence. Kael on Fury of the Succubus? No comment. But Joe Bob was there for them both. He's funny, well informed, and succinct (The Evil Dead is "Spam in a cabin"), and he tells his audience what it needs to know (Bloodsucking Freaks: "pretty good fried-eyeball scene . . . 76 breasts . . . excellent midget sadism and dubbed moaning"). If Joe Bob tells you to "check it out," that's what you do.

And when, as a result, you are watching man-eating giant rats starting their gory feast (Gnaw), you will still be laughing at the memory of what Joe Bob had to say. Yes, he both subverts and celebrates these films, but who cares? It's better to lighten up, grab a beer, and just see Joe Bob as someone who delights in rummaging through cinema's trash heap and telling us what he's found.

He does this brilliantly, in a style — Hazzard County, with a touch of Cahiers du Cinema — that is all his own; but, after all these years, is the drive-in still enough for Mr. Briggs? Joe Bob's Jekyll, the erudite and rather more suave "John Bloom," has been developing a journalistic career of his own, while Joe Bob himself has been spotted on stage and screen, and in the pages of Maximum Golf magazine; can the country club be far behind?

In spite of this, it's still startling to find that Briggs chose The Cabinet of Dr. Caligari as the first movie to discuss in his new book. The fact that it's foreign isn't the problem. Joe Bob has written about plenty of foreign films; they usually feature kickboxing, kung fu, gratuitous violence, more kickboxing, incomprehensible dialogue, over-choreographed fight scenes, and the exploitation of attractive young actresses who manage to lose their clothes and their lives in the course of the movie. They are, in short, identical in almost every respect to the domestic offerings he reviews.

Caligari is different. Yes, it's a horror movie, but it's a coffee-on-the-Left-Bank, furrowed-brow unfiltered cigarette of a horror movie and, like a number of the other films described in this book, it's far from typical territory for the sage of the slasher pic. It's a German expressionist masterpiece from 1919, an allegory of totalitarianism often thought to have anticipated the Nazi terror to come. There are no nunchucks in Caligari. Still, there's more than an echo of the drive-in in the irreverent glee with which Joe Bob penetrates the Teutonic gloom. All too often, Caligari is shown with a melodramatic "silent movie" musical backdrop, rather than the modernist score envisaged by its makers. Perhaps worse still, it has also been relentlessly over-analyzed by film highbrows. To Joe Bob, this is like "trying to watch Schindler's List with 'Turkey in the Straw' playing in the background and a professor pointing out every shaft of light as a pivotal moment in German Expressionism."

Caligari is, Briggs argues, a film that "changed history," but in this book that can mean less than you might think. The movies in Profoundly Disturbing may all "have been banned, censored, condemned, or despised" at one time or another, but some of them wouldn't change the course of an afternoon, let alone history.

Perhaps this is why Joe Bob is careful to stress that, in a number of cases, the only history that has been changed is cinema history. How the films he discusses relate to the broader cultural picture is complex: Did a movie influence the culture, merely reflect it, or a bit of both? As he tries to find an answer to this question, quality can be irrelevant. Deep Throat is a terrible film even on its own terms, but somehow it managed to help shape the Ice Storm era and thus had much greater cultural impact than the far more artistically significant Caligari. Caligari may have warned Germans about the dangers of totalitarianism, but little more than ten years later Hitler was in power.

If Profoundly Disturbing doesn't always convince us that the movies it describes "changed history," it is, nonetheless, a hugely entertaining account of the frequently bizarre way they came to be made. Some of these films were made by people operating at the creative edge (the art director of Texas Chainsaw Massacre was, we learn, able "to indulge his lifelong fascination with animal bones") while others were manufactured by those who had hit artistic rock-bottom (Linda Lovelace for President) and didn't care. This is a cinema of desperate improvisation (the night before the "classic tongue-ripping scene" in Blood Feast, the victim still hadn't been cast) and even more desperate finances.

And then there's Mom and Dad (1947), a "sex education" movie that circulated for over 20 years through small-town America. This cautionary tale of the dangers of premarital naughtiness included footage of a live birth and hideous syphilitic sores. It grossed an estimated $100 million. Showings came complete with two women in nurse's outfits and a 20-minute lecture by "Elliot Forbes," an "eminent sexual hygiene commentator." At one point there were no fewer than 26 Elliot Forbeses, "most of them retired or underemployed vaudeville comedians."

If this all sounds like a carny stunt, it's because it was. Profoundly Disturbing includes a good number of more "serious" films (and Briggs writes about them very well), but the movies that make up its sleazy, captivating core are the successors of the freak show, the circus, and old-time burlesque. As told with gusto by an author obviously far from ready to quit the drive-in (whew!), theirs is a story of that wild, ludicrously optimistic entrepreneurial spirit that is, somehow, very typically American. Combine those hucksters, visionaries, and madmen with the dreams of a restless, somewhat deracinated population spreading across a continent and we begin to understand how this country's popular culture became the liveliest in the world — if not always the most elevated. Mencken was right: No one ever went broke underestimating the taste of the American public.

Why so much of that taste revolves around mayhem and gore (that sex has box-office appeal is no surprise) is a mystery beyond the scope of Profoundly Disturbing. Suffice to say that it does, and the result is a book that blends fascinating pop-culture history, first-rate film criticism, and learned commentary on the stunt-vomit in The Exorcist.

Check it out.

Everybody Must Get Stoned?

Jacob Sullum: Saying Yes - In Defense of Drug Use

National Review, June 20, 2003

Sullum.jpg

Jacob Sullum is a brave man. In his first book, the entertaining and provocative For Your Own Good, he attacked the excesses of anti-smoking activism and was duly—and unfairly—vilified as a Marlboro mercenary, a hard-hearted shill for Big Tobacco with little care for nicotine's wheezing victims. Fortunately, he was undeterred. In Saying Yes, Sullum, formerly of NATIONAL REVIEW and now a senior editor at Reason magazine, turns his attention to the most contentious of all the substance wars, the debate over illegal drugs. Sullum being Sullum, he manages to find a bad word for the mothers of MADD and a good one for 19th-century China's opium habit.

Sullum's effort in Saying Yes is more ambitious (or, depending on your viewpoint, outrageous) than that of most critiques of the war on drugs. Supporters of legalization typically base their case on moral or practical grounds, or both. The moral case is broadly libertarian—the individual has the right to decide for himself what drugs to take—while the practical objection to prohibition rests on the notion that it has not only failed, but is also counterproductive: It creates a lucrative (black) market where none would otherwise exist. Sullum repeats these arguments, but then goes further. Taken in moderation, he claims, drugs can be just fine—and he's not talking just about pot.

Whoa. In an era so conflicted about pleasure that wicked old New York City has just banned smoking (tobacco) in bars, this is not the sort of thing Americans are used to reading. Health is the new holiness and in this puritanical, decaf decade, most advocates of a change in the drug laws feel obliged to seem more than a little, well, unenthusiastic about the substances they want to make legal. Their own past drug use was, they intone, nothing more than youthful "experimentation." Most confine themselves to calling for the legalization of "softer" drugs and, even then, they are usually at pains to stress that, no, no, no, they themselves would never recommend drugs for anyone.

Sullum is made of sterner stuff. He admits to "modest but instructive" use of marijuana, psychedelics, cocaine, opioids, and tranquilizers with, apparently, no regrets. (Judging by the quality of his reasoning, I would guess the drugs had no adverse effect on him.) He seems prepared to legalize just about anything that can be smoked, snorted, swallowed, injected, or chewed—and, more heretically still, has no truck with the notion that drug use is automatically "abuse." "Reformers," he warns, "will not make much progress as long as they agree with defenders of the status quo that drug use is always wrong."

In this book Sullum demonstrates that if anything is "wrong"—or at least laughably inconsistent—it is the status quo. The beer-swilling, Starbucks-sipping Prozac Nation is not one that ought to have an objection in principle to the notion of mood-altering substances. Yet the U.S. persists with a war on drugs that is as pointless as it is destructive. This contradiction is supposedly justified by the assumption that certain drugs are simply too risky to be permitted. Unlike alcohol (full disclosure: Over the years I have enjoyed a drink or two with Mr. Sullum) the banned substances are said to be products that cannot be enjoyed in moderation. They will consume their consumers. Either they are so addictive that the user no longer has a free choice, or their side effects are too destructive to be compatible with "normal" life.

To Sullum, most such claims are nonsense, propaganda, and "voodoo pharmacology." Much of his book is dedicated to a highly effective debunking of the myths that surround this "science." There's little that will be new to specialists in this topic, but the more general reader will be startled to discover that, for example, heroin is far less addictive than is often thought. The horrors of cold turkey? Not much worse than a bad case of flu. (John Lennon—not for the only time in his career—was exaggerating.) Even crack gets a break: Of 1988's "crack-related" homicides in New York City, only one was committed by a perpetrator high on the drug. That's one too many, of course, but 85 percent of these murders were the result of black-market disputes, a black market that had been created by prohibition.

So if drug users are neither necessarily dangerous nor, in most cases, addicts, can they be successful CPAs or pillars of the PTA? Sullum argues that many currently illegal drugs can safely be taken in moderation—and over a long period of time. He interviews a number of drug users who have managed to combine their reputedly perilous pastime with 9-to-5 respectability. Sullum concedes that they may not necessarily be representative, but his larger point is correct: The insistence that drugs lead inevitably to a squalid destiny is difficult to reconcile with the millions of former or current drug users who have passed through neither prison nor the Betty Ford. As Sullum points out, "excess is the exception," a claim buttressed by the fact that there are millions of former drug users.

Typically, drug consumption peaks just when it would be expected—high school, college, or shortly thereafter. Then most people grow out of it. The experience begins to pall and the demands of work and family mean that there's no time, or desire, to linger with the lotus-eaters. Others no longer want to run the risks of punishment or stigma associated with an illegal habit. Deterrence does—sometimes—deter, and it may deter some of those who would not be able to combine a routine existence with recreational drug use. But this is not an argument that Sullum is prepared to accept: He counters that the potentially vulnerable population is small and its members may well become alcoholics anyway, "thereby exposing themselves to more serious health risks than if they had taken up, say, heroin." Sullum is not, we are again reminded, an author who is afraid of controversy.

But is he too blithe about the degree of potential medical problems associated with drug use? As he shows (occasionally amusingly and often devastatingly), much of the "evidence" against drug use has been bunk, little more than crude scaremongering frequently infected with racial, sexual, or moralistic panic; but it doesn't follow that all the dangers are imaginary. To be sure, he does acknowledge some of the health hazards associated with drugs; but he can sometimes be disconcertingly relaxed about some of the real risks.

His discussion of LSD is a case in point. The causal relationship between LSD and schizophrenia is complex (and muddled by the fact that both schizophrenics and schizotypal individuals are more likely to be attracted to drugs in the first place), but it's not too unfair to describe an acid trip as a chemically induced psychotic episode. The "heightened sense of reality" often recorded by LSD users is, in fact, exactly the opposite—a blurring of the real with the unreal that is also a hallmark of schizophrenia. Throw in acid's ability to generate the occasional—and utterly unpredictable—"flashback" and, even if many of the horror stories are no more than folklore, it's difficult to feel much enthusiasm for legalizing LSD except, just perhaps, under carefully controlled therapeutic conditions.

What's more, as a substance that, even in small doses, will create a prolonged delusional state, LSD is not exactly the poster pill for responsible drug use. But this exception should not distract us from the overall strength of Sullum's case. It is possible, he writes, to "control" drug consumption "without prohibition. Drug users themselves show that it is." It's unnecessary for him to add that the abolition of prohibition would imply a relearning of the virtue of self-control, a quality long imperiled by the soft tyranny of the nanny state.

For Sullum is not advocating a descent into Dionysian frenzy. The poverty of "Just Say No" may be obvious, he writes, "but moving beyond abstinence does not mean plunging into excess. Without abstaining from food, it is possible to condemn gluttony as sinful, self-destructive, or both . . . Viewing intoxication as a basic human impulse is the beginning of moral judgment, not the end. It brings us into the territory of temperance"—a word Sullum uses, accurately, to mean moderation. The 19th-century anti-alcohol campaigners who hijacked it were as cavalier with vocabulary as they were with science.

Proponents of legalization will, naturally, say yes to this book, but their opponents should read it too. Sullum's arguments deserve a response from those who disagree with him. As he points out, the costs of the war on drugs far exceed the billions of dollars of direct expenditure. They also include "violence, official corruption, disrespect for the law, diversion of law-enforcement resources, years wasted in prison by drug offenders who are not predatory criminals, thefts that would not occur if drugs were more affordable, erosion of privacy rights and other civil liberties, and deaths from tainted drugs, unexpectedly high doses, and unsanitary injection practices." Under these circumstances, it's up to the drug warriors to come up with a convincing explanation as to why we are fighting their drug war. Judging by this well-written, persuasive, and important book, they are unlikely to succeed.

Keepers Without Peace

Frederick Fleitz: Peacekeeping Fiascoes of the 1990s: Causes, Solutions, and U.S. Interests

UN.jpg

With his good intentions and his blue helmet, the U.N. peacekeeper was an icon of post-World War II internationalism. He was G.I. Joe for the Eleanor Roosevelt set, muscular assurance that the days of the feeble League of Nations would never return. And for a while it seemed to work. The record was far from perfect, but from Cyprus to West New Guinea to Namibia, the presence of relatively small numbers of U.N. troops was sufficient to separate warring forces and supervise the return to peace. The key to their success was evenhandedness and the consent of those whom they had come to police.

In the wake of the Gulf War and the breakup of the Soviet Union, this comparatively restrained approach to peacekeeping underwent a transmutation. The shambles that ensued is neatly summarized in this book’s delightfully blunt title. The author, Frederick Fleitz Jr., knows his material well: He is a former CIA analyst who covered the U.N. and its peacekeeping efforts during parts of the Reagan, George H. W. Bush, and Clinton administrations. Today, he is special assistant to the undersecretary of state for arms control and international security, though readers are warned that his opinions "do not necessarily represent the views of the Department of State, the Central Intelligence Agency, or the U.S. government." But what Fleitz has to say makes a great deal of sense, so we must hope that warning is not to be taken literally.

The real starting point for this book is the Soviet collapse, which made it possible for the West to intervene more aggressively in some of the world's most dangerous trouble spots. Fleitz's central thesis is that U.S. policymakers threw this opportunity away. Instead of building on the Cold War victory with a foreign policy that combined the judicious use of force with enlightened national interest, the government decided to expand the United Nations' global role in peacekeeping. The Clinton administration's poorly thought-out liberal-internationalist agenda combined sanctimony, parsimony, and ineffectiveness in roughly equal measure. The consequences were bad for the U.N., in that they made a mockery of belief in that organization’s potential usefulness, and often disastrous for the U.S. There is a good reason that this book is dedicated to the U.S. Army Rangers and aircrew killed in Somalia in the terrible events of October 1993.

The rot began in the immediate aftermath of the Gulf War. As Fleitz explains, supporters of a more activist U.N. "seized on the fact that Operation Desert Storm was authorized by the U.N. Security Council" as proof that a new era had arrived. The U.N.'s role in approving the Gulf War was said by many liberals to herald "an end to the unilateral use of military force, at least by the United States." But as Fleitz correctly observes, "these claims ... ignored the reality that the first Bush administration used the U.N. endorsement... largely as a fig leaf to protect the sensitivities of America's Middle East allies."

These claims may have ignored reality, but they helped create a climate in which U.N. peacekeeping could be transformed. The scope of peacekeeping operations became more ambitious and the traditional requirements of consent and impartiality were abandoned. U.N. forces could now be empowered to impose "peace" on warring parties and, if necessary, take sides in a conflict. Fleitz argues that this more aggressive definition of peacekeeping (and the expansion of the U.N.'s role it implied) fitted in well with a liberal foreign-policy agenda in Washington. "It represented a way to implement . . . dreams of Wilsonian internationalism while drastically cutting defense spending." Beyond that, it is not necessary to hear the whirring of black helicopters to recall, as Fleitz does, that this was also a time when some foreign-policy gurus who were to be influential in the Clinton administration were "talking about how the new world order meant the lowering of national boundaries . . . and the beginning of a slow movement toward world government." It's also worth noting (although Fleitz never does so explicitly) that arguments for a more activist United Nations were always likely to find favor in a Clinton White House instinctively suspicious of the U.S. military and its use as an instrument of American power.

Much of the rest of the book is devoted to an examination of how these expanded notions of peacekeeping have worked or, far too frequently, failed to work. With topics that include Rwanda, Cambodia, Liberia, and Bosnia, this makes for grim but never sensationalist reading: Despite its title, this book is not an exercise in simple U.N.-bashing, satisfying though that would doubtless be. Fleitz is, quite justifiably, highly critical of the U.N., but he is also quick to acknowledge the way the organization has all too often been used as a scapegoat for feckless Western policymaking. And just as the book’s narrative is not sensationalist, neither is its style: The text is often highly detailed (this book will be found on the bookshelves of our more sensible universities for years to come) and weighed down by the fact that U.N. military operations are rich in acronyms if not in achievements.

Above all, Fleitz stresses that these fiascoes were nothing if not predictable. With the precondition of consent abandoned, U.N. peacekeepers ran the risk of being seen as an occupying or hostile force, even when the motives for their mission were primarily humanitarian. The umpires had become players. Despite that, the troops sent in to do the dirty work were often as under-equipped as their objectives were ill-defined. In the course of this book, the author offers up various reasons as to why this was, but touches only briefly on one of the most likely explanations: the fact that the U.N. has been used by Western elites to pursue an internationalist agenda that ordinarily would not secure domestic political approval in their home countries. Using the United Nations to this end is a clever trick, but it ensures that peacekeeping missions will almost always be shortchanged when it comes to resources; proper funding would require politicians to admit the full scope of these operations to their electorates. And voters are rarely enthused by the idea of endangering their soldiers in the name of the United Nations.

This absence of democratic accountability—and the level of blame it should bear for foreign-policy disasters—would make an ideal topic for Fleitz's next book. In the meantime, Fleitz offers some highly practical advice: Continue to use U.N. peacekeepers, but only along the lines of the traditional, limited model that used to work so well. Combine a return to that more modest approach with the adoption by Washington of a realistic foreign policy in which bien pensant internationalism is discarded, American interests are put first, and the isolationist temptation is avoided, and the results could be impressive.

It won't be easy, but an intelligent foreign policy never is.