Independence Forever


I like independently published books. Some of the best books I’ve ever read have been published in that way. No, I haven’t abandoned HarperCollins or Oxford University Press, despite their manifold and great errors of taste, judgment, and simple common sense. But there are lots of books that have fascinated me that could never have appealed to the trendy recent college graduates who function as “editors” in the normal publishing firm — young people who know what they like, and it isn’t very much.

Could Jane Austen get Pride and Prejudice published today? Not by one of them. Not with that weird opening of her book. Imagine, she actually starts out by saying:

It is a truth universally acknowledged that a single man in possession of a good fortune must be in want of a wife.

However little known the feelings or views of such a man may be on his first entering a neighborhood, this truth is so well fixed in the minds of the surrounding families, that he is considered as the rightful property of some one or other of their daughters.

“My dear Mr. Bennet,” said his lady to him one day, “have you heard that Netherfield Park is let at last?”

Mr. Bennet replied that he had not. “But it is,” returned she, “for Mrs. Long has just been here, and she told me all about it.”

Nope, that would be a nonstarter at HarperCollins. But I would read a book like that, any time I found one.

With these thoughts in mind, I was delighted to discover a new novel by Liberty author Russell Hasan, Rob Seablue and the Eye of Tantalus. A creepy Eye, spells that can turn light into knives, people with special skills that put them in danger from "normal people," technology that might, in the wrong hands, substitute for humanity, the drama of growing up, the contest of the self-described "have-nots" against the "haves," the intransigence of individual choice — what more could a libertarian novel reader want?

Well, he or she might also want wit, humor, a warm grasp on the mundane world (in this case, the world of adolescents), and, in a fantasy novel, a plausible but dramatic relationship between the mundane and the fantastic. All these Rob Seablue has. The obvious influence of Ayn Rand has not prevented Hasan from doing things his own way. I can't tell you more about that way without spoiling the plot for you, but the book is ingenious throughout and most ingenious at its end — ingenious, I might add, without losing plausibility. Actually, the story continually becomes more plausible, as well as more exciting.

This first novel belongs, to an unusual degree, to its author, who is his own publisher. You can say the same thing about William Blake, you know.

Rob Seablue is available, like almost all other books in the wide, wide world, from Amazon — in ebook format readable on Kindle or any PC, Mac, or smartphone using the Kindle app.

Another recent independently published book that I believe will interest Liberty readers is Philip Schuyler’s The Five Rights of the Individual. I’m not sure that I agree with Schuyler about all elements of his theory of rights. For one thing, I think that all rights are ultimately one, and behold, he has five! But that’s close enough, and I don’t think that many libertarian readers will quibble about the point.

What I especially like about Schuyler’s book is the rich context — historical, social, moral, and psychological — in which he places his rights theory. He informs us, for instance, that we live in an historical era in which the US government “makes 350 pages of new laws each day” — and if you don’t think that entails a gross violation of rights, then you’re a bloodless political “scientist” who cares about theories, not about where they lead. I found Schuyler’s commentary on the psychological and cultural formations that support or destroy individual rights especially interesting. And thank God, his book is clearly and engagingly written — something you can’t say about 99% of university press publications on this subject and its conceptual neighbors.

I would be very remiss if I didn’t remind readers of Liberty that another of our authors, Gary Jason, recently published a fine collection of essays, many of which first appeared in these pages. His book is an encyclopedic account of political, economic, and cultural issues that confront libertarians and classical liberals (but it’s much more fun than an encyclopedia). Gary’s beat is everything from the environment to the movies, and you can never predict what will interest him. I don’t always agree with Gary, and strangely, he doesn’t always agree with me. But I always learn something from what he writes, and as I turn the pages, I always look forward to seeing what he’ll do with his material. That’s the effect of a real author.

When I was a student, eons ago, if I ever laid eyes on a libertarian book I clutched it to my bosom, fearing it would be the last one I found. Times have changed. Today, libertarian ideas are actually discussed on TV! But good books are still . . . well, they’re still not exactly common. The three books I’ve mentioned are very good books, and as independent in thought as in their means of publication. Take a look at them.


Editor's Note: Reviews of "Rob Seablue and the Eye of Tantalus," by Russell Hasan (Amazon Digital Services, 2012, 230 pages); "Dangerous Thoughts," by Gary Jason (XLibris, 2011, 632 pages); and "The Five Rights of the Individual," by Philip Schuyler (iUniverse, 2012, 287 pages).





Cambodia: Not to Be Forgotten


The Nazis killed Jews, Gypsies, gays, Polish cavalry, retarded people, and assorted other specific groups, intending to annihilate them. The Khmer Rouge killed anyone and everyone, indiscriminately, to make “ecologically sound” fertilizer.

First, the raw materials for the fertilizer — human beings — were made to dig a giant trench. Second, they were made to kneel along the edge. Third, Khmer Rouge soldiers went from one to another “useless mouth” delivering a sharp blow with an axe to the nape of the neck — to save ammunition.

Over the first layer of bodies, rice husks would be spread, followed by a sprinkling of gasoline. This procedure would be repeated, layer upon layer, until the pit was full. It was then set ablaze. After the pit cooled, the bones were separated from the ashes, ground on giant mortars and pestles, then recombined with the ashes and packaged in jute sacks to fertilize paddy fields.

Denise Affonco, an ethnic Eurasian French citizen, was convinced by her husband, a Vietnamese Communist, to stay in Phnom Penh and welcome the liberators. She lost everything, including her entire extended family, except one son. Hers is a story of a miraculous four-year survival under the Khmer Rouge’s countryside resettlement policy.

What makes this book special is that there aren’t many Cambodian genocide survival stories in English. It is a miracle that the story has been written and published. Days after they arrived to liberate her, the Vietnamese insisted that she record an account of her four years in hell — and paid her to do so — to be used in a subsequent trial-in-absentia of Pol Pot and Ieng Sary. She did; and as an afterthought squirreled away a carbon copy of what she had written. Twenty-five years later, in Paris, she heard an academic opine that the Khmer Rouge did “nothing but good” for Cambodia. She then realized it was time to publish her account.

The book has the immediacy of something written on the fly. There are quite a few translation errors and run-of-the-mill typos, but they do not detract — you’ll not easily lay it down. Reportage Press is a small UK outfit. A portion of the proceeds is contributed to a scholarship fund, set up in memory of Affonco’s daughter, who died of starvation. The book is available from Amazon and Amazon UK.


Editor's Note: Review of "To the End of Hell: One Woman’s Struggle to Survive Cambodia’s Khmer Rouge," by Denise Affonco. Reportage Press, 2005, 165 pages.





Ron Paul: The Books


Two prominent libertarian authors, Walter Block and Brian Doherty, have just published books about the same important subject: Ron Paul.

Liberty thought it would be a good idea to ask each author to review the other. No one knew how this would turn out — but here are the results. — Stephen Cox

* * *

Ron Paul’s Revolution: The Man and the Movement He Inspired, by Brian Doherty. HarperCollins, 2012, 294 pages.

Reviewed by Walter E. Block

This is a magnificent book. It is riveting, hard to put down, informative. I experienced much of the Ron Paul phenomenon myself, up close and personal, yet I learned a great deal from Doherty’s explication. In another life, he must have been a safari guide to the deepest jungles, or an inspired travel guide to foreign lands, or a gifted sociologist. He takes us on a trip through the libertarian movement as brought to us by Dr. Paul as no one else has been able to do.

If you are a Ron Paul fan, or are interested in his foray into Republican and Libertarian politics, or even hate the man and want to be informed about him, this is the book to get. Its main drawback is that it was released on May 15, which means that Doherty must have finished writing it early in the year (he covers the Iowa caucus in its last few pages); but so much has happened since then, and without this author to put all these recent occurrences together for us, it just isn’t the same. This means that if Ron Paul becomes the next president of the US and appoints me czar of anything, I shall order Doherty to write a sequel to this important book of his.

Our author takes us on a historical tour of Ron Pauliana from his early days, to his medical career, to his beginnings in politics, his struggles as the Dr. No congressman, and his three campaigns for the presidency — one for the Libertarian Party, and two for the Republicans. But this book is far more than a biography. One of its many strengths is Doherty’s incisive knowledge of the libertarian movement in all its esoterica.

Others are his numerous vignettes of the people who have given of themselves — lost jobs, alienated friends and family members — in their support of Paul. Doherty also offers candid assessments of Ron Paul himself; we get not only the palpable love that Doherty feels for Paul, but also some of Paul's warts; e.g., he refuses to take lessons from professional speakers, he keeps his religious faith to himself, and he almost absolutely refuses to tailor his message to his audience (of course without violating his principles — what kind of a politician is that?) — things I didn’t fully appreciate even though I, too, am something of an intimate of Paul.


Doherty is not a professional economist. Yet his insights into the gold standard, budgets, the deficit, the debt, the fallacies of Keynesianism, the Austrian business cycle theory, the Fed, inflation, the Ponzi scheme of Social Security, the difficulties with socialized medicine, and much more — are clear and true. He is a journalist, not a libertarian theorist, and he is also insightful in his treatment of the niceties of legalizing drugs, the distinction between crony and real capitalism, the strengths and weaknesses of various “movement” organizations and leaders, "voluntaryism," anarcho-capitalism, and a host of other often complicated issues.

The dramatic highlight for me in this book was our author’s depiction of the Giuliani-Paul dustup about 9/11. I witnessed this myself, firsthand. And I read what was said about it, in the aftermath. Yet Doherty had me at the edge of my seat, practically panting with glee, as he once again described this dramatic event. Doherty is nothing if not a magnificent storyteller, and this gift of his pervades the book.

This is a strange review for me to write, for at roughly the same time that his book about Paul was released, so was mine. Doherty and I agreed to review each other’s books, and this is my contribution to the agreement. Although Doherty and I share a love for Ron Paul, our books are very different. I don’t interview anyone; Doherty's book is chock-full of interviews. In contrast to Doherty's, mine shares no personal experiences with Paul and Paulians. Mine is not at all historical. I do not give any tour of the libertarian movement, as he does. Instead, my book is in part an attempt to garner publicity for Paul. I wrote articles that later became chapters in the book about whom he might pick for Vice President and whom for Supreme Court, not so much because I thought there was a clear and present need for such speculations, but more as an attempt to promote his quest for the presidency. In the book, I feature groups such as Jews for Ron Paul, to combat charges that he was anti-Semitic, anti-Israel. I offer a few “Open Letters to Ron Paul,” where I have the temerity to offer him advice on, among other things, how best to deal with interviewers who simply will not allow him to speak.

Another part of my book features my sometimes, I admit it, pretty vicious attacks on people who “done wrong” to Ron Paul. These chapters are not so much aimed at liberals or conservatives, although I do take on a few of them. I can (sort of) forgive them their trespasses. What do they know about anything important after all? No, my ire was aroused to the boiling point by unwarranted criticisms emanating from libertarians, several with impeccable credentials in this philosophy. They, it seems to me, should have known better.

Let me close this review with two very minor criticisms of the Doherty book. For one thing, he (along with practically everyone else) characterizes the war of 1861 in the US as a “Civil War.” But ’twas not a civil war. That term pertains to the case in which one party wishes to take over the entire country at the expense of its opponent. The wars in Spain in 1936 and in Russia in 1917 were true civil wars. While the North in 1861 did indeed wish to rule the entire nation, the South did not. It only wished to secede. So a more accurate characterization would be, the War to Prevent Southern Secession, or the War Between the States, or the War of Northern Aggression.

Second, Doherty (p. 254) claims that what enraged Ayn Rand about the publication in the Freeman of Milton Friedman and George Stigler’s article “Roofs or Ceilings?” was that Friedman “was willing to grant the good intentions of his intellectual adversaries.” No, she was angry at Friedman and Stigler because of “a paragraph on page 10, which seems to suggest the authors agree with the goal of equalizing income.” Rand (very properly in my own view) called them “the two reds” (Snow, 2011). In the view of Skousen (1998): “Ayn Rand labeled the pamphlet ‘collectivist propaganda’ and ‘the most pernicious thing ever issued by an avowedly conservative organization’ because the economists favored lifting rent controls on practical, humanitarian grounds, not in defense of ‘the inalienable right of landlords and property owners.’” Miss Rand objected to Friedman-Stigler on both of the grounds just stated, and I concur with her on each.

But these are minor blemishes in an otherwise magnificent book. I loved reading it, and so will you, if you have even the slightest interest in Ron Paul and liberty.

References:
Skousen, Mark. 1998. “Vienna and Chicago: A Tale of Two Schools.” http://www.thefreemanonline.org/features/vienna-and-chicago-a-tale-of-two-schools/
Snow, Nicholas. 2011. “Making Sense of the Controversy.” February 22. http://www.fee.org/from-the-archives/making-sense-of-the-controversy/

* * *

Ron Paul for President in 2012: Yes to Ron Paul and Liberty, by Walter Block. Ishi Press International, 2012, 392 pages.

Reviewed by Brian Doherty

Libertarian economist Walter Block really, really likes Ron Paul, and thinks Paul ought to be (and thought when he wrote this book that he would be) the next president of the United States. As the title indicates, Ron Paul for President in 2012: Yes to Ron Paul and Liberty is a book of express, and strongly worded, advocacy. Block grants at one point that, well, libertarians can maintain their cred as true friends of liberty merely by not stabbing Paul in the back. But his general tone sells the message that anything other than pure adoration and belief in Paul’s eventual victory qualifies as such stabbing, and he writes that he sees support for Paul as “a sort of litmus test for libertarianism.” Anyone who does not share and express Block’s own thoughts and feelings regarding Ron Paul with precisely the same, or nearly the same, strength and commitment seems to be, in Block’s view, an objective enemy of libertarianism, and generally “despicable” (a favorite Block word for people or articles he thinks are anti-Paul).

Block’s new book is a collection of his articles and blog posts, most of which appeared at the website LewRockwell.com, and were written mostly over the course of Paul’s 2011–12 campaign. As Block writes in the book’s introduction, “Each and every last one of these chapters is an attempt . . . to expand and expound upon his [Paul's] views, to publicize them, to promote his candidacy, to defend it against attacks from within and without the libertarian movement.”

Block is a professor of economics at Loyola University in New Orleans by vocation, and by avocation the “Jewish mother” of what he sometimes calls the Austro-libertarian movement, the hardcore pushers of a Rothbardian plumbline of Austrian economics and anarchistic libertarianism. Here, this Jewish mother’s mission is to tell libertarians, and the world, that they need to push for Paul. Although Paul is not 100% by Block’s own standards — even Block admits the non-anarchist Congressman Paul is only a 97, and further admits to disagreeing with Paul on immigration and abortion — Block finds Paul’s rise in public prominence in his 2008 and 2012 campaigns the greatest thing that’s happened to the libertarian cause in, well, ever. Block believes that “the Texas congressman has acquainted more people with libertarianism, and converted them to this philosophy, than all of the other [libertarian thought leaders] put together.”

Block is well placed to judge these matters regarding the libertarian movement. He’s a grandmaster of modern libertarianism himself, fighting in the trenches of academic and popular writings on Austrian and libertarian issues for over four decades, since he was converted to Austrian economics at Murray Rothbard’s feet. He’s the author of the libertarian classic Defending the Undefendable, which rigorously argues for the legitimacy of such professions as the blackmailer, ticket scalper, slumlord, scab, and employer of child labor — professions that disgust many but that, as Block points out, aggress against no one, provide real economic value, and should not be interfered with by the state. That book’s purpose is not to be shocking, per se, but to be rigorously intelligent in identifying the legal and moral meanings of the modern libertarian project, and Block fulfills that purpose brilliantly. As F.A. Hayek, not nearly as hardcore as Block himself, said of the book: “Some may find it too strong a medicine, but it will still do them good even if they hate it. A real understanding of economics demands that one disabuses oneself of many dear prejudices and illusions. Popular fallacies in economics frequently express themselves in unfounded prejudices against other occupations, and in showing the falsity of these stereotypes Block is doing a real service, although he will not make himself more popular with the majority.”


Block tries to write, here as in all his popular writings, with a light hand. His version of lightness, though, often manifests itself as a very New Yorker-ish (not the magazine — a stereotypical New Yorker) heavy sarcasm, with bursts of manic silliness. But his point is serious, even when made with bludgeoning irony. The book contains defenses and explanations of Paul’s stances on discrimination law, environmental protection, the dangers of the Federal Reserve, and ending the drug war, among other issues. Block advises Paul, from afar, about how to conduct himself during debates, while wisely allowing that, given Paul’s tremendous success, he’s obviously already doing most things right: “It is unlikely that [his success] is in spite of his presentation style.” Block also indulges in some Paul fannish fun, such as skylarking about possible Supreme Court nominees or vice presidential picks for the congressman.

Since this book collects pretty much everything Block has written in the past four years that mentions Paul at all, it is a bit repetitive, and it sometimes drifts a bit into more general libertarian controversies, such as Block’s daring defense of accepting money and jobs from the government. Block believes that as long as you stand against statist policies, “the more money you take from the coffers of the state, the better libertarian you are.”

The book also contains Block debating or attacking other libertarians for falling short of Paulist standards; instances, he believes, are Randy Barnett’s pro-Iraq War stance, and Wendy McElroy’s disdain for any major-party political leader for the libertarian cause. Block often provides line-by-line eviscerations of other people’s writings that he found mistaken or insufficiently respectful to Paul, whether from libertarian or nonlibertarian sources. (Block regards one of my Reason colleagues expressing on TV the opinion that there was no way Paul would win the presidency — and with a look on her face that he found objectionable, to boot — as a firable offense. He regards an organization that would not do such firing as unworthy of the libertarian label or libertarian support. Reason, of course, did not fire her.)

Block may be read by some as too hero-worshipping of Paul, and unrealistically optimistic about his chances. (Block, for example, seems to think the probability of Paul’s victory can be calculated merely by assuming that every single GOP candidate has the exact same odds of winning.) But Block is objective enough to admit that despite his admittedly great success as a proselytizer for the cause, Paul is “not a leading theoretician, not a leading economist . . . not a leading intellectual” of the movement. So what is he? I think Block would agree with my assessment, as author of my own book about Paul and someone who has followed his career with interest and support since 1988, that Paul is a staunch student and fan of Mises and Rothbard who has learned and can transmit their lessons well, who found himself in the position — ironically through a major-party run for president — of selling radically anti-political libertarian ideas with greater efficiency and success than anyone else has managed for a very long while. Block is correct in thinking that Paul has been uniquely successful at his task, and most interestingly by finding a huge mass of normal Americans who never thought of themselves as libertarians before, or as anything specifically political at all.

Understanding what Paul did and said since 2007 ought to be of great interest to libertarians or students of libertarianism, or just students of American politics, and Block gathers a useful collection of information and arguments about the Paul movement as it happened, touching on many of the controversies that surrounded Paul, both within and without libertarianism. If one is a Paul fan seeking a grab-bag of commentary and explanations that is unabashedly pro-Paul — something difficult to find in the modern media environment — then he or she will at least have fun with this book, and likely learn a lot about some of the more complicated issues Block addresses, such as strict property-right libertarian environmentalism, and how to figure out, amid the maddening empirical complications of modern foreign policy issues, who is and who is not an initial aggressor, as opposed to simply a retaliator.

Readers not already 100% sold on Paul are likely to feel Block’s suspicion and even contempt radiating at them. But 10, 20, or 30 years from now, when people look back on what the Paul movement may have meant for American libertarianism, this book will be a valuable document of the excitement and manic energy that Paul’s presence inspired in many a libertarian, old and new.






The Most Decisive Battle of World War II?


World War II was a messy affair. In spite of its perception as “the good war,” for some prospective combatants picking a side before all hell broke loose required intense political calculation. Alliances just before, during, and immediately after the war were fluid and tenuous.

The decade before the war’s outbreak presaged the muddle. The Spanish Civil War pitted — by proxy — the recently established Italo-German coalition against Russia in a classic ideological struggle. Italy’s incursions into Africa, on the other hand, were purely hegemonic grabs for colonial territory. In the Far East the situation was more complicated. In 1931, Japan grabbed Manchuria for its natural resources. In 1937, when Japan invaded the rest of China, both Germany and Russia squared off against it by supplying arms and essentials to Chiang Kai-shek’s Nationalist government. The United States, which supplied 80% of Japan’s oil imports and most of its steel, continued to do so.

Hitler considered Britain a natural ally, while Britain despised the Bolsheviks. Stalin despised the western democracies and the fascists equally, negotiating for an alliance with both camps right up to the day of the signing of the Ribbentrop-Molotov Non-Aggression Pact on August 23, 1939 — which was only three days before Hitler’s planned invasion of Poland (delayed for six days by the signing of the Anglo-Polish mutual defense pact).

The muddle continued even after Hitler invaded Poland. Two weeks later, when Russia invaded Poland, Edward Raczyński, Polish ambassador to Britain — citing their mutual defense pact — appealed to Britain to declare war on the Soviet Union. Foreign Secretary Lord Halifax responded with hostility, stating that it was Britain's decision whether to declare war (a moot point, as a secret protocol of the pact identified only Germany as a prospective aggressor). Six weeks later, when Russia invaded Finland and the latter — unable to muster aid from the western democracies — out of necessity allied itself with Germany, Britain debated declaring war on Finland. Luckily, cooler heads prevailed.

As to Japan and Germany, their alliance was more a marriage of convenience than a pairing of soulmates. For one thing, Germany resented having to cede its New Guinea colony to Japan after World War I; for another, the Japanese — besides resenting Berlin’s aid to China — rejected Hitler’s racial policies, going so far as to declare publicly that Jews were not a problem. The Führer, in an uncharacteristic backtrack, announced, “I have never regarded the Chinese or the Japanese as being inferior to ourselves. They belong to ancient civilizations, and I admit freely that their past history is superior to our own. They have the right to be proud of their past, just as we have the right to be proud of the civilization to which we belong. Indeed, I believe the more steadfast the Chinese and the Japanese remain in their pride of race, the easier I shall find it to get on with them.” It wasn’t until November of 1939 — three months after Hitler’s invasion of Poland — that the two signed a cooperation pact, and nearly a year more before Japan joined the Italo-German Axis in the Tripartite Pact.


Russo-Japanese relations were awful and getting worse. Immediately following the Russian Revolution, Japan had contributed 70,000 troops to the unsuccessful Anglo-American effort to overthrow the Bolsheviks. Earlier, in 1905, the Japanese had decisively defeated Russia in the Russo-Japanese War. By 1937, Japan was eyeing Siberia as a natural extension of its Manchurian and Chinese incursions. Stalin treated the island kingdom gingerly.

With Europe on the brink of war, his worst nightmare was the prospect of a two-front conflict. Japan did not reciprocate: it hated the Bolsheviks. Much of its contempt was caused by Stalin’s purges, which had castrated the Red Army. On June 12, 1937, Marshal Mikhail Tukhachevsky, the guiding spirit behind the modernized Soviet army, together with seven other high-ranking generals, was shot. Stuart Goldman, author of Nomonhan, 1939, elaborates,

Of the five marshals of the Red Army, three were shot, as were all eleven deputy commissars for defense. Seventy-eight of the eighty members of the Military Collegium perished. Every military district commander was liquidated, as were the heads of the Army Political Administration and the Frunze Military Academy. Of the fifteen army commanders, only two survived. Fifty-seven out of eighty-five corps commanders were shot, as were 110 of the 195 division commanders. At the brigade level, only 220 of the 406 colonels survived. In the Soviet Far Eastern forces the attrition rate was even higher, with 80% of the staff being removed in one way or another. According to some sources, between one-fourth and one-third of the entire officer corps was executed, or discharged within a period of eighteen months.

To the Japanese government, by now controlled by the military, the annihilation of the Soviet professional officer corps was heretical — and an open invitation to invade Siberia.

* * *

While the regimes of Hitler, Mussolini, and Stalin are well known and understood, Japan’s descent into military dictatorship and war was an enigma wrapped in a snowball set rolling by circumstance, without any one charismatic character leading the way.

During the last half of the 19th century, Japan had developed a parliamentary democracy under an emperor — revered to the point of veneration — as head of state. The Great Depression, which hit Japan early, in 1927, strained operations of government, already in disrepute because of widespread corruption, nearly to the breaking point. Frustrated by the Diet’s ineffectiveness, the military’s officer class dove into politics and pushed for decisive action — despite both an imperial prohibition and traditional samurai custom. They held a trump card. As Goldman recounts, “An Imperial Ordinance dating back to 1900 stipulated that the army and navy ministers must be active-duty generals and admirals. Either service could thus cause the government to fall simply by withdrawing its service minister and refusing to put forward a replacement. By the late 1930’s, this expedient effectively brought civilian government under military control. Before long, generals and admirals themselves headed the government.”

For Japan, many factors, including both gekokujo — literally, “rule from below” — and bushido — “the way of the warrior” — produced a perfect storm. The government’s inability to deal effectively with the deteriorating economic situation was aggravated by Prime Minister Hamaguchi Osachi’s ratification of the London Naval Treaty of 1930. By this treaty, Japan accepted a ratio of 10:10:6 for American, British and Japanese heavy cruisers respectively — in spite of vehement opposition by the Navy General Staff, the Supreme War Council, the major opposition party, the Privy Council, countless nationalist societies, and much of the popular press. Six weeks afterward, Hamaguchi was assassinated. This was the first of a series of murderous assaults and coup attempts that prompted an American journalist to characterize the situation as “government by assassination.”


Gekokujo is a Japanese concept that encourages action, initiative, and even principled disobedience in the application of moral ideals — especially if those ideals derive from bushido, Shinto, or Buddhism. It became the driving motivation for the political upheavals of 1930s Japan. Coupled with another Japanese custom, that of considering direct orders an impropriety — a practice to which even commanding officers adhered — it became a justification for subordinates to ignore superiors’ “orders” (which, grammatically, were structured as “suggestions”), and act as they saw fit. While the top brass controlled the government, gekokujo controlled the lower ranks in a self-reinforcing feedback loop that aggravated every contingency beyond anyone’s control.

* * *

The battle of Khalkhin Gol (Khalkhin River) — known in Japanese as the Nomonhan Incident — was a direct consequence of gekokujo. It was one of the largest battles of World War II, and perhaps the most decisive one — except that it technically did not take place during World War II, or between declared combatants. It is the subject of Stuart D. Goldman’s Nomonhan, 1939: The Red Army’s Victory that Shaped World War II. Though based on a PhD dissertation, it is a splendid book, gripping and well researched. It anticipates every question a reader might have, and answers it with context — a quality not uniformly present in historical narration.

Goldman sets the stage with an analysis of the global geopolitical calculus before the war, explores each country’s constantly adjusting foreign policy, then zeroes in on why Soviet-Japanese relations led to the conflict at Khalkhin River. The undeclared war — a series of confrontations spread over two years, involving nearly 150,000 personnel, and culminating in a massive battle near the village of Nomonhan — is brilliantly laid out, from the diplomatic to-and-fros, to battlefield minutiae, to individual soldiers’ anecdotes, to follow-ups of the principal and minor characters during WWII and afterward (with Georgy Zhukov, later to become Marshal of the Soviet Union, Chief of the General Staff and Deputy Supreme Commander of Soviet forces, to the fore).

By 1937, Japan’s Kwantung Army, which in 1932 had conquered and occupied Manchuria (renamed Manchukuo), was bored and feeling its oats. In the interim, Japan’s Army General Staff (AGS) had been contemplating whether to extend the Manchukuo salient into Siberia, conquer the rest of China, or move south into Indochina. In June 1937, Kwantung took the initiative. Without notifying the AGS, it undertook a series of provocations along the Soviet-Manchukuoan border in an attempt to settle by force previously unsettled minor border alignment issues, with an eye to testing Soviet military resolve and gaining honor. The AGS had decided on a full-scale invasion of China proper, which it duly launched the following month. Faced with Kwantung’s provocation, the AGS was of two minds, and temporized. The result was a two-front war. Japan didn’t want that war, but still thought it could contain it if it played its diplomatic cards with the USSR adroitly.


But Kwantung Army thought it knew better. Instead of heeding the AGS’s orders for restraint — phrased as suggestions — it escalated its thrusts into Soviet-dominated Mongolia. The deck was stacked against Stalin. Though the Soviet Far Eastern forces numbered half a million men, they were spread over a remote area two-thirds the size of the continental US, and hobbled by poor support and transport, including more than 400 miles of trackless terrain between Nomonhan and the nearest railhead (at Borzya in Siberia). Worst of all, the purges had demoralized the Soviet army. Kwantung Army, on the other hand, though numbering only 220,000 men, was bursting with pride and martial spirit from its recent victories, and was concentrated nearby, well-supplied by the South Manchurian Railway’s salient, which reached almost all the way to Nomonhan, yet was close enough to Japan to be reinforced quickly.

On June 1, 1939, Georgy Konstantinovich Zhukov, a young deputy commander in Minsk, received an urgent phone call summoning him to a meeting with Kliment Voroshilov, Commissar for Defense. Zhukov betrayed no sign of apprehension at the possibility of joining the ranks of the disappeared. He was a bull: stout, blunt, crude, and short-tempered; given to drink, accordion playing, and convivial singing; overbearing but exceptionally brave. He was one of the few to survive multiple disagreements with Stalin, and he had a reputation as a man who could get things done. He was also — before the German blitzkrieg — an early proponent of tank warfare, a technique first used during the Spanish Civil War but discontinued because of its ineffectiveness in that conflict’s urban and guerrilla theaters. Khalkhin Gol, on the open plains of Mongolia, was a better laboratory. Voroshilov ordered Zhukov to take command of the First Soviet Mongolian Army Group and contain the Japanese incursions.

Zhukov amassed a fleet of 4,200 vehicles to ferry troops and materiel from the railhead at Borzya to Tamsag Bulak, a small village within striking distance of the battlefield. The trucks moved only at night, with their lights blacked out. Meanwhile, to ensure tactical surprise for the Soviet attack, Zhukov concocted an elaborate ruse, setting up a sophisticated sound system between Tamsag Bulak and the battlefield to simulate the noises of tank and aircraft engines and of heavy construction. This long, loud nightly performance was meant to give credence to the false messages (in easily decipherable code, and meant to be intercepted) referring to the construction of defensive positions in preparation for a prolonged autumn and winter ground-holding campaign.

At first, the Japanese were fooled, and fired in the general direction of the loudspeakers. After a few nights, however, they realized it was only sound effects, became accustomed to the nightly “serenade,” and tried to ignore it. On the eve of the Soviet offensive, the sounds of actual pre-attack staging — which included bridges across the Halha River (Khalkhin Gol), deceptively built about 10 inches underwater, so they couldn’t be seen — went largely unnoticed by the Japanese.

Zhukov’s attack was preceded by an artillery and bombing barrage that no one, anywhere, at any time, had ever experienced. At one point — for three solid hours — an average of two heavy artillery rounds per second rained continuously on the Japanese positions. By the third day of this saturating fire, Japanese soldiers, who already had a reputation for superhuman endurance and never surrendering, were going insane. On August 20, Zhukov’s cavalry — tanks and infantry — charged. By August 31, Zhukov had declared the disputed territory cleared of enemy troops.


The Soviet victory was absolute. Japanese casualties totaled 48,000; Soviet casualties, 26,000 — a very reasonable ratio. Nevertheless, the Red Army was gaining a reputation for troop attrition. Zhukov did not flinch from incurring heavy casualties to achieve his objectives. After the war, he told General Eisenhower, “If we come to a minefield, our infantry attack exactly as if it were not there. The losses we get from personnel mines we consider only equal to those we would have gotten . . . if the (enemy) had chosen to defend the area with strong bodies of troops instead of mine fields.” In the Winter War against Finland — a scant three months later — Russian techniques for crossing mined territory had been refined. Lacking, or eschewing, conventional sappers, Soviet commanders would deploy a single line of infantrymen, elbows interlocked, backed by NKVD snipers, across the mined field — singing patriotic songs to steel their courage.

* * *

Goldman argues that the consequences of the Soviet victory at Nomonhan reached far beyond Mongolia: from Tokyo to the Battle of Moscow and to Pearl Harbor. The timing of the Khalkhin Gol defeat coincided with the signing of the Ribbentrop-Molotov Pact. The Japanese felt betrayed and diplomatically isolated. Defeated by the Red Army and deserted by Hitler, the government of Premier Hiranuma Kiichiro abruptly resigned.

In spite of Zhukov’s decisive victory, Stalin didn’t trust the Japanese — and with good reason. Like the Black Knight in Monty Python and the Holy Grail, Kwantung Army was dismembered but foamingly rabid, raring to mount a full invasion of Siberia to regain lost face and honor. It went so far as to notify the AGS to “kindly be prepared to mobilize the entire Japanese Army to engage in the decisive struggle against the USSR in the spring.” So Stalin reinforced Soviet Far Eastern Forces with 1.6 million men.

But the top brass at AGS had learned their lesson. They not only decapitated Kwantung’s command; they decided to phrase orders as “orders,” instructing Kwantung to assume a strictly defensive posture. And they reassessed imperial objectives. The thrust north into Siberia was shelved; instead, they set their sights on Indochina as a possible venue for breaking the increasingly stalemated China war by opening up a southern front against Chiang Kai-shek. This decision, logical in the short term, proved the Axis’ ultimate undoing.

It took nearly a year for all the contributing factors to fall into place. For one, Japan hadn’t yet joined the Axis (and wouldn’t for another year). Additionally, it took some time to convince Stalin that Japan was no longer a threat — in spite of his having a spy, Richard Sorge, in the highest levels of the Japanese government. How a Caucasian infiltrated the extremely ethnocentric Japanese high command is another story; but he did, and his intelligence was of the highest caliber. Very slowly, Stalin came to realize that Japan would not be a threat to his eastern flank.

His first move came two weeks after Zhukov’s victory, with the signing of the Molotov-Togo truce, terminating hostilities at Nomonhan. The reason Stalin didn’t invade Poland in conjunction with German forces was that he was waiting for a resolution at Khalkhin Gol. It wasn’t until the day after the cease-fire went into effect at that location that he gave the Red Army the go-ahead to grab eastern Poland. Finally, a year and eight months later, in April of 1941, Japan and the Soviet Union signed a neutrality pact.

Two months later, in June 1941, Hitler invaded the USSR, a move that took Stalin completely by surprise — but which Zhukov had predicted. By late summer, the German army was threatening Moscow. Stalin took a do-or-die stance: he entrenched himself in the capital, declaring that he was “going to hold Moscow at all costs”. As Averell Harriman, US Ambassador to the Soviet Union, later stated, recalling a conversation with Stalin, if Moscow — the nerve center of the USSR — fell, the Soviet Union would likely have capitulated.

“By early autumn, some Western military experts were predicting the collapse of Soviet military resistance within a matter of weeks,” Goldman states. Then, in September, Sorge reported that Japan would “absolutely” not attack Siberia. Only then did the Soviet High Command transfer the bulk of the 1.6 million men stationed in Siberia from east to west for the defense of Moscow. By December 1, German forces were only 12 miles away. It was then that “the Siberians” came to the rescue.

On December 5, Zhukov, who had been put in charge of the Kiev Military District after Khalkhin Gol and was now in charge of the defense of Moscow, launched a massive counteroffensive, spearheaded by the Far Eastern reinforcements. He threw the Germans back about 100 miles and held them there through the winter. It was the first Soviet success since the German invasion.

Two days later, Japan attacked Pearl Harbor.

For Goldman, these two events — direct consequences of the Battle of Khalkhin Gol — were the turning point of the war, rather than the Battle of Stalingrad (February 1943). He connects the dots between Khalkhin Gol and Pearl Harbor in this way: in July 1941, while the Germans were blitzing toward Moscow, Japan invaded Indochina — as per the AGS’s post-Khalkhin Gol plan. In response, the US and Britain cut all oil sales to Japan, over 80% of which came from the Anglo-Americans and their allies. The embargo was meant to stop the Japanese war machine; and it would have gone further, throttling the entire Japanese economy. To the Japanese, this was intolerable. The closest oil source was in the Dutch East Indies, modern day Indonesia. But they believed that if they attacked Indonesia, the US would enter the war. So, against the judgment of many of their senior commanders — based on the estimate that US industrial strength dwarfed Japan’s by a factor of 10:1 — AGS decided on a preemptive strike against the US fleet. It was a decision that one Japanese general presciently termed suicidal. The rest, as they say, is history.

* * *

Josef Stalin was the only major WWII combatant to avoid a two-front war. Throughout the first years of the war he’d badgered his allies to invade Europe, and at the February 1945 Yalta conference he, in turn, was pressured to declare war on Japan. He agreed to do so, but only three months after Germany's capitulation. This would allow him several months to transfer sufficient Red Army forces from Europe to the Far East.

At midnight August 8, exactly three months after VE day, and two days after the atomic bombing of Hiroshima, Stalin delivered: the Red Army launched a massive invasion of Manchukuo — against Kwantung Army.

Many perceived Stalin’s move as a cynical grab for spoils. But at Yalta, Stalin had been unaware of the Los Alamos efforts; the war against Japan was nowhere near concluded; and his commitment to open up a Siberian front was a substantial undertaking, made in good faith. After Hiroshima, however, he did take advantage of the situation, trying to reclaim territory lost to Japan in the 1905 Russo-Japanese War — principally, Sakhalin and the Kurile Islands. Though Emperor Hirohito, on August 15, “ordered” (again, phrased in an oblique manner) Japan’s surrender, the Soviet advance continued down Manchuria, into Korea, and across to the off-lying islands. Some 600,000 Japanese troops surrendered and were marched north into the Gulag.

On September 2 Japan formally surrendered. Japan later concluded separate peace treaties with all the victors except the Soviet Union. There has been no formal peace treaty between Japan and the USSR or its successor, the Russian Federation. Russia’s occupation of the Southern Kuriles continues to poison relations between the two countries.

* * *

The Japanese Army General Staff’s decapitation of Kwantung Army did not dampen gekokujo or bushido. These qualities merely spread and entrenched themselves further. Kwantung’s high command had been punished with only slaps on the wrist: transfers and early retirement — no courts-martial. Mid-level commanders stayed put or were transferred.

Throughout the war Japanese soldiers gained a reputation for fanaticism, for never surrendering, and for suicide attacks. Even after Hirohito’s “order” of capitulation, a radio announcer tried to clarify: the emperor’s message actually meant that Japan was surrendering. But Imperial General Headquarters did not immediately transmit a cease-fire order. When it did, some thought it was a call for further sacrifice; others did not understand it or ignored it.


Second Lieutenant Hiroo Onoda exemplified Japanese moral values. (On Onoda, see his No Surrender: My Thirty-year War, Kodansha International Ltd, 1974 — another good book.) He was stationed on Lubang Island in the Philippines in 1944. Onoda's orders stated that under no circumstances was he to surrender or take his own life. So he held out, and held out, and held out. Thirty years later, in February 1974, Norio Suzuki, a Japanese adventurer on a quest for Lieutenant Onoda, a panda, and the Abominable Snowman, in that order, discovered him, befriended him, and urged him to come home. Onoda refused, citing his orders.

When Suzuki returned to Japan, he contacted Major Yoshimi Taniguchi, Onoda's commanding officer — by then a bookseller. When Taniguchi finally found Onoda, he couldn’t convince him to give up his position until he phrased his mission as an order following strict military protocol. Onoda came in from the heat on March 9, 1974. As of 2012, Hiroo Onoda is still alive and living in Brazil.


Editor's Note: Review of "Nomonhan, 1939," by Stuart D. Goldman. Naval Institute Press, 2012, 226 pages.





Batman and Business


Business is bad in Hollywood, and I'm not talking about the box office receipts. Businesspeople have been portrayed as bad guys in movies for the past several decades. When an audience member asked about this trend during the "Liberty in Film" panel at the Anthem Libertarian Film Festival last month, Hollywood biographer and insider Marc Eliot dismissed it with a wave of his hand. "It's just a shortcut," he explained. "When you see a businessman on the screen, you know it's the villain. It just streamlines the story."

As moderator of the panel, I agreed with him that these shortcuts are probably not intentionally sinister; in fact, the technique goes all the way back to Aesop, who used it in his fables. "If a character was a dog, you knew he would be loyal," I acknowledged. "A fox would be cunning. A crow would steal. In the old days," I went on, "a black hat meant 'bad guy' and a white hat meant 'good guy.' But shortcuts are dangerous and unfair when we're talking about whole groups of people." I specifically referenced the "shortcuts" of earlier generations of filmmakers: blacks were clowns; Indians were ferocious; women were weak. I suggested the danger of having a new generation automatically think "villain" when it sees a businessperson. The problem is that these characters often mirror and perpetuate basic prejudices within a culture. Onscreen stereotypes lead to real-life prejudices.

Panelist Gary Alexander added this biting criticism: "Using shortcuts is just plain lazy." It's true that filmmakers have always used stock characters as shortcuts to storytelling, and they probably always will. But that doesn't mean we have to accept them.

The silver lining to this clouded silver screen is that these shortcuts can be changed. The challenge for filmmakers is to break away from them and create independent characters who can surprise and satisfy. Just as filmmakers of the ’60s, ’70s, and ’80s deliberately challenged black and female stereotypes by casting against type and writing untraditional storylines, so libertarian filmmakers today need to write screenplays that challenge and overturn the stock business villain. These characters need to be portrayed in the rich, three-dimensional diversity that exists in the real world, where some business people are admittedly bad but others are surprisingly (to filmgoers) good.


This actually happens in The Dark Knight Rises, the latest entry in the Batman franchise. It's subtle, but it's clear: although there are some bad businesspeople in the film, there are just as many good ones, smashing the stereotype and insisting that viewers look past their stock expectations. For example, when Bruce Wayne (Christian Bale) discovers that his homes for at-risk and orphaned boys have not been funded for two years, he confronts his trusted friend and protector, Alfred. "The homes were funded by profits from Wayne Industries," Alfred sadly explains. "There have to be some." That’s a reminder to Bruce, who has been in a deep funk since his girlfriend died, that his neglect of his company has had wide-ranging effects. Bruce — and the audience — are thus informed that "excess profits" are a good thing. They can be used for doing good works, if that is the business owner's goal.

Similarly, in another brief interchange the audience is told that everyone is affected by the stock market, whether they own stocks or not. I don't think I'm giving away too much to tell you that, early in the film, the bad guys break into the stock exchange. The chief of police is unconcerned about the consequences of a financial meltdown, arguing that the average person saves his money under a mattress and doesn't care about what happens to the stock market. The head of the exchange tells him, "If this money disappears, your mattress will be worth a lot less." A simple truth, simply stated.

Later, Bruce Wayne teams up with Miranda Tate (Marion Cotillard), the head of another corporation, and she voices similar truths about the free market. "You have to invest to restore balance to the world," she tells him, acknowledging the importance of capital investment and private enterprise. And when he looks around at a lavish business party she is hosting, she tells him, "The proceeds will go wherever I want, because I paid for the spread myself." Even Ayn Rand would likely approve this self-interested heroine who understands the value of business.

Meanwhile, Catwoman (Anne Hathaway), one of Batman's archenemies, looks around at Bruce Wayne's huge estate and growls jealously, "You're going to wonder how you could live so large and leave so little for the rest of us." What a reversal of stereotypical shortcuts! A businesswoman who expresses the proper role of business, and a burglar who reveals her petty jealousies. Bravo, Christopher Nolan!

Cinematically The Dark Knight Rises delivers all that was promised in the weeks and months building up to its release. Christian Bale's troubled Bruce Wayne lifts the character far above the comic book hero created by Bob Kane and trivialized by the Adam West TV series in the ’60s. Gone, too, is the sardonic humor injected by George Clooney's portrayal in the ’90s. This Batman is a reluctant savior of a world that has largely misunderstood and rejected him. While he has a few ardent supporters, most consider him a traitor and want him destroyed. He is briefly tempted away from his mission by the love of a woman. He suffers indescribable agony in a dark prison at the hands of a monstrous villain named Bane (Tom Hardy) — the "bane" who wants to destroy the world. Despite his reluctance, Bruce accepts his arduous task. In short, he is a classic Christ figure, adding gravitas to the modern myth of Batman. He even says at one point, "My father's work is done."


But while the characters are rich and well acted, the story is interesting, Hans Zimmer's musical score is powerfully compelling, and the final hour is particularly thrilling, it was difficult to watch this film. Action movies have always provided an opportunity to enter another world, suspend one's disbelief, enjoy vicarious experience, then step back into the real world where "things like that" don't really happen. But in light of what did happen in Aurora, Colorado on opening night, I found it almost impossible to separate myself from the barrage of onscreen shooting in the first half hour of the film. It seemed devastatingly real because I knew it was during this scene of heartless shooting in a very public location that the actual shooting began. I was almost ashamed to be there, seeking a few hours' entertainment from a film that was the unwitting stage for such terror.

I also found myself looking around the aisles and corners of the theater, watching for suspicious characters and devising an escape plan. This was partly because I had to display the contents of my purse to a uniformed employee before entering the theater. I hope that fears like this dissipate for everyone. And I hope that a TSA-style MSA (Movie Safety Authority) does not take over our malls and movie theaters.

Spoiler alert — read the next paragraph only if you have already seen this movie, or if you have no intention of ever seeing it:

The film ends with an "aha" moment that is so thrillingly unexpected that, when I saw it, the entire audience gasped in disbelief. But I should have known from the beginning. Marc Eliot explained it to us in the “Liberty in Film” panel, and he was right: Hollywood uses shortcuts to tell us who the bad guy is. Even when a writer-director is planning the most delicious of twists for the end, he is helpless against his own Hollywood instincts. Nolan telegraphed it from the start: In modern movies, the business owner is always the bad guy. Even when you least expect it.


Editor's Note: Review of "The Dark Knight Rises," directed by Christopher Nolan. Warner Brothers, 2012, 164 minutes.





Still Entertaining, After All These Years


What is with all these superhero movies? Iron Man. The Hulk. Captain America. Thor. Do we really need yet another version of Spider-Man? Okay. We get it. Peter Parker gets bitten by an enhanced spider while visiting a science lab. His uncle is killed by a criminal whom Spidey could have stopped if he hadn’t been self-absorbed. He's misunderstood and mistreated. He gets the idea for his costume from a wrestling match. And he can't have a girlfriend because he has to save the world. The story has become so familiar, it isn't even Amazing anymore. So why do we have a new Spider-Man every other year?

The cynical answer is that superheroes are box-office gold. But I think there is more to the superhero craze than simple economics. Every culture has its myths: larger-than-life stories that reveal the community's values, hopes, and fears. Superheroes are the American equivalent of the Olympian gods. Like the Olympians, they have human desires and human foibles. They can be lusty, angry, vengeful, and capricious. But today's superheroes are quite different from the gods of old. They no longer want to be worshipped. In many respects, they just want to be left alone.

Much can be revealed about our evolving culture by examining the evolving superhero. The latest version of The Amazing Spider-Man is quite good. The special effects of Spidey flying through the sky, somersaulting onto ceilings, and hanging from buildings are — OK, you knew it was coming — amazing. His arch nemesis, a lizard-man mutant, is well-developed and complex. The story is satisfying, amusing, and tense, especially in 3D. The casting is superb, especially Andrew Garfield as the new Peter Parker. His gangly youthfulness and spindly physique evoke the angular appendages and lightning speed of a spider. He’s cute, but somehow creepy and unpredictable too.

Even more interesting, however, are the metaphoric and mythic underpinnings of the new story. In many ways the superhero is a metaphor for adolescence. It's no coincidence that Peter is experiencing his first romance at the same time that his body is developing new powers and abilities. He is literally growing new organs, with goo that shoots out of his hands unexpectedly when he gets excited. Like many teens, he doesn't know his own strength, slamming doors and breaking handles with his new muscles. Moreover, he is self-absorbed and self-interested, experiencing pure joy in his own new powers. Superheroes of the previous century had an innate, almost Christlike sense of mission and nobility, but today's young superheroes revel in their newfound abilities. Like the teen mutants in February's Chronicle, Peter reacts joyously as he combines strength, speed, and gymnastic agility to fly from the rafters and swing from the buildings.

Myths always include a conflict between good and evil. A close look at mythic heroes and villains will therefore reveal much about the cultural fears and character values of a generation. In the original comic, Peter is bitten by a spider that has been exposed to radioactive particles. Like other mid-century science fictions, Spider-Man embodied a generation’s fear of the atomic bomb and radiation. In the 2012 version, the laboratory is studying interspecies genetic engineering, revealing a new generation’s fears about the unintended consequences of genetic meddling.

Another mythological mainstay is the quest for self-discovery. A moment of such discovery occurs early on, as Peter enters his English class. His teacher tells her students, “A professor once told me that there are only ten stories in all of fiction. I contend that there is only one: ‘Who am I?’” She may be wrong about the number of storylines, but she is certainly right about the importance of self-discovery in literature. It reaches all the way back to the fifth century BC and Sophocles’ foundational play, Oedipus Rex. Oedipus discovers who he is by discovering who his parents were. The current story also starts as a quest for self-discovery, as Peter sets out to uncover secrets about his father.

When Peter settles down to thwarting criminals, his motives are far from altruistic. He is no Superman, fighting for “truth, justice, and the American way.” He just wants to find the man who killed Uncle Ben, and if he ties up a few other criminals along the way and leaves them for the cops to arrest, so much the better. Eventually, however, he accepts his mission to fight crime and protect his community. Could we really expect to see a superhero who is not expected to “give back”? As a voice from the dead, Uncle Ben tells Peter that when you are given a great talent, you have to share it with the world.

But this time he doesn’t have to do things all alone. Most revealing is director Marc Webb's treatment of the community at large — the people of New York whom Spider-Man is trying to protect. Unlike the inhabitants of Superman's Metropolis or Batman's Gotham or the Avengers' Manhattan, they don't stand around looking up and pointing while the superhero does all the work. They get involved, helping Spidey help them. I love this newfound push toward self-reliance, even if it is a self-reliance that “takes a village.” (How I hate that metaphor!)

Overall, Webb has created a satisfying new version of this cinematic mainstay. I still don’t know why we needed a new one, but it held my attention, even though I know the story inside and out. Myth has a way of doing that.


Editor's Note: Review of "The Amazing Spider-Man," directed by Marc Webb. Columbia Studios, 2012, 136 minutes.





Prometheus Redux


Prometheus is the most libertarian of the Greek gods. His name has been used to signify choice and accountability in such stories as Mary Shelley’s Frankenstein; or, The Modern Prometheus; her husband Percy Bysshe Shelley’s Prometheus Unbound; Ayn Rand’s Anthem, in which the protagonist renames himself Prometheus; and even in this publication, whose web address, libertyunbound, alludes to the Greek myth.

In that story, Prometheus and his brother Epimetheus, both Titans, are given the task of creating man and endowing him with gifts. The animals are created first, and Epimetheus is a bit too generous, bestowing all the talents and skills (courage to the lion, strength to the ox, cunning to the fox, sagacity to the owl) before man comes along. What to do about this blunder? With the aid of Athena, Prometheus flies up to the sun and steals a bit of fire, bringing it back as a gift to man.

This gift would truly set humans apart from the rest of the animal kingdom. With fire they could warm their houses, cook their food, forge tools to cultivate the earth, create art and musical instruments, and coin money to make commerce and wealth possible. They could also make weapons of defense. In short, they could become independent and self-reliant. But eventually those weapons of defense would become weapons of war, bringing such wickedness to the earth that Zeus would be compelled to destroy it with a flood and start all over again with a new founding family.

Zeus and the other gods, whose power comes from the adulation of humans, are not happy. As a punishment, the gods do two things. First, they form a woman named Pandora and give her to Epimetheus, along with a box from which Pandora releases sorrows and misfortunes into the world, misfortunes that will cause humans to turn to the gods for help. Second, Zeus has Prometheus chained to a rock, where an eagle comes each day to eat his liver. The liver grows back overnight, only to be eaten again.

It is helpful, though not entirely necessary, to know this background when seeing Prometheus, the long-anticipated prequel to the Alien (1979) / Aliens (1986) / Alien 3 (1992) / Alien Resurrection (1997) tetralogy. Those films put Sigourney Weaver on the map as one tough mama and opened the casting door for women to become Hollywood action heroes. While the film does not adhere slavishly to the myth, there are enough allusions to make it satisfying intellectually even though it is mostly a science fiction thriller.

As Prometheus opens, the camera pans along what appears to be primordial Earth: uncultivated shrubbery emerges from rich, black, volcanic rock as water pours through fissures in canyon walls. The camera pans up to a gigantic waterfall that seems to be the source of life itself. (Iceland, I must say, provides the perfect location for a pre-human Eden!) At the top of the falls sits a man-like being. In his hand he holds a black and red substance. He hesitates with what appears to be a look of sorrow, and then he eats the substance. His body changes — veins appear in his skin — he seems to become mortal — then he crumbles and falls, as his DNA spills like atoms into the water. Watching this, I couldn’t help but think of Persephone banished to Hades for eating the forbidden pomegranate seed, Adam becoming mortal in the day he ate the fruit of the tree of knowledge, and Prometheus suffering eternal punishment for bringing life-giving fire to mankind. “Adam fell, that men might be” (2 Nephi 2:25), I thought, as the being fell, literally, into the waterfall. Powerful.

The rest of the film is a satisfying return to the Alien franchise, with all the expected elements. Aliens burst from stomachs (note the allusion to liver-eating here). Wise-cracking rocket drivers crack their last laugh. Space scientists hide from monsters in darkened shafts. And one strong, independent woman does her best to save the day. This time, archeologists Elizabeth Shaw and Charlie Holloway (Noomi Rapace, Logan Marshall-Green) have discovered evidence of the origin of earth life — and it isn’t evolutionary amoebas. They believe they can trace the path of that original goo-eating Earth visitor back to his planet of origin, and there discover important truths about humankind.

What they find relates to the second part of the Prometheus myth — Zeus’s decision to destroy Earth’s warmongering civilization. The space travelers discover that aliens have been stockpiling gallons of the goo as a weapon of epic destruction, and their navigation system is targeting Earth. They have simply been waiting for humans to become smart enough to reach the founder gods, Prometheus style, and bingo — liftoff. Once again, Earth’s safety lies in the hands of a feisty, self-reliant, courageous, and in this case quasi-religious woman — Elizabeth is a crucifix-wearing Catholic who isn’t quite sure what that means.

Even with a slew of stomach-ripping aliens on hand, no modern blockbuster would be complete without a cold, heartless, corporate rep. Prometheus supplies two of them, in the guise of Meredith Vickers (Charlize Theron) and her boss, Peter Weyland (Guy Pearce), who is funding this mission not for its potential contribution to science or humankind, but for selfish personal reasons. Of course. (Interestingly, Weyland’s company logo is a triangle, perhaps suggesting the Trinity. That would go along with Elizabeth’s crucifix.) Unfortunately, the always wonderful Guy Pearce is wasted here under a gallon of age-creating prosthetics; if they wanted an old man in his role, why not simply hire an old man to play it? Unless flashbacks have been planned for the next installment of this prequel, there was absolutely no reason for this casting. As for Theron — she plays the cool queen magnificently.

The true stars of this film are Michael Fassbender as David, the lifelike robot servant of the crew, whose name suggests that either a Goliath slayer or a Messianic king (or both) is coming somewhere along the way of this new trilogy; and Rapace, who steps into Sigourney Weaver’s moonboots with a fierce determination and a welcome softness. She will do well as the new Eve, if that is where this trilogy is headed.


Editor's Note: Review of "Prometheus," directed by Ridley Scott. Twentieth Century Fox, 2012, 124 minutes.





Problems of Perspective


Perspective. Two people can look at the very same scene, or experience the very same event, yet come away with completely different ideas of what they have seen. That seems to be the point of Wes Anderson's latest film, Moonrise Kingdom, and he begins making that point, cleverly and creatively, with his opening scene.

We see a painting of a seaside house. As the camera comes closer, we enter the house. It is obviously a dollhouse, full of tiny dollhouse furniture. Then a boy walks into the scene, passes the tiny chair, and demonstrates that it is actually normal size. As the camera pans from room to room, similar anomalies appear. We see a giant set of binoculars at the far side of a room, until a young girl walks into the scene and comes toward the binoculars. Only then do we realize that they were normal-sized binoculars sitting on the window sill in the foreground, not the background. Again, we see a full-sized lighthouse in the distance, until a car drives into the scene and we realize it is merely a mailbox in the foreground, decorated to look like a lighthouse.

These optical illusions are no accident, and they are not merely a filmmaker's cinematic game, although they are mighty fun. Anderson uses this technique to establish, early in the film, that what we see is not always what we get. Our perspective of anything we see is often skewed by our expectation of what it is. The girl carries her binoculars everywhere and sees almost everything through their lenses, suggesting that if we look at events more closely, and put people into the picture, we are more likely to gain a proper perspective.

Wes Anderson is known for his quirky story lines, dysfunctional families, vivid color palette, and deadpan direction. This film is no exception. Moonrise Kingdom is a story of young star-crossed lovers — a familiar story, here turned upside down. Suzy Bishop (Kara Hayward) is the oldest child of a pair of lawyers (Bill Murray and Frances McDormand) who speak in legal jargon and call their four children to dinner with a megaphone. At one point a shirtless Mr. Bishop walks through the living room, carrying an axe, and announces to no one in particular, "I'm going to find a tree to chop down." No wonder Suzy has anger-control issues.

Sam (Jared Gilman) is an orphaned "Khaki Scout" staying at a summer camp across the island from Suzy's house. Sam doesn't fit in with the other scouts. Authority figures in 1965, when this film is set, would probably have said he needs to "be a man"; certainly no one seems concerned about how the other boys treat him. Those same authorities today would probably say “he is being bullied.” It's all about perspective, isn't it?

Sam and Suzy meet by accident when the scouts attend a church production of Benjamin Britten's "Noye's Fludde," in which Suzy plays the raven. (Okay, it's not exactly by accident; Sam sneaks into the girls' dressing room to find out who she is.) Britten's music provides the score for much of the film, and "Noye's Fludde" foreshadows both the pairing up of the two young romantics and the tempest — figurative and literal — that is about to break forth.

After a year of clandestine correspondence and furtive binocular spying, Sam breaks out of his tent, Shawshank style, and runs off with Suzy into the woods. The shenanigans that follow, with scouts, family members, and a robotic matron (Tilda Swinton) known only as "Social Services" trying to find the runaways, are classic Anderson, with bizarre, illogical, unexpected happenings presented as perfectly natural events. The sweet budding romance between Sam and Suzy as they play house in the woods (also bizarre and illogical) is contrasted sharply with the mean-spirited antics of those who are sworn to protect them.

Under the direction of their gung-ho scoutmaster (Edward Norton) the rest of the scouts form a posse to track Sam down and bring him back to camp. "I resigned," Sam tells them simply, to explain why the boys have no jurisdiction over him. To this one of them asserts, "You don't have the authority to resign!" His perspective on group dynamics is funny and chilling, so obviously wrong and yet so socially accepted. Recalling the furniture in the film's opening scene, the boy appears to be a small GI Joe, but he is spouting grownup beliefs. Sam is correct when he says to the boys, "I don't like you and you don't like me, so why don't you just let me go?" But they won't let him go; they expect him to conform to the group.

All of this might be charming and delightful if only our star-crossed lovers were a little older. But to me there is something creepy and unnerving about 12-year-olds kissing in their underwear and talking about hard-ons and breasts. Yes, these children have faced some difficult obstacles, with Sam being sent to foster care after his parents died and Suzy spying on her mother's infidelity with the local cop (Bruce Willis) and being bathed by her mother at the age of 12. But I hardly think that running away to play house and have sexual experiences at that young age is the answer.

I also couldn't shake the realization that Kara Hayward and Jared Gilman were 12 themselves as they experienced their first "touching sessions" in front of cameras, boom operators, and director Anderson. As the film points out in its opening scene, a little perspective is wanted. Things that are large sometimes turn out to be small, and things that are small often turn out to be large. Children are small. They should not be placed in adult situations, no matter what the director — and their parents — tell them to do.


Editor's Note: Review of "Moonrise Kingdom," directed by Wes Anderson. Indian Paintbrush, 2012, 94 minutes.





The Brain, Explained


"Don't think, feel!" Bruce Lee's character exhorts his young son in Enter the Dragon (1973) as he teaches him to trust his instincts while learning to fight. By contrast, Ayn Rand favored "Don't feel, think!" when she wrote, "People don't want to think. And the deeper they get into trouble, the less they want to think." Like Plato, Rand proudly privileged reason over emotion. But which is the better approach for making decisions, Lee's feeling intuitively or Rand's thinking rationally?

According to Jonah Lehrer in How We Decide, they're both right. We humans would make better decisions if we understood how the brain reacts to various stimuli. The frontal cortex accesses different tools within its complex regions and uses that knowledge to choose when we should react intuitively and when we should figure things out rationally. Using fascinating real-life stories, studies conducted by respected psychologists and neuroscientists, and an entertainingly accessible style, Lehrer explains how the uniquely human frontal cortex sorts it all out and helps us decide.

For instance, Lehrer considers how quarterback Tom Brady surveys the position and forward direction of 21 moving players on a 5,000-square-yard playing field, anticipates where everyone will be next, and decides where and how fast to throw a football, all in less than two seconds, while other players are bearing down on him. Brainwave studies have shown that there isn't time for him to process the information and make a rational decision. The neural synapses aren't that fast. A quarterback's decision is made intuitively, through the part of the brain controlled by emotion. As Lehrer quotes Brady, "You just feel like you're going to the right place."

Lehrer also demonstrates what causes athletes, performers, public speakers, and everyday humans like you and me to "choke" on tasks for which we are perfectly prepared and skilled. He tells the stories of opera singer Renee Fleming, golfer Jean Van de Velde, and others to demonstrate the point. The problem comes from overthinking a task that the body has learned to perform instinctively. In short, the brain gets in its own way, as the reasoning synapses block the path of the emotional synapses. "A brain that can't feel can't make up its mind," Lehrer concludes (15).

Of course, mere feeling isn't sufficient for making the right decisions. A potential juror who says, "I can tell if someone's guilty just by looking at him" is more dangerous than a crook with a gun. Lehrer provides equally fascinating examples to demonstrate when the rational part of the brain needs to be in control. For example, he tells the compelling story of firefighters who tried to control a raging forest fire in the Rockies in 1949. When the blaze jumped a gulch and began racing toward them, most of them tapped into their brain's emotional side and tried to outrun the fire.

The captain, however, evaluated the situation rationally. He quickly took into account the dryness of the grass, the speed of the wind driving the fire, the slope of the hill they would have to run up, and their unfamiliarity with the terrain on the other side of the crest. While his emotions screamed "Run!" his reason said, "Stop. Build a fire. Destroy the fire's fuel, and then hug the ground while the fire passes over you." He was the only man to survive. None of his young firefighters followed his lead. Today, building a firebreak has become standard training procedure because of this incident. But at the time, Captain Dodge's brain created the escape route entirely on its own.

Modern scientific tools, such as the MRI, electronic probes, and EEG, have made it possible to see exactly what the brain does when faced with a choice, a risk, or a dilemma. "Every feeling," Lehrer writes, "is really a summary of data, a visceral response to all of the information that can't be accessed directly" (23). This means that you and I will make better choices if we understand which parts of the brain to access for different tasks, and how to satisfy or tone down conflicting stimuli.

For example, one study asked subjects to memorize a list and report to someone in a room at the end of a hallway. On the way the subjects passed a table where they were invited to take a snack. Those who had a long message to remember — one that required them to remember seven things — usually chose a piece of chocolate cake, while those who only had one or two things to remember tended to take a piece of fruit. The practical application? When the rational brain is working at capacity (and according to psychologist George Miller's essay, "The Magical Number Seven, Plus or Minus Two," memorizing seven things seems to be the capacity), the emotional brain takes over, and the chocolate cake is irresistible. When the rational brain has less to remember, it can overrule emotion and make a wiser choice. No wonder we overeat and fall prey to other temptations when we have too much to do.

So when should we think rationally, and when should we act impulsively? Lehrer ends his book with several practical suggestions.

First, simple problems require reason. When there are few variables to consider, the brain is able to analyze them rationally and provide a reasonable decision. But when the choice contains many variables — as when one is buying a new house — "sleep on it" and then "go with your gut" really is the best advice. Overthinking often leads to poor decision making.

Second, novel problems also require reason. Before reacting intuitively, make sure the brain has enough past experience to help you make the right decision. Creative solutions to new problems require concrete information and rational analysis.

Third, embrace uncertainty. Too often, Lehrer warns, "You are so confident you're right that you neglect all the evidence that contradicts your conclusion." This is especially true in matters such as politics and investment decisions. He offers two solutions: "always entertain competing hypotheses . . . [and] continually remind yourself of what you don't know" (247). Certainty often leads to blindness.

Fourth, you know more than you know. The conscious brain is often unaware of what the unconscious brain knows. "Emotions have a logic all their own," Lehrer says. "They've managed to turn mistakes into educational events" (248–49). The reason superstars like Tom Brady, Tiger Woods, and Renee Fleming can rely on instinct is that they've been there before. Tom Brady has surveyed thousands of football fields and thrown thousands of passes; Tiger Woods has made thousands of putts; Renee Fleming has sung an aria hundreds of times. For them, the brain knows what to do, and thinking just gets in the way.

Fifth, think about thinking. Before making a decision, Lehrer warns, be aware of the kind of decision it is and the kind of thought process it requires. "You can't avoid loss aversion unless you know that the mind treats losses differently than gains," he explains. Knowing how the brain works will help us make better decisions in everything we do.

How We Decide is a book full of real-life stories, scientific experiments, and practical applications. It will help you understand how you make decisions, and will guide you to make better decisions in the future. Returning to Bruce Lee and Ayn Rand's conflict between thinking and feeling, Lehrer makes a strong case for "Think sometimes, feel sometimes. And make sure you know when to do which."


Editor's Note: Review of "How We Decide," by Jonah Lehrer. First Mariner Books edition, 2010 (Harcourt Brace, 2009), 302 pages.





Outsourcing: The Inner View


Many years ago a woman wrote a letter to Ann Landers, asking whether she should go back to school to get a college degree. She worried that it might be a waste of time so late in life, ending her letter with this: “If I go back to college, I’ll be 62 in four years.” I’ve never forgotten Ann’s cogent reply: “And how old will you be in four years if you don’t go back to college?”

We all have choices. We have no control over the amount of time we have in this life, but we do control what we will do during that time. Life is what we make of it. No matter how old we are.

This is the message of The Best Exotic Marigold Hotel, where a dozen or so fellow travelers have stopped for a while to share their stories, and their lives, to varying degrees. It is a poignant and funny Canterbury tale, Indian style. The young innkeeper, Sonny Kapoor (Dev Patel), serves as host and philosophical guide. Bubbly and bumbling, he is an optimistic and likeable fatalist. When the travelers express horror at his falling-down hotel, he tells them, “In India we have a saying: everything will be all right in the end. So if it is not all right, it is not the end!”

The best exotic hotel is exotic, but it certainly isn’t best. The phones don't work. The roof has holes. Some rooms don't have doors. The courtyard is cracked. But Sonny doesn't see it as it is; he sees his hotel as it can be. As it will be. Because if it isn't all right now, it just means it isn't the end yet.

The travelers have come to the Marigold Hotel for different reasons, most of them having to do with money.

Douglas and Jean Ainslie (Bill Nighy and Penelope Wilton) invested their retirement funds in their daughter’s startup company, and it didn’t start up. In their native England, they can’t afford more than a cramped bungalow for old folks, so they have come to the Marigold for cheap rent.

Recently widowed, Evelyn Greenslade (Judi Dench) has discovered that her husband mismanaged their money and left her deeply in debt. She embraces the Indian experience, blogging about it for readers back home and finding a job training telephone operators in an outsourced information company (yes, those infernal IT people you reach when your computer is on the fritz; but here they are earnest and likeable — as, I suppose, they really are).

Madge Hardcastle (Celia Imrie) has traded on her good looks all her life. Now those looks aren't so good any more, and she must face the possibility that she has had her final love affair. She is looking for love, but she is also looking for a lasting sugar daddy. She likes nice things.

Muriel Donnelly (Maggie Smith) is an old-school racist, and by that I mean she is confident and self-assured in her belief that everyone around her shares her bigotry, including the people whom she considers inferior. As the film opens she is on a stretcher in a hospital hallway, complaining that she wants a “proper English doctor,” not the black man who has just tried to touch her. Because we understand she is an unhappy product of her cultural upbringing, we cut her some slack and enjoy her crotchety rantings, knowing that she will have a change of heart before the film ends. (And if she doesn’t, it will only be because it isn’t the end yet!)

The film gives us an unintentional inside look at socialized medicine, and what we see isn't pretty: cots in the hallways because there aren't enough examining rooms, months-long waits for necessary surgeries because there aren’t enough surgeons. "Six months!" Muriel exclaims. "At my age I don't even buy green bananas!" (I know, it's an old joke — but it always reminds me of the dear friend who first said it to me — just weeks before he died, as it turned out.) When Muriel’s doctor tells her she can have the needed hip replacement surgery immediately in India, she goes there, then repairs to the Marigold Hotel to recuperate.

Graham Dashwood (Tom Wilkinson) is the only character who has come to India for non-financial reasons. He is trying to find the love of his life, whom he met 40 years earlier while stationed in India, and from whom he was forced to part. He spends his days in the registry office, trying to track down the friend, and in the streets, playing cricket with young boys. Through him the characters learn the meaning of true love.

Despite the heat, the unfamiliar foods, the smells, and the “squalor,” as Jean describes it, India is still, in this film, a land of exotic wonder and happy faces. When asked what he likes about it, Graham responds, “The lights, the colors, the vibrancy. The way people see life as a privilege and not as a right.” Camels, elephants, and cows line the roads, along with rickety buses and colorful “tuk-tuks,” the ubiquitous three-wheeled taxis. These folks have “come to a new and different world,” as Evelyn writes in her blog, with voice-over narration. “The challenge is to cope with it. And not just to cope, but to thrive.”

The Best Exotic Marigold Hotel is filled with similar upbeat aphorisms and quotable quotes. “The person who risks nothing, does nothing. Has nothing,” Evelyn tells her readers. “The only real failure is the failure to try. And the measure of success is how well we cope.”

“Most things don’t work out as expected,” she concludes, “but what happens instead often turns out to be the good stuff.” With an outstanding cast of veteran actors portraying couples in various stages of love and marriage, an important message about taking charge of one’s choices, and a point of view that says old age doesn’t have to be outsourced (but it isn't so bad when it is), The Best Exotic Marigold Hotel is indeed the “good stuff.”


Editor's Note: Review of "The Best Exotic Marigold Hotel," directed by John Madden. Participant Media/20th Century Fox, 2011, 124 minutes.



