Earth Invaded by Metaphor


When I was a little girl, all the kids in my neighborhood would gather on summer afternoons to play Cowboys and Indians. I had never met an Indian (heck, I had never met a cowboy, either) but I saw them on TV. I knew the Indians were the bad guys because they were different from me. The men had long hair, seldom wore shirts, slept in round tents, and grunted "How" when they talked. The women wrapped themselves in blankets and carried babies on their backs. The cowboys were good because they wore boots and hats and talked in complete sentences. Their women wore eye makeup and beehive hairdos. They were like us.

My kids never played Cowboys and Indians. The game has long since fallen out of favor, being considered insensitive to Native Americans. But they did play Aliens. A lot. (They still do, in fact, mostly on Xbox.) Space is the new frontier where we can still hold onto our prejudices — the ones that assert, "My kind are good; the other kind are bad." I realize that we never were fighting against Indians, really. We were fighting against "other," that unknown quality of beings that are different from us. We called them "Indians," but they were really just "aliens" all along.

So the only surprise about the film Cowboys and Aliens that opened this weekend is that no one thought of it any sooner. I awaited it eagerly, knowing that it would be laden with metaphor and ripe for a review.

Director Jon Favreau makes the point about aliens quickly and clearly. Daniel Craig plays Jake Lonergan, an amnesiac drifter with a mean right hook; and Harrison Ford is Woodrow Dolarhyde, a rancher who's mean and rich (his name says it all). Initially the setting is populated by groups of people who don't like each other: city folk who don't like ranchers, bandits who don't like city folk, and Indians who don't like anyone white. Interestingly, however, on a personal level there is a lot of interracial connection in this movie — the white innkeeper is married to a Mexican woman, for example, and the rancher has a close relationship with the Indian who watches over his son.

When space aliens appear on the scene and begin kidnapping local residents, all the groups band together to fight the aliens. The message is clear. It has been used by government leaders (and tyrants) for centuries: to establish local harmony, simply unite the masses against a common enemy.

The "western" part of this western works well. It begins as a classic western would — with a sweeping panorama of the desert, complete with sage brush and sandy cliffs. The story is character driven, and as we learn the characters’ back stories we discover why children behave the way they do when they become adults. Favreau's point seems to be that the more we know about why people act as they do, the more we will come to understand and accept them. This point is made with special effect in the case of Woodrow Dolarhyde, whose personality warms throughout the film. Through Dolarhyde we also learn the true meaning of fatherhood, as we see his maturing relationship with three young men: his son, Percy (Paul Dano); the Indian hand (Adam Beach) who looks out for Percy; and Emmett (Noah Ringer), an orphan boy whom Dolarhyde takes on. It's a little heavy handed, but an important value nonetheless.

The casting is excellent. One of the standouts is Paul Dano as Percy, Dolarhyde's spoiled, juvenile delinquent son who shoots up the town with impunity, knowing that Daddy will fix things for him later. Another is Clancy Brown as Meacham, the local minister who spouts aphorisms while toting a gun. He's a practical kind of preacher, and I liked his philosophy, which offers such wisdom as "It's not who you were, it's who you are," and "Whether you go to heaven or hell isn't God's plan but your choice." Sam Rockwell is endearing as Doc, the innkeeper who must learn how to shoot a gun and "be a man." And 12-year-old Noah Ringer is marvelous as Emmett, the boy who also learns to be a man during the quest to destroy the aliens. My only complaint is Ella (Olivia Wilde), the obligatory girl who comes along for the ride. Her role eventually deepens, but for half the film she is simply a drain on the landscape.

However, as much as I loved the idea of this film, the manifestation of the idea doesn't quite work. The alien part of the movie is simply too alien for a western. For one thing, westerns are slow-paced and character-driven; space aliens have no character. The two simply don't mix. Moreover, the metaphor is so heavy-handed that the aliens never really enter the story. The humans never even question who the aliens are, where they came from, or how they are able to fly through the sky. We're just supposed to know what they represent — invaders seeking to plunder the minerals under the soil and turn them into fuel. Sound familiar?

This is the second "alien encounter" film produced by Steven Spielberg this summer, but oddly, although the aliens in both films look nearly the same, the messages of the two films couldn't be more different. In Super 8 the message is "An alien is just a friend you haven't met": there, the alien gets stranded on earth while just passing through, and the nasty government scientists kidnap him. In Cowboys and Aliens the beings from outer space are plundering invaders and the message is "kick their asses back where they came from."

It was nice seeing the Indians, townies, ranchers, and even bandits becoming friends. I especially liked seeing the development of Dolarhyde's character. But I'm not sure I like the idea that we can only become friends by uniting against an enemy. The film tries hard to please, but the metaphor overpowers the story and collapses under its own weight.


Editor's Note: Review of "Cowboys and Aliens," directed by Jon Favreau. Universal Pictures, 2011, 118 minutes.





Anthem: The Libertarian Film Festival


Anthem may not be the first libertarian film festival, but after the success it enjoyed at FreedomFest this month, it may well become the most lasting. With 30 outstanding films by rising artists and ten provocative and entertaining panels, Anthem was called by many FreedomFest attendees "the best new idea you've had in years."

Full disclosure: by "you" they meant the management of FreedomFest, but where Anthem was concerned, they meant especially me, since the film festival was my baby. I might debate the "in years" part of that statement — the producers of FreedomFest have added great new programs every year — but I enjoyed the compliment, nonetheless.

FreedomFest is a big conference: 150 speakers, panels, and debates, and ten events going on simultaneously during breakout sessions throughout the three-day event in Las Vegas. Why add a film festival?

First, I love movies. I love entering another world, getting caught up in a conflict, and seeing how the conflict is resolved. I love being surprised. I even love being outraged. Storytelling provides a powerful way to reveal the truth, even when the story itself is fiction. Ayn Rand was well aware of this power. It's the reason she chose to devote most of her writing to fiction (including the novel Anthem) and scriptwriting.

Second, I want to encourage more filmmakers to produce works with libertarian themes, and a film festival is a good way to do that. In recent years film festivals have blossomed around the country, with thousands of small ones focusing on specific niche audiences. These festivals give low-budget filmmakers a following and a distribution chain. Libertarianism is one of those niches. A hefty grand prize offered by our co-sponsor, the Pacific Research Institute, provides still more motivation for films that fit the theme.

Third, the FreedomFest umbrella offers something most festivals struggle to provide: a convenient location and a ready audience. Most of our costs, and our risks, can be absorbed by the larger organization. It's a great way to start small and grow.

I have to admit, however, that my greatest asset in producing the Anthem Libertarian Film Festival was withoutabox.com. What a great company for festivals and filmmakers alike! Withoutabox acts as a meeting place for filmmakers and festival producers. Filmmakers receive notices every week about festivals and everything having to do with them, including themes, locations, deadlines, and requirements. I asked some of my filmmakers how they found Anthem, and most of them told me they looked through the withoutabox listings regularly. The name "Anthem" caught their attention as potentially Randian, and the details listed on my withoutabox account confirmed their expectations. A look at my website (anthemfilmfestival.com) suggested to them that it was a high-quality, professionally organized festival, and my submission fee ($15 for early bird registrations, up to $60 for last-minute submissions) was low enough to be worth a risk.


For festival organizers, withoutabox offers even more advantages. They handle all the paperwork involved with accepting submissions — the application forms, legal releases, submission fees, and even the advertising, all for a startup fee of $500 and a graduated advertising program. Because I wanted to start small, I opted for the least expensive advertising option: four group ads to be sent out two weeks before each of my four deadlines. The people at withoutabox processed all my fees and sent me a check, keeping a small percentage for themselves. It was capitalism in action: efficient, smart, convenient. I was happy to pay them their percentage, in exchange for not having to hire someone else to do the work.

I received 60 film submissions from sources I would never have known without withoutabox. It was thrilling to discover young filmmakers with libertarian leanings who had never heard of Liberty Magazine, or Reason or Cato or Atlas for that matter. They simply understand instinctively the principles of liberty and want to express these ideals through their art. One of the best parts of the festival was getting to know these young filmmakers and encouraging them to continue making films with libertarian themes. All of them expressed a desire to come back to FreedomFest next year, with or without a new film, just to hear the speakers.

Of those 60 films, I selected 30 to present at Anthem. Our movies focused on issues of individual freedom, personal responsibility, and self-reliance, as well as the problems of government intrusion and overregulation. Many were satirical, some outrageously so. But these movies were not preachy or didactic. They were entertaining, moving, and motivating. They were movies first, and libertarian movies second. I think that's important — storytelling must touch the emotions first, and guide the listener or viewer to experience a truth. The new media available today offer great new venues for presenting a message in this way.

As might be expected in a festival like ours, we had more documentaries than feature length films. Several of our documentaries focused on education, including Indoctrinate U, about the loss of free speech on college campuses in the wake of political correctness; Zero Percent, about the remarkable education program at Sing Sing Correctional Facility that is entirely funded through private donations and inmate tuition payments; and The Cartel, about the plight of public education and how to fix it. This film, directed by Bob Bowdon, was awarded the PRI Prize for Excellence in Presenting Libertarian Ideals.

We also had documentaries about public policy issues such as the environment (Cool It), international finance and economics (Overdose: The Next Financial Crisis and Free or Equal), and the justice system (A Remarkable Man). Each of these films was followed by a panel discussion, with speakers from all over the world participating in lively and stimulating conversation. Bjorn Lomborg, who wrote The Skeptical Environmentalist and is the featured narrator in Cool It, flew in from Sweden for the festival, and Bob Bowdon of The Cartel moderated two panels, one on education and the other on how to use the new media. We even had a panel called “What’s Wrong with Selling Sex?” that preceded Lady Magdalene’s, a narrative feature set in a Nevada brothel that stars Nichelle Nichols, the original Lt. Uhura of Star Trek.

The film judged Best Narrative Feature was Alleged, starring Brian Dennehy and Fred Thompson in a fresh look at the famous Scopes “monkey trial” that challenged the teaching of evolution in schools. The film focuses primarily on the role of the press in shaping people’s opinions. Journalist H.L. Mencken, a darling of many libertarians, comes off as devious and mean-spirited — which shows that libertarian films aren't going to follow a party line. The film was especially timely in the wake of the high-profile Casey Anthony trial.

Marathon is a poignant true story about poet William Meredith and his partner, Richard Harteis, who faced the difficult decision of what to do when Meredith suffered a debilitating stroke. It has a particularly libertarian theme, because the two men don’t pity themselves or turn to government or other institutions for help. Harteis takes care of Meredith himself. The title refers to the fact that they were both marathon runners, but it’s also a metaphor for going the distance when life gets hard. Harteis produced the film and was on hand to discuss it, and the work won the jury prize for Excellence in Filmmaking.

I was particularly impressed with the short films, most of which were made by up-and-coming filmmakers. Usually they were five to 15 minutes long, and all of them focused on libertarian issues. Some were serious short dramas set in dystopian futures, demonstrating what might happen to individual liberties if governments continue down their intrusive paths. Others were satirical comedies using humor to make the same point. Final Census, which won the prize for Best Short Comedy, was so outrageous that I had to soothe a sweet old lady who didn't quite see the humor in a census taker calmly determining the social value of the people he is hired to count. I laughed out loud the first time I saw it.

Bright, the film that won the jury award for Best Short Drama and the Audience Choice award for Best Short Film, is reviewed separately below. Its production values, from the quality of the acting to the music and lighting, were remarkable, especially for a film festival, where movies are generally made on a shoestring budget. Bright was made for $10,000, for example, and Final Census for a mere $150. Next year we will have a panel called "Fiscally Responsible Filmmaking" to showcase their feats of funding magic. For a complete list of the films we screened this year and the awards they earned, go to anthemfilmfestival.com.

One of the most difficult problems with starting a film festival is, of course, that of attracting audiences. Even though we had a ready audience of 2,400 people attending FreedomFest, each film still had to compete with ten speakers — and one other film, as one filmmaker, disgruntled by my decision to screen two films at a time, pointedly reminded me. This year our screening rooms were located in the Skyview Rooms on the 26th floor of Bally's, a long walk down the hall and up the elevators from the main action in the Event Center. Potential viewers had to be fully committed before coming to the films — they couldn't just poke their heads in and then decide whether to stay. Our late submission deadlines also made it nearly impossible to promote specific films in advance.

But these are simple problems, easily rectified before next year's festival. In 2012, I will, for example, probably select fewer films and show them more than once, since word of mouth grew throughout the conference, and many people were disappointed to learn that a great film they heard about wasn't playing again.

The Anthem Libertarian Film Festival will definitely be back at FreedomFest next year with a new batch of long and short films expressing libertarian ideals. I can't wait to see them, and to meet the filmmakers who will produce them. I hope Liberty's readers will be there too.

But now, let me introduce you to Bright.

Describing the essential requirements of a "skillful literary artist," Edgar Allan Poe wrote in a review of Nathaniel Hawthorne's Twice-Told Tales: "The unity of effect or impression is a point of great importance . . . Without a certain continuity of effort — without a certain duration or repetition of purpose — the soul is never deeply moved." Every moment, Poe said, must be "conceived, with deliberate care, [to create] a certain unique or single effect."


Director Benjamin Busch has created such a work of art with his short film Bright. It's about Troy (Eric Nenninger), a young man who must overcome a paralyzing fear in order to move forward with his life. Every moment in the film is skillfully and deliberately planned to create a particular effect in the viewer. From its opening moments on, the film establishes a rich atmosphere, filled with symbolic imagery, especially the imagery of light. Troy is raised by a blind adoptive father, Irwin (Robert Wisdom), who represents the iconic blind sage of mythology and guides Troy on what turns out to be a spiritual journey. Irwin is blind, but he can "see"; Troy is sighted, but his back is always toward the light.

In this dystopian future, Troy works as a restorationist, helping people regain a sense of continuity with their past by finding old-style original light bulbs for their homes. This spinoff from the current light bulb controversy is, of course, a metaphor for the conflict between what is natural and what is artificial, what is light and what is dark, in the search for courage and meaning in life.

The pacing is deliberately slow, filmed at "the pace of real thought," according to director Busch, who wants viewers to have time to hear the dialogue. Viewers are able to contemplate the film's philosophically provocative lines: "There's danger in all this safety" . . . "Someone who never sees, never knows" . . . "I miss the light but I can remember it" . . . "I loved and I lost, and I'm glad that I loved" . . . "How much would you pay to be happy?"

Bright is a film to be seen with friends, and discussed in long, leisurely conversations afterward. As Poe said of Hawthorne's Tales, "withal is a calm astonishment that ideas so apparently obvious have never occurred or been presented [like this] before." I think Poe would have been pleased with Bright.


Editor's Note: Review of "Bright," directed by Benjamin Busch. 2011, 40 minutes. Anthem Film Festival Best Short Drama and Audience Choice Award.





Just Super


Super 8 is the best Steven Spielberg movie to come along in years.

And it isn't even a Spielberg film.

Spielberg's name is on the project as executive producer, just as George Lucas' name is on Spielberg's Indiana Jones movies as producer. But Super 8 was written and directed by J.J. Abrams, who is better known for his work as a producer of "Lost," "Alias," and a variety of other television shows. Nevertheless, it is the most Spielbergian film to come along in many years, a veritable homage to the master of blockbuster films inhabited by preadolescent protagonists. Among the Spielberg effects that Abrams incorporates in this science-fiction coming-of-age thriller are the trademark bicycles spinning into getaway mode, the classic suburban settings, the snappy potty-mouthed dialogue among kids, and the Orwellian military bad guys, reminiscent of E.T.

Abrams creates the best kind of suspense, chilling us with the terror of what we don't see, rather than grossing us out with what we do see — thus doing what Spielberg did so effectively in the first half of Jaws. We know something scary is out there, but it is always obscured by the likes of train cars, bushes, or gas station signs. Our hearts pound and our imaginations run wild as we endure long moments of eerie silence while the camera takes us down paths we would rather not tread. Fearing the unknown is always more terrifying than facing a concrete enemy.

Best of all, Abrams employs the particular kind of coming-of-age storyline for which Spielberg is known. Yes, there's a monster out there, but the real monster is at home, in the form of an unnamed tension between parent and child that has to be resolved. In this story, the tension begins with a mother's funeral. Her son Joe (Joel Courtney) is not allowed to associate with Alice (Elle Fanning) because Alice's father (Ron Eldard), the town loser, was somehow involved in his mother’s death. Joe likes Alice — he likes her a lot! — and that creates tension between the two of them, as well as between Joe and his father (Kyle Chandler), who forbids Joe to see Alice. This iconic conflict between father and child, set against the backdrop of an unknown monstrous intruder, gives this film a satisfying heft.

The story centers on a group of middle-schoolers who have been friends since toddlerhood. Abrams' kids ring true. They're precocious and nerdy in a believable, unassuming way. Their dialogue also rings true throughout. Charlie (Riley Griffiths) has been the leader of the gang. Like Spielberg, who began making movies with his Super 8 camera at the age of 10, Charlie wants to make a movie to enter in a local film festival. He enlists his friends as actors, cinematographers, script consultants, and makeup technicians, and he barks at them throughout rehearsals and filming with the commanding voice of an artistic perfectionist. Charlie is a young Spielberg himself.

One evening Charlie takes his cast "on location" to an isolated train depot to film a scene as the train goes by. He's looking for a dramatic backdrop for his characters' climactic "hill of beans" speech. (This amounts to an homage within an homage, and it works.) But the train suddenly derails — in the most spectacular wreck ever created on film. Ever. The audience in the screening I attended erupted in spontaneous applause as the final piece of wreckage plopped to the ground.

Then, as another homage to Spielberg, Abrams focuses our attention down to a small piece of bloody wreckage, creates a sense of terror as we suspect what might be underneath it, and follows with a gotcha laugh that releases the pent-up tension that the train wreck built so skillfully. Pure genius. Pure Spielberg. In reality, pure Abrams, collaborating with the master.

The rest of the film becomes a typical kids-versus-the-world story as these young people try to figure out what the government is trying to hide about the train and its contents. What isn't typical, however, is the quality of the dialogue and the acting of the kids. They are stunningly natural and believable. One of my favorite lines: Charlie says, "We need to develop this film right away. I'm gonna go steal some money from my mom." No hesitation over a moral dilemma. He's a director. He needs money. And he's a kid. He gets it.

With the exception of Elle Fanning (whose career has been active, though overshadowed by that of her older sister, Dakota), these are virtually unknown actors, fresh and new and ready to be molded by their director. I expect to see a lot more of them in the near future. The parents, also, are believable and natural. They are too caught up in their own grownup worlds to recognize what is going on with their children. As a result, their kids are free to roam the town, think for themselves, and learn how to make things happen.

The film has a message, and it's a good one. It argues for overcoming grudges and learning to understand one another. At one point a soldier is lifted by his rifle high into the air. If he holds on, he will die. If he lets go, he will live. Such a simple, subtle message: Let go of the guns. Let go of the grudges. One of Charlie's characters says, "You do have a choice. We all do!" That's an important truth to remember, the truth of individual responsibility and freedom.

Super 8 has it all: great entertainment, great characters, great special effects, great story, and a great message portrayed subtly at the micro and macro level. What more can you ask from a movie?


Editor's Note: Review of "Super 8," directed by J.J. Abrams. Paramount, 2011, 112 minutes.





War and Peace


I’ve said it before, and I’ll say it again: good movies begin with good stories. By that criterion, Incendies is not just a good movie, it is a great movie. Set against a backdrop of bitter hatred and torturous war, it is nevertheless a brilliant film about love for family and finding a personal peace.

The story begins with the classic Romeo and Juliet conflict: Nawal Marwan (Lubna Azabal), a young Christian Arab woman, is in love with a young Palestinian man, and her family disapproves. What happens next — retribution, abandonment, shunning, and revenge — sets the stage for an alternate story line, 35 years in the future, after the woman has died. In her will she asks her young adult children, Jeanne (Mélissa Désormeaux-Poulin) and Simon (Maxim Gaudette), fraternal twins, to find the brother they did not know existed and the father they thought was dead. This will require them to leave their home in Canada and return to the land of their ancestors in the Middle East.

As Jeanne heads to Lebanon to begin the search for her father in her mother’s hometown, the film flashes back to the young Nawal and her lover, Wahab. The film continues to switch between the two stories as the brother and sister follow the cold, dark trail of the mother they only thought they knew. These alternating points of view allow the audience to know Nawal’s story more intimately and completely than the young siblings do, enhancing our compassion for the protagonist and our growing sense of horror as the two slowly discover the truth.

As war breaks out, young Nawal tries to escape the fighting while searching orphanages for the son her grandmother forced her to give up. Along the way she observes the bitterness and retaliation of both religion-based factions. Two scenes stand out as representative of the senselessness and atrocity of this kind of conflict. In the first, Nawal quickly removes the cross from around her neck and rearranges her scarf to cover her head, so she can avoid the wrath of Muslims. In the next scene, she quickly doffs the scarf and pulls out her cross to show rebel guerrillas that she is a Christian. But she is still the same person, inside and out; only the label has changed. Changing the label saves her life — but the death and destruction she observes destroy her soul.

Incendies is a thrilling mystery about a family’s quest to reunite itself. But it also has a powerful symbolic message, revealing the bitterness that comes from assigning divisive political and religious labels. What does it mean to be a Christian, a Muslim, or a Jew? Beneath the labels, all in the Middle East claim the same ancestry. Arabs (Christian or Muslim), who claim Ishmael as their ancestor, may hate Jews; Jews, who claim Isaac as theirs, may hate Arabs. But trace their roots back just one more generation, and all honor Abraham as their father. All are cousins under the labels. All are of the same lineage and family.

Incendies is the most engrossing film I have seen since last year's The Secret in Their Eyes (also a foreign film). Yes, you will have to read the subtitles at the bottom of the screen — unless you speak French, Arabic, and another dialect I didn’t recognize. But it will be well worth the effort. Don’t miss this outstanding film if it comes to your town.


Editor's Note: Review of "Incendies," directed by Denis Villeneuve. Sony Pictures Classics, 2010, 130 minutes.





Good as Gold


James Grant is the best-spoken and most accomplished hard-money man in Wall Street. Plenty of people can inveigh against the Fed; the publisher of Grant’s Interest Rate Observer can put it in the context of today’s financial markets and also in the context of history.

Grant is also a historian. He wrote a biography of John Adams and another of financier Bernard Baruch, as well as several other histories, including my favorite among his works, The Trouble with Prosperity: The Loss of Fear, the Rise of Speculation and the Risk to American Savings (1996). His new book, “Mr. Speaker! Mr. Speaker!” should be of particular interest to readers of Liberty.

That’s not because of any deep interest I have in his nominal subject, Thomas B. Reed (1839–1902). Reed was a moderately interesting figure. He was fat, he ate fish balls for breakfast, and he spoke French. He was a congressman from Maine, first elected in the centennial year, 1876. He was a man of his time, a Republican stalwart who supported the tariff and gold and opposed foreign wars. Though he was for woman suffrage, he was no progressive, at least in the sense that today’s progressives are. He had no desire to teach other people how to live. “Pure in his personal conduct,” Grant writes, “he had no interest in instructing the impure.”

During the 1890s, Reed was speaker of the House. His claim to fame is how he changed the House rules to give himself, and the majority, much more power to get things done. Whether that was good or bad depends on your point of view.

Parts of the book are about Reed’s parliamentary maneuverings and also about his wit. These parts are well-written, but I was not much interested in them. The greater part of the book, however, is about the financial events and national political battles from 1870 or so to 1899, which I found very interesting.


Grant covers several such things, from the collapse of the Freedmen’s Bank to the stolen election of 1876. He has a particular interest in the currency. One story he tells is of Lincoln’s printing of greenbacks during the Civil War, and the struggle afterward over restoring the link to gold. In the ideology of the day, restoration was necessary. It was part of honorable dealing. But people dragged their feet about doing it, because a national debt had been piled up during the war, and what was the need to repay investors in gold if they had bought bonds with greenbacks? And besides, there was a boom on, and shrinking the money supply would spoil it.

The boom went bust in 1873. For the rest of the decade the country was arguing about the commitment to return to gold. The way in which this argument was conducted was much different from the way it would be today. Now gold is the money of the distant past. In the 1870s, Grant observes, “Gold was the money of the future”:

“A 21st century student of monetary affairs may stare in wonder at the nature of the monetary debate in mid 1870s America. A business depression was in progress. From the south, west and parts of the east arose a demand for a cheaper and more abundant currency. Yet such appeals were met with, and finally subdued by, a countervailing demand for a constant, objective and honest standard of value.”

“Resumption day” was Jan. 1, 1879. There followed “a mighty boom.” In the 1880s, prices fell 4.2% and wages rose.

This was the era of laissez-faire. “The government delivered the mails, maintained the federal courts, battled the Indians, examined the nationally chartered banks, fielded the army, floated the navy and coined the currency.” The big social program was Civil War pensions — for the Union side, not the Confederates. By the standards of the day, the Democrats were the small-government party and the Republicans were the big-government party, but policy differences were at the margin only.

The federal government was funded by the tariff, which the Republicans, and Reed, thought was good for American labor. The Democrats quoted the free-trade arguments of Frédéric Bastiat. “The Republicans,” Grant writes, “having no ready answer for Bastiat’s arguments, were reduced to pointing out that he was, indeed, French.”

The ideological atmosphere started changing in the 1890s. The Panic of 1893 brought on a severe depression, and another political battle over the currency. This time the supporters of gold battled it out with the supporters of silver, with bimetallists arguing for both at once.

Grant provides the best explanation of the gold-versus-silver battle I have read, especially the futility of having two metallic standards at the same time. Here he is not so proud of Thomas Reed, who was by then speaker: “One senses, reading him, that he was not quite sure what the gold standard was all about.” Reed’s position “lacked clarity, or, one might even say, courage. [President Grover] Cleveland had one unshakable conviction, which was gold. Reed had many convictions, only one of which — no free silver — was strictly nonnegotiable.”

Reed’s final battle was over war with Spain. He was against it. The new president, William McKinley, seemed to be against it, but lacked the courage really to oppose it. Ex-President Cleveland was against it, as was William Jennings Bryan, the presidential candidate whom McKinley had beaten in 1896. But the Congress was for it, and even with his self-enhanced power as speaker, Reed couldn’t block the war resolution of 1898. It passed the House 310 to 6.

A year later, Reed resigned. He wrote to a friend, “You have little idea what a swarming back to savagery has taken place in this land of liberty.” Publicly he said nothing. Three years later he was dead.

And so Grant ends his book. It is a fine book. I recommend it. Don’t be put off by any lack of interest you may have in Thomas B. Reed. I wasn’t much interested in him, either. You don’t have to be.


Editor's Note: Review of "'Mr. Speaker! Mr. Speaker!' The Life and Times of Thomas B. Reed, the Man Who Broke the Filibuster," by James Grant. Simon & Schuster, 2011, 448 pages.





Appleby's Revolution


One thing that has always struck me about the Communist Manifesto is that Karl Marx and Friedrich Engels held capitalism in awe. “The bourgeoisie, during its rule of scarce one hundred years, has created more massive and more colossal productive forces than have all preceding generations together,” they wrote in 1848. They listed the wondrous accomplishments one by one, from steam navigation to canal building, and “whole populations conjured out of the ground.” Of course, their praise preceded a call for workers to overthrow the great capitalist conjuror.

Joyce Appleby’s book, The Relentless Revolution: A History of Capitalism, conveys the same sense of wonder. She praises capitalism, saying that its “distinctive characteristic . . . has been its amazing wealth-generating capacities. The power of that wealth transformed traditional societies and continues to enable human societies to do remarkable things."

Appleby certainly doesn’t recommend overthrowing the system. But her appreciation of capitalism diminishes as the book goes on, just as Marx’s and Engels’ did in the manifesto. At the start, The Relentless Revolution is like an exhilarating train ride, full of insights and a historian’s gold mine of information, but it loses steam and slows to a crawl once the Rubicon of the Industrial Revolution has been crossed. At the end, Appleby is praising the US government for trying to rein in the capitalist beast.

Appleby, who taught history at UCLA, describes herself as a “left-leaning liberal with strong . . . libertarian strains.” Given today’s academic history departments and her own attitudes, she deserves admiration for looking at capitalism as objectively as she can. “In this book,” she writes, “I would like to shake free of the presentation of the history of capitalism as a morality play, peopled with those wearing either white or black hats.” For half the book, she achieves that goal.

The Relentless Revolution fits into the recent parade of big books trying to explain the rise of the industrial West — books such as Jared Diamond’s Guns, Germs, and Steel, Kenneth Pomeranz’s The Great Divergence, and Ian Morris’ Why the West Rules — For Now. These and other authors are trying to answer a couple of giant questions: why did the West (not China, India, or another part of the world) make the transition from subsistence living to a world of constantly increasing productivity? And how exactly did it happen? As I shall explain, Appleby offers two major responses that I found valuable.


The pivotal moment in world history is normally called the Industrial Revolution (although perhaps that’s a misnomer, given its slow gestation). Many factors contributed to this revolution, and part of the “game” motivating the big books is to offer new factors. It’s generally agreed that the process flowered first in Great Britain, but the reasons may stretch back to such things as the European practice of primogeniture, the separation of powers between the Catholic Church and the state, and the system of rights spawned by feudalism under changing population pressures.

Appleby focuses on 17th- and 18th-century England (why she gives short shrift to Scotland is a puzzle). She points to two reasons why the Industrial Revolution occurred there: England had higher wages than the Continent, and also cheap sources of coal. Higher wages provided the incentive to increase productivity, and coal provided the means.

The idea that England’s wages were higher is actually new to me and undoubtedly has a complex history of its own; Appleby merely says that England’s population growth had leveled off in the 17th century. As for coal accessibility, she agrees with Pomeranz, who says that China fell behind Europe partly because its coal deposits were harder to reach.

Whatever the precursors, actual inventions — especially of the steam engine — required talent and knowledge, and herein lies the first of Appleby’s distinctive insights. In a way that I haven’t seen before, Appleby integrates two related forces: the British tendency toward scientific theorizing (or “natural philosophy”), and the British tendency to tinker. “Technology met science and formed a permanent union. At the level of biography, Galileo met Bacon,” she writes.

She elaborates: “Because of the open character of English public life, knowledge moved from the esoteric investigations of natural philosophers to a broader community of the scientifically curious. The fascination with air pressure, vacuums, and pumps became part of a broadly shared scientific culture that reached out to craftsmen and manufacturers in addition to those of leisure who cultivated knowledge.”

The Royal Society illustrates this integration. Created in 1662, it was both practical and scientific. Its members studied that New World import, the potato, but it also “brought together in the same room the people who were most engaged in physical, mechanical, and mathematical problems.”

Appleby’s second valuable contribution is to take a cold-eyed look at the role of New World slavery in creating the markets that nurtured the Industrial Revolution. Indeed, like Pomeranz, she sees slavery as a major factor behind the engine of progress.

In The Great Divergence, Pomeranz says that the vast expansion of agricultural cultivation in the New World enabled Europe to avoid the “land constraint” that kept the Chinese from developing an industrial economy. And slave labor played an enormous role in the cultivation of sugar in the Caribbean and cotton in the United States. Thus, Pomeranz says, Europe overcame the limitations of land by colonizing the New World, which made agriculture possible to an extent unlikely in tiny Holland or England.

Appleby’s take is a little different. She emphasizes more the three-way trade that we learned about in middle school (slaves from Africa were taken to the New World, where sugar was purchased and taken to Great Britain, where finished clothing and other products were bought, to be sold in Africa). Pomeranz and Appleby are not arguing that slavery was profitable, or that it was “necessary,” but, rather, that it was a significant element in the system of trade that led to the Industrial Revolution. Enthusiasts for capitalism such as myself tend to focus on the “trade”; critics of capitalism — along with Appleby — stress the “slave” part of the trade.

Furthermore, Appleby considers American plantations and British inventions as two sides of the same coin. “These two phenomena — American slave-worked plantations and mechanical wizardry for pumping water, smelting metals, and powering textile factories — may seem unconnected. Certainly we have been loath to link slavery to the contributions of a free enterprise system, but they [the phenomena] must be recognized as twin responses to the capitalist genie that had escaped the lamp of tradition during the seventeenth century.”

Whether she is right about this, I don’t know, but she brings to public attention the periodic debate by economic historians over the role of slavery, a debate that Pomeranz reawakened with The Great Divergence.

Unfortunately, after these interesting observations, The Relentless Revolution begins to wind down. As the book moves on, Appleby tends to equate anything that takes industrial might — such as the imperialist armies that took possession of Africa in the late 19th century — with capitalism. “Commercial avarice, heightened by the rivalries within Europe, had changed the world,” she writes in explaining the European adventures in Africa. I dispute that. While Cecil Rhodes and Henry Morton Stanley may have been private “investors,” for the most part Africa was overcome by government power, not by individuals or corporations.


Even greater blindness sets in as Appleby’s 494-page volume moves into the recent past and she becomes influenced by modern prejudices. In Appleby’s view, the financial crash in 2008 was caused by financiers; the only contribution by public officials was to “dismantle the regulatory system that had monitored financial firms.” No word about the role of the Federal Reserve, the Community Reinvestment Act, or Fannie Mae and Freddie Mac.

Indeed, capitalism becomes the same as exploitation, in spite of her more balanced initial definition of capitalism as “a cultural system rooted in economic practices that rotate around the imperative of private investors to turn a profit.”

In her last chapter, Appleby writes, “The prospect of getting rich unleashed a rapacity rarely seen before in human society.” This extravagant statement does not jibe with archaeological (not to mention historical) evidence of the human past. In Why the West Rules — For Now, Morris reports on an archaeological excavation at Anyang village in China. In 1300 BC, this home of the Shang kings was the site of massive deaths. Morris estimates that, in a period of about 150 years, the kings may have killed 250,000 people — an average of 4 or 5 a day, but probably concentrated in large, horrific rituals — “great orgies of hacking, screaming, and dying.” This was the world before capitalism.

To be fair to Appleby, she is saying that the productivity unleashed by capitalism extended the breadth of human rapacity, and to some degree the example of slavery supports that notion. Eighteenth-century British gentlemen could think high-minded thoughts while enjoying their tea with sugar grown by often brutally treated slave labor — which they may have invested in. “This is the ugly face of capitalism,” Appleby writes, “made uglier by the facile justifications that Europeans offered for using men until they literally dropped dead.”

True. At the same time, it was the British who abolished the slave trade, at high cost in patrolling the seas. Before that, Africans were enslaved only because other Africans captured and sold them to the Europeans. (Malaria and other diseases kept Europeans out of the interior of Africa until late into the 19th century.) There is a lot of blame to go around.

So this book is a mixed bag. Even though Appleby increasingly departs from objectivity as she proceeds into modern times, I respect her project and appreciate her insights. I hope that her “left-leaning” historian colleagues read her book and expand their understanding of the past — just as I have.



Editor's Note: Review of "The Relentless Revolution: A History of Capitalism," by Joyce Appleby. Norton, 2010, 495 pages.





Enforcers of Health


Libertarians have uneasy thoughts about the enforcers of public health. In my state, the public health authorities banned tobacco billboards — an act that some thought a violation of the First Amendment. This year, they pushed in the legislature to ban the sale of flavored cigars and pipe tobacco. In both cases, they have said that their aim is to make tobacco less attractive to children.

A century ago the fight was much more immediately vital. Then people of libertarian mind were fighting with the public health enforcers over the control of smallpox.

That disease was eradicated in the 20th century by campaigns of vaccination, many of them not entirely voluntary. In the United States, a turning point came in the epidemic of 1899 to 1902, when outbreaks of the disease prompted a political battle over compulsory vaccination. The story is told in legal writer Michael Willrich’s new book, Pox.

The push for compulsory vaccination grew out of the facts of the disease itself. It was a killer. Of people infected by the main strain of the disease, between 20 and 30% died. Smallpox was highly contagious, so that an infected person was a threat to everyone around him who was unvaccinated. First symptoms appeared more than a week after exposure, so fighting the epidemic by treating sick people was a strategy of being perpetually behind. The disease could, however, be stamped out by vaccinating the healthy.

For the progressives, the model was the Kaiser’s Germany. As Willrich says, “German law required that every child be vaccinated in the first year of life, again during school, and yet again (for the men) upon entering military service.” Germany had “the world’s most vaccinated population and the one most free from smallpox.”

In the constitutionalist America of the 1890s, vaccination was a state, local, and individual responsibility. Americans, writes Willrich, were “the least vaccinated” people of any leading country. The federal government had a bureau, the Marine-Hospital Service, to tend to illnesses of sailors; it had been started in the 1790s under President John Adams. The Service had experts in smallpox control, but they were advisory only and expected local authorities to pay for local work.


In a smallpox outbreak, the public health enforcers would set up a “pesthouse,” usually at the edge of town. They would scour the community for the sick, paying particular attention to the shacks and tenements of blacks, immigrants, and the rest of the lower classes, where the disease tended to appear first. People in these groups were subjected, Willrich says, to “a level of intrusion and coercion that American governments did not dare ask of their better-off citizens.”

Public-health doctors, accompanied by police, would surround tenement blocks and search people’s rooms. The sick would be taken to the pesthouse under force of law, and everyone else in the building would be vaccinated.

Often a community would have a vaccination ordinance that on its face was not 100% compulsory. It would declare that no child could be admitted to public school without a vaccination mark. For adults, the choice might be vaccination or jail — and all prisoners were, of course, vaccinated.

The procedure involved jabbing the patient with a needle or an ivory point and inserting matter from an infected animal under the skin. Typically it was done on the upper arm; often that arm was sore for a week or two, so that people in manual trades couldn’t work. Sometimes, vaccination made people so sick it killed them — though not nearly as often as the disease did.

Such was the background for the outbreaks of 1899–1902. At that time, two new factors emerged. First, America had just had a war with Spain, and had conquered the Philippines, Cuba, and Puerto Rico. All these places were rife with disease — and their status as conquered possessions, Willrich writes, provided “unparalleled opportunities for the exercise of American health authority.”

There the authorities had no worries about people’s rights: “The army’s sanitary campaigns far exceeded the normal bounds of the police power, which by a long American constitutional tradition had always been assumed to originate in sovereign communities of free people.” And coercive methods worked. They worked dramatically.

The second new thing in 1899 was that most of the disease spreading in the United States was a less-virulent form that killed only 0.5 to 2% of those infected. That meant that many Americans denied the disease was smallpox, and didn't cooperate. The public health people recognized the disease for what it was, and they went after it with the usual disregard of personal rights — maybe even more disregard, because of "the efficiency of compulsion" overseas. And they stirred up, Willrich writes, "one of the most important civil liberties struggles of the 20th century."

Their fight was with the antivaccinationists, whom Willrich calls “personal liberty fundamentalists” who “challenged the expansion of the American state at the very point at which state power penetrated the skin.”

He continues:

“Many antivaccinationists had close intellectual and personal ties to a largely forgotten American tradition and subculture of libertarian radicalism. . . . The same men and women who joined antivaccination leagues tended to throw themselves into other maligned causes of their era, including anti-imperialism, women’s rights, antivivisection, vegetarianism, Henry George’s single tax, the fight against government censorship of ‘obscene’ materials and opposition to state eugenics.”

Often, antivaccinationists denied that vaccination worked. Many were followers of chiropractic or other non-standard medicine; this was also the time of “a battle over state medical licensing and the increasing dominance of ‘regular,’ allopathic medicine.” Of course, the antivaccinationists were wrong about vaccination not working, but they were not wrong when they said it was dangerous. In 1902, the city of Camden, New Jersey, required all children in the public schools to be vaccinated. Suddenly children started coming down with lockjaw. They were a small fraction of those who had been vaccinated, but all of them fell ill about 21 days after vaccination. Several died, and the press made a scandal of it.


Under the common law of the day, the vaccine maker — the H.K. Mulford Co. — was not liable to the parents, because it had no contract with them. Nor was the government liable, though the government had required the treatment. Thus, writes Willrich, “The arm of the state was protected; the arm of the citizen was not.”

The medical and public health establishment initially denied that the lockjaw had been caused by the vaccines. This was admitted only after a consultant to a rival vaccine maker, Parke-Davis, made an epidemiological case for it. Congress responded by quickly passing the Biologics Control Act of 1902, which ordered the inspection and licensing of vaccine producers, and required them to put an expiration date on their products. It was one of the first steps in the creation of the regulatory state.

The act calmed the public and drove some smaller companies out of business. The first two licensees were Parke-Davis (later absorbed by Pfizer) and Mulford (absorbed by Merck). And the purity of vaccines, which had been improving already, improved dramatically in the following decade.

The antivaccinationists turned to the courts in a fight about constitutional principles. The argument on the state’s side was that it had a duty to protect citizens from deadly invasion, which might be launched by either an alien army or an alien army of microbes. The argument on the other side was the individual’s right to control what pathogens were poked into his body, and, as the antivaccinationists’ lawsuit contended, his right to “take his chance” by going bare in a time of contagion.

They brought their case to the Supreme Judicial Court of Massachusetts, and lost. Citing the power of government to quarantine the sick and conscript soldiers in war, the court said, “The rights of individuals must yield, if necessary, when the welfare of the whole community is at stake.” Then they appealed to the US Supreme Court. In the same year in which the libertarian side won a landmark economic-liberty ruling in Lochner v. New York (1905), it lost, 7–2, in the vaccination case, Jacobson v. Massachusetts. Writing for the court, Justice John Harlan compared the case to defense against foreign invasion, and wrote, “There are manifold restraints to which every person is necessarily subject for the common good.”

It is an unlibertarian answer. The author of Pox thinks it was the right answer — and so do I, in this case. Smallpox was indeed a kind of invasion. In the 20th century it killed 300 million of Earth's people, and disfigured millions more. And though public health people can make similar arguments, and they do, about tobacco as a mass killer, I do not support their efforts to stamp out its use by ever more state intrusion. The difference is that with smallpox the individual’s refusal to be vaccinated put others in imminent danger of death. Collective action — compelled collective action — was the only way to defeat the invader. None of this is the case with tobacco.

Naturally, the progressives wanted to apply their wonderfully practical idea to all sorts of things. Referring to epidemics and the germ theory of disease, Willrich writes, “Progressives took up the germ theory as a powerful political metaphor. From the cities to the statehouses to Washington, the reformers decried prostitution, sweatshops, and poverty as ‘social ills.’ A stronger state, they said, held the ‘cure.’”

The antivaccinationists argued that compulsory vaccination would lead to other bad things. They made some fanciful statements; one wrote, “Why not chase people and circumcise them? Why not catch the people and give them a compulsory bath?” But they were right. Four years later, in 1906, Indiana enacted America’s first compulsory sterilization law. This was done in pursuit of another scientific, society-improving cause that excited progressives: eugenics. In 1927, in Buck v. Bell, the Supreme Court approved Virginia’s law of compulsory sterilization of the “feeble-minded.” That was the case in which Justice Oliver Wendell Holmes — supposedly the great champion of individual rights — pontificated: “The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes. Jacobson v. Massachusetts, 197 U.S. 11. Three generations of imbeciles are enough.”

Jacobson has been cited in many Supreme Court decisions, usually to approve state power. And the applications have not been confined to medicine. In 2004, Justice Clarence Thomas cited Jacobson in his dissent in Hamdi v. Rumsfeld, which was about a US citizen captured in Afghanistan. Thomas, often regarded as the most libertarian of the current justices, was arguing that Yaser Esam Hamdi had no right of habeas corpus and could be held on US soil indefinitely without charge.


The smallpox epidemic of 1899–1902 was a pivotal event institutionally as well as legally. The Biologics Act created more work for the Marine-Hospital Service, which a few years later became the US Public Health Service. The service’s Hygienic Laboratory, which was empowered to check the purity of commercial vaccines, later grew into the National Institutes of Health.

From the story told in Pox, you can construct rival political narratives. The progressives’ “we’re all in it together” formula about the need for science, federal authority, officials, regulation, and compulsion is there. And in the case of smallpox, progressivism put its strongest foot forward: it was warding off imminent mass death. Smallpox is one of the least promising subjects for the libertarian formula of individual rights and personal responsibility. Yet here you can also find the bad things that libertarian theory predicts: state power used arbitrarily and unevenly, collateral deaths, official denial of obvious truths, a precedent for worse things later on, and even the favoring of large companies over small ones.

I don’t like the progressive formula. But in the case of smallpox it fits, and I accept it, looking immediately for the limits on it.

Fortunately for liberty as well as health, few other diseases are as contagious and deadly as smallpox. Even Ebola and SARS — two contagious and deadly diseases of recent decades — were mostly fought by quickly isolating the sick. State power was necessary — some people were forcibly quarantined — but there were no mass vaccinations of the healthy.

Still, vaccination surfaces as an issue from time to time. It is, on its face, a violation of the rights of the person. So is quarantine. And yet it is prudent to allow both of them in some situations.

That is an admission that the libertarian formula doesn’t work all the time. Liberty is for rational adults in a minimally normal world. That is a limitation, but not such a big one. Surely it is better to design a society for people who can think, make decisions, and take responsibility, while in a few cases having to break those rules, than to live in a world designed for people defined as children of the state.


Editor's Note: Review of "Pox: An American History," by Michael Willrich. Penguin, 2011, 400 pages.





Individualism in Real Life

 | 

Bethany Hamilton is one of those perfect libertarian heroes. When she wants something, she goes after it. When things go wrong, she looks for a way to make it right. She doesn't whine or complain, even when the thing that goes wrong is the horror of a shark taking off her arm. She relies on herself, her family, and her God.

The movie about Bethany, Soul Surfer, has its predictably maudlin moments, fueled by Marco Beltrami's heavily emotional musical score, but don't let that put you off. If you are looking for a film to demonstrate libertarian principles to your friends, take them to Soul Surfer.

The film is based on the true story of Bethany, a competitive surfer with corporate sponsorship who was on her way to professional status when a huge shark bit off her arm. She returned to competitive surfing within a matter of months, and is now a professional surfer. She also seems to be a really nice girl. I learned that not only from the film, but also from the articles I have read about her.

And the Hamiltons seem to be a model libertarian family. They ignored traditional middle-class expectations in order to follow the dreams they made for themselves. All of them, parents and children alike, live to surf. When Bethany showed a great aptitude for surfing, her parents opted out of the government school system and educated her at home so she could take advantage of daytime surfing. After her injury, they did for her only the things she absolutely could not do for herself, encouraging her to quickly find new ways of managing her "ADLs" (activities of daily living).

The film portrays the Hamiltons (Dennis Quaid and Helen Hunt, parents) as a close-knit family within a close-knit community of surfers who understand the true nature of competition. True competition isn't cutthroat or unfair. In fact, unfettered competition has the power to make every person and every product better. Even Bethany's surfing nemesis, Malina Birch (Sonya Balmores), is portrayed as competitively solid. After she paddles full out toward a wave during a competition instead of kindly slowing down to allow for Bethany's injury, Bethany (AnnaSophia Robb) thanks Malina for treating her as an equal and pushing her to be her best. It's a great example of the good that competition can accomplish.

Instead of turning to government support groups to help her deal with her injury, Bethany turns to three free-market sources: family, business, and religion. When she returns to surfing, she rejects the judges' offer to give her a head start paddling past the surf. Instead, her father designs a handle to help her "deck dive" under the waves. When finances are a problem, a news magazine offers to provide her with a prosthetic arm in exchange for a story, and a surfboard manufacturer sponsors her with equipment and clothing. The youth leader at her church (Carrie Underwood) gives her a fuller perspective on her life by taking her on a service mission to Thailand after the tsunami that hit in 2004. There she learns the joy of serving others — a kind of work that earns her psychic benefits rather than monetary rewards. She isn't "giving back"; she is "taking" happiness.

These examples of self-reliance and nongovernmental solutions to problems raise the level of this emotionally predictable film to one that is philosophically satisfying — and well worth seeing.


Editor's Note: Review of "Soul Surfer," directed by Sean McNamara. Sony Pictures Entertainment, 2011, 106 minutes.





Atlas at Last

 | 

The John Galt Line has finally pulled out of the station, and is barreling across the country picking up hitchhikers who may be wondering what all the fuss is about. After a spectacular pre-release advertising campaign that included multiple premieres in major cities and pre-purchased ticket sales that encouraged nearly 300 screen owners to give the film a chance, Atlas Shrugged Part 1 opened on April 15 (Tax Day) as the third-highest-grossing film of the weekend (by per-screen average, not total sales).

"Mixed" is an understatement when describing the reviews. Professional critics on RottenTomatoes give it a 6% approval rating, perhaps the lowest rating I have ever seen for a film. Meanwhile, audiences gave it an unbelievable 85%.

In fact, the film doesn’t deserve either rating. It belongs somewhere in the low middle.

It is not as good as the 85% would indicate; audiences who saw it on opening weekend are rabid fans, bent on a mission to have everyone in America see the film and learn Ayn Rand's philosophy: free markets, free thinking, and self-reliance.

But it doesn't deserve the godawful 6% usually reserved for low-budget slasher flicks, either. It is not as bad as its low budget and relatively unknown cast of actors and producers would lead one to expect. It is respectable.

The cinematic quality is quite good, especially the outdoor scenes of Colorado and the special effects used to create the train and the bridge. The acting isn't bad, but it isn't great. Often I was painfully aware of Taylor Schilling being painfully aware of where Dagny should place her arm, or how Dagny should turn her head; I never felt that she embodied Dagny. Similarly, the background cast at the Reardens' anniversary party appeared to be made up of friends and family of the cast and crew (someone needed to teach them how NOT to mug for the camera).

For fans of Ayn Rand and Atlas Shrugged Part 1, the brightest compliment for this film is that it stays true to the first third of the book. (Parts 2 and 3 are expected to follow.) For fans of filmmaking, however, the biggest problem is that it stays true to the book. The film is dialogue-heavy, with very little action.

I’m not a Hollywood film reviewer, but I am a watcher and a reader. I know that books and films are two different genres, and their stories have to be presented in two different ways. Books are primarily cerebral; films are primarily visual. Books can focus on philosophy and conversation; films must focus on action. Books can take days or weeks to read; films must tell their story in a couple of hours. When adapting a book to film, streamlining is essential. Unfortunately, the words in this film are so dense that the ideas become lost.

Atlas Shrugged Part 1 contains some great quotations, but it is not a film that will convince anyone but the Rand faithful of the supremacy of the free market. It makes the same mistake that most libertarians do when espousing philosophy: it assumes that everyone already sees the problems in the way libertarians do. It does not sufficiently engage the non-business person in seeing the long-term effects for everyone when government intervenes in the market. I can hear my middle-class neighbors and colleagues saying "So what?" when Rearden (Grant Bowler) is forced to sell all but one of his businesses. "How is that going to hurt me?" they might wonder.

Even the conflict between Dagny's pure free-market economics and her brother James's (Matthew Marsden) collusion with government is insufficiently portrayed; Dagny seems to be simply getting around the stockholders when she takes over the John Galt Line. Moreover, she and Rearden can hardly be seen as icons of virtue when they violate a freely made and morally binding contract (his marriage vows) by jumping into bed together. Even more damning is Ellis Wyatt's decision to burn his oil fields rather than let anyone else enjoy the fruits of his labor. My middle-class neighbors would howl with outrage at this decision. In short, I don't see how this film will convince anyone that returning to free-market principles will improve our economy and our way of life. It seems like everyone in the film is cutting moral corners somewhere.

"Not bad" is faint praise for a movie that has been 50 years in the waiting. Unfortunately, business pressures caused it to be rushed through with only five weeks in the writing, and another five weeks in the filming. Business is often an exercise in compromise, and this film's production is a classic example. I think, however, that if The Fountainhead's Howard Roark had been the architect of this film, it would have been burned along with Ellis Wyatt's oil fields. It's good, but not good enough.


Editor's Note: Review of "Atlas Shrugged Part 1," directed by Paul Johansson. The Strike Productions, 2011, 97 minutes.





How to Unblock Your Writing

 | 

Wouldn't it be great to have limitless access to all the cells in your brain? To have a Google feature of sorts that would allow you to immediately call up just the right fact or memory that you need at any given moment, and the ability to synthesize and analyze all that information instantly?

That's what the drug NZT does for characters in the film Limitless, a mystery-thriller in theaters now. Eddie Morra (Bradley Cooper) is a sci-fi writer with a contract for a book, but a debilitating writer's block has prevented him from writing a single word. His life is a mess, his apartment is a mess, he's a mess, and his girlfriend Lindy (Abbie Cornish) has just broken up with him because of it.

Then he meets an old friend, Vernon Gant (Johnny Whitworth), who gives him a tab of NZT. Within minutes Eddie experiences a rush of insight and intelligence. He can remember books he thumbed through in college, television shows he saw in childhood, and math equations he learned in high school. Within hours he has cleaned his apartment, resolved his back rent, and written 50 scintillating pages of his novel. But the next day, he is back to being the same sloppy Eddie who can't write a single word. More NZT is in order.

If this story line sounds familiar, it is. Daniel Keyes explored this idea of artificially stimulated intelligence in his "Flowers for Algernon," which was later made into the movie Charly, starring Cliff Robertson. Phenomenon, a John Travolta film, uses the same premise. Even the spy spoof television show "Chuck" uses a similar premise when the title character is able to access information stored in "the Intersect," as though his brain were a remote computer. What makes this film stand out, however, is its jazzy musical score, witty script, engaging mystery, and skillful cast, not to mention its unexpected ending.

The film begins with the resounding thump of a sledgehammer against a metal door that jars the audience out of its seat. The throbbing musical score (by Paul Leonard-Morgan) drives the story forward, lifting the audience into a feel-good mood.

Eddie's brain on NZT is simulated artfully for the audience through special camera effects that make Eddie's consciousness seem to speed not just down the road but also through cars, down sidewalks, into buildings, and out of them again at dizzying roller-coaster speeds. When he begins to write, letters appear from his ceiling and drop like rain into his room. Later, when he starts using his newfound skill to make money in the stock market, his ceiling tiles become a ticker tape, composing themselves into the stock symbols that he should buy. Intensified color is also used to portray his intensified awareness; even Cooper's intensely clear blue eyes add to his character's altered sense of reality. These techniques are very effective in creating a sense of what Eddie is experiencing.

The story's suspense is driven by Eddie's shady relationships with a drug dealer (Whitworth), a loan shark (Andrew Howard), a stalker (Tomas Arana), and an investment banker (Robert De Niro). Eddie cleverly draws on all the memories stored in his brain to thwart the bad guys, but when he unwittingly comes across a dead body, he reacts in the way a normal person would — completely terrified, knocking over a chair as he collapses, then hiding in a corner with a golf club for protection as he calls the police. It's refreshing to see a character react as we probably would, instead of displaying unrealistic aplomb.

Limitless is a surprisingly entertaining film, with its fast pace, skilled cast, creative camera work, and interesting plot. Well worth the price of a ticket.


Editor's Note: Review of "Limitless," directed by Neil Burger. Many Rivers Productions, 2011, 105 minutes.



