Enforcers of Health


Libertarians have uneasy thoughts about the enforcers of public health. In my state, the public health authorities banned tobacco billboards — an act that some thought a violation of the First Amendment. This year, they pushed in the legislature to ban the sale of flavored cigars and pipe tobacco. In both cases, they have said that their aim is to make tobacco less attractive to children.

A century ago the fight was much more immediately vital. Then people of libertarian mind were fighting with the public health enforcers over the control of smallpox.

That disease was eradicated in the 20th century by campaigns of vaccination, many of them not entirely voluntary. In the United States, a turning point came in the epidemic of 1899 to 1902, when outbreaks of the disease prompted a political battle over compulsory vaccination. The story is told in legal writer Michael Willrich’s new book, Pox.

The push for compulsory vaccination grew out of the facts of the disease itself. It was a killer. Of people infected by the main strain of the disease, between 20 and 30% died. Smallpox was highly contagious, so that an infected person was a threat to everyone around him who was unvaccinated. First symptoms appeared more than a week after exposure, so fighting the epidemic by treating sick people was a strategy of being perpetually behind. The disease could, however, be stamped out by vaccinating the healthy.

For the progressives, the model was the Kaiser’s Germany. As Willrich says, “German law required that every child be vaccinated in the first year of life, again during school, and yet again (for the men) upon entering military service.” Germany had “the world’s most vaccinated population and the one most free from smallpox.”

In the constitutionalist America of the 1890s, vaccination was a state, local, and individual responsibility. Americans, writes Willrich, were “the least vaccinated” people of any leading country. The federal government had a bureau, the Marine-Hospital Service, to tend to illnesses of sailors; it had been started in the 1790s under President John Adams. The Service had experts in smallpox control, but they were advisory only and expected local authorities to pay for local work.


In a smallpox outbreak, the public health enforcers would set up a “pesthouse,” usually at the edge of town. They would scour the community for the sick, paying particular attention to the shacks and tenements of blacks, immigrants, and the rest of the lower classes, where the disease tended to appear first. People in these groups were subjected, Willrich says, to “a level of intrusion and coercion that American governments did not dare ask of their better-off citizens.”

Public-health doctors, accompanied by police, would surround tenement blocks and search people’s rooms. The sick would be taken to the pesthouse under force of law, and everyone else in the building would be vaccinated.

Often a community would have a vaccination ordinance that on its face was not 100% compulsory. It would declare that no child could be admitted to public school without a vaccination mark. For adults, the choice might be vaccination or jail — and all prisoners were, of course, vaccinated.

The procedure involved jabbing the patient with a needle or an ivory point and inserting matter from an infected animal under the skin. Typically it was done on the upper arm; often that arm was sore for a week or two, so that people in manual trades couldn’t work. Sometimes, vaccination made people so sick it killed them — though not nearly as often as the disease did.

Such was the background for the outbreaks of 1899–1902. At that time, two new factors emerged. First, America had just had a war with Spain, and had conquered the Philippines, Cuba, and Puerto Rico. All these places were rife with disease — and their status as conquered possessions, Willrich writes, provided “unparalleled opportunities for the exercise of American health authority.”

There the authorities had no worries about people’s rights: “The army’s sanitary campaigns far exceeded the normal bounds of the police power, which by a long American constitutional tradition had always been assumed to originate in sovereign communities of free people.” And coercive methods worked. They worked dramatically.

The second new thing in 1899 was that most of the disease spreading in the United States was a less-virulent form that killed only 0.5 to 2% of those infected. That meant that many Americans denied the disease was smallpox, and didn’t cooperate. The public health people recognized the disease for what it was, and they went after it with the usual disregard of personal rights — maybe even more disregard, because of “the efficiency of compulsion” overseas. And they stirred up, Willrich writes, “one of the most important civil liberties struggles of the 20th century.”

Their fight was with the antivaccinationists, whom Willrich calls “personal liberty fundamentalists” who “challenged the expansion of the American state at the very point at which state power penetrated the skin.”

He continues:

“Many antivaccinationists had close intellectual and personal ties to a largely forgotten American tradition and subculture of libertarian radicalism. . . . The same men and women who joined antivaccination leagues tended to throw themselves into other maligned causes of their era, including anti-imperialism, women’s rights, antivivisection, vegetarianism, Henry George’s single tax, the fight against government censorship of ‘obscene’ materials and opposition to state eugenics.”

Often, antivaccinationists denied that vaccination worked. Many were followers of chiropractic or other non-standard medicine; this was also the time of “a battle over state medical licensing and the increasing dominance of ‘regular,’ allopathic medicine.” Of course, the antivaccinationists were wrong about vaccination not working, but they were not wrong when they said it was dangerous. In 1902, the city of Camden, New Jersey, required all children in the public schools to be vaccinated. Suddenly children started coming down with lockjaw. They were a small fraction of those who had been vaccinated, but all of them fell ill about 21 days after vaccination. Several died, and the press made a scandal of it.


Under the common law of the day, the vaccine maker — the H.K. Mulford Co. — was not liable to the parents, because it had no contract with them. Nor was the government liable, though the government had required the treatment. Thus, writes Willrich, “The arm of the state was protected; the arm of the citizen was not.”

The medical and public health establishment initially denied that the lockjaw had been caused by the vaccines. This was admitted only after a consultant to a rival vaccine maker, Parke-Davis, made an epidemiological case for it. Congress responded by quickly passing the Biologics Control Act of 1902, which ordered the inspection and licensing of vaccine producers, and required them to put an expiration date on their products. It was one of the first steps in the creation of the regulatory state.

The act calmed the public and drove some smaller companies out of business. The first two licensees were Parke-Davis (later absorbed by Pfizer) and Mulford (absorbed by Merck). And the purity of vaccines, which had been improving already, improved dramatically in the following decade.

The antivaccinationists turned to the courts in a fight about constitutional principles. The argument on the state’s side was that it had a duty to protect citizens from deadly invasion, which might be launched by either an alien army or an alien army of microbes. The argument on the other side was the individual’s right to control what pathogens were poked into his body, and, as the antivaccinationists’ lawsuit contended, his right to “take his chance” by going bare in a time of contagion.

They brought their case to the Supreme Judicial Court of Massachusetts, and lost. Citing the power of government to quarantine the sick and conscript soldiers in war, the court said, “The rights of individuals must yield, if necessary, when the welfare of the whole community is at stake.” Then they appealed to the US Supreme Court. In the same year in which the libertarian side won a landmark economic-liberty ruling in Lochner v. New York (1905), it lost, 7–2, in the vaccination case, Jacobson v. Massachusetts. Writing for the court, Justice John Harlan compared the case to defense against foreign invasion, and wrote, “There are manifold restraints to which every person is necessarily subject for the common good.”

It is an unlibertarian answer. The author of Pox thinks it was the right answer — and so do I, in this case. Smallpox was indeed a kind of invasion. In the 20th century it killed 300 million of Earth's people, and disfigured millions more. And though public health people can make similar arguments, and they do, about tobacco as a mass killer, I do not support their efforts to stamp out its use by ever more state intrusion. The difference is that with smallpox the individual’s refusal to be vaccinated put others in imminent danger of death. Collective action — compelled collective action — was the only way to defeat the invader. None of this is the case with tobacco.

Naturally, the progressives wanted to apply their wonderfully practical idea to all sorts of things. Referring to epidemics and the germ theory of disease, Willrich writes, “Progressives took up the germ theory as a powerful political metaphor. From the cities to the statehouses to Washington, the reformers decried prostitution, sweatshops, and poverty as ‘social ills.’ A stronger state, they said, held the ‘cure.’ ”

The antivaccinationists argued that compulsory vaccination would lead to other bad things. They made some fanciful statements; one wrote, “Why not chase people and circumcise them? Why not catch the people and give them a compulsory bath?” But they were right. Four years later, in 1906, Indiana enacted America’s first compulsory sterilization law. This was done in pursuit of another scientific, society-improving cause that excited progressives: eugenics. In 1927, in Buck v. Bell, the Supreme Court approved Virginia’s law of compulsory sterilization of the “feeble-minded.” That was the case in which Justice Oliver Wendell Holmes — supposedly the great champion of individual rights — pontificated: “The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes. Jacobson v. Massachusetts, 197 U.S. 11. Three generations of imbeciles are enough.”

Jacobson has been cited in many Supreme Court decisions, usually to approve state power. And the applications have not been confined to medicine. In 2004, Justice Clarence Thomas cited Jacobson in his dissent in Hamdi v. Rumsfeld, which was about a US citizen captured in Afghanistan. Thomas, often regarded as the most libertarian of the current justices, was arguing that Yaser Esam Hamdi had no right of habeas corpus and could be held on US soil indefinitely without charge.


The smallpox epidemic of 1899–1902 was a pivotal event institutionally as well as legally. The Biologics Act created more work for the Marine-Hospital Service, which a few years later became the US Public Health Service. The service’s Hygienic Laboratory, which was empowered to check the purity of commercial vaccines, later grew into the National Institutes of Health.

From the story told in Pox, you can construct rival political narratives. The progressives’ “we’re all in it together” formula about the need for science, federal authority, officials, regulation, and compulsion is there. And in the case of smallpox, progressivism put its strongest foot forward: it was warding off imminent mass death. Smallpox is one of the least promising subjects for the libertarian formula of individual rights and personal responsibility. Yet here you can also find the bad things that libertarian theory predicts: state power used arbitrarily and unevenly, collateral deaths, official denial of obvious truths, a precedent for worse things later on, and even the favoring of large companies over small ones.

I don’t like the progressive formula. But in the case of smallpox it fits, and I accept it, looking immediately for the limits on it.

Fortunately for liberty as well as health, few other diseases are as contagious and deadly as smallpox. Even Ebola and SARS — two contagious and deadly diseases of recent decades — were mostly fought by quickly isolating the sick. State power was necessary — some people were forcibly quarantined — but there were no mass vaccinations of the healthy.

Still, vaccination surfaces as an issue from time to time. It is, on its face, a violation of the rights of the person. So is quarantine. And yet it is prudent to allow both of them in some situations.

That is an admission that the libertarian formula doesn’t work all the time. Liberty is for rational adults in a minimally normal world. That is a limitation, but not such a big one. Surely it is better to design a society for people who can think, make decisions, and take responsibility, while in a few cases having to break those rules, than to live in a world designed for people defined as children of the state.


Editor's Note: Review of "Pox: An American History," by Michael Willrich. Penguin, 2011, 400 pages.





Individualism in Real Life


Bethany Hamilton is one of those perfect libertarian heroes. When she wants something, she goes after it. When things go wrong, she looks for a way to make it right. She doesn't whine or complain, even when the thing that goes wrong is the horror of a shark taking off her arm. She relies on herself, her family, and her God.

The movie about Bethany, Soul Surfer, has its predictably maudlin moments, fueled by Marco Beltrami's heavily emotional musical score, but don't let that put you off. If you are looking for a film to demonstrate libertarian principles to your friends, take them to Soul Surfer.

The film is based on the true story of Bethany, a competitive surfer with corporate sponsorship who was on her way to professional status when a huge shark bit off her arm. She returned to competitive surfing within a matter of months, and is now a professional surfer. She also seems to be a really nice girl. I learned that not only from the film, but also from the articles I have read about her.

And the Hamiltons seem to be a model libertarian family. They ignored traditional middle-class expectations in order to follow the dreams they made for themselves. All of them, parents and children alike, live to surf. When Bethany showed a great aptitude for surfing, her parents opted out of the government school system and educated her at home so she could take advantage of daytime surfing. After her injury, they did for her only the things she absolutely could not do for herself, encouraging her quickly to find new ways of managing her "ADLs" (activities of daily living).

The film portrays the Hamiltons (Dennis Quaid and Helen Hunt, parents) as a close-knit family within a close-knit community of surfers who understand the true nature of competition. True competition isn't cutthroat or unfair. In fact, unfettered competition has the power to make every person and every product better. Even Bethany's surfing nemesis, Malina Birch (Sonya Balmores), is portrayed as competitively solid. After she paddles full out toward a wave during a competition instead of kindly slowing down to allow for Bethany's injury, Bethany (AnnaSophia Robb) thanks Malina for treating her as an equal and pushing her to be her best. It's a great example of the good that competition can accomplish.

Instead of turning to government support groups to help her deal with her injury, Bethany turns to three free-market sources: family, business, and religion. When she returns to surfing, she rejects the judges' offer to give her a head start paddling past the surf. Instead, her father designs a handle to help her "deck dive" under the waves. When finances are a problem, a news magazine offers to provide her with a prosthetic arm in exchange for a story, and a surfboard manufacturer sponsors her with equipment and clothing. The youth leader at her church (Carrie Underwood) gives her a fuller perspective on her life by taking her on a service mission to Thailand after the tsunami that hit in 2004. There she learns the joy of serving others — a kind of work that earns her psychic benefits rather than monetary rewards. She isn't "giving back"; she is "taking" happiness.

These examples of self-reliance and nongovernmental solutions to problems raise the level of this emotionally predictable film to one that is philosophically satisfying — and well worth seeing.


Editor's Note: Review of "Soul Surfer," directed by Sean McNamara. Sony Pictures Entertainment, 2011, 106 minutes.





Atlas at Last


The John Galt Line has finally pulled out of the station, and is barreling across the country picking up hitchhikers who may be wondering what all the fuss is about. After a spectacular pre-release advertising campaign that included multiple premieres in major cities and pre-purchased ticket sales that encouraged nearly 300 screen owners to give the film a chance, Atlas Shrugged Part 1 opened on April 15 (Tax Day) as the third highest-grossing film of the weekend (by per-screen average, not total sales).

"Mixed" is an understatement when describing the reviews. Professional critics on RottenTomatoes give it a 6% approval rating, perhaps the lowest rating I have ever seen for a film. Meanwhile, audiences gave it an unbelievable 85%.

In fact, the film doesn’t deserve either rating. It belongs somewhere in the low middle.

It is not as good as the 85% would indicate; audiences who saw it on opening weekend are rabid fans, bent on a mission to have everyone in America see the film and learn Ayn Rand's philosophy: free markets, free thinking, and self-reliance.

But it doesn't deserve the godawful 6% usually reserved for low-budget slasher flicks, either. It is not as bad as its low budget and relatively unknown cast of actors and producers would cause one to expect. It is respectable.

The cinematic quality is quite good, especially the outdoor scenes of Colorado and the special effects used to create the train and the bridge. The acting isn't bad, but it isn't great. Often I was painfully aware of Taylor Schilling being painfully aware of where Dagny should place her arm, or how Dagny should turn her head; I never felt that she embodied Dagny. Similarly, the background cast at the Reardens' anniversary party appeared to be made up of friends and family of the cast and crew (someone needed to teach them how NOT to mug for the camera).

For fans of Ayn Rand and Atlas Shrugged Part 1, the brightest compliment for this film is that it stays true to the first third of the book. (Parts 2 and 3 are expected to follow.) For fans of filmmaking, however, the biggest problem is that it stays true to the book. The film is dialogue-heavy, with very little action.

I’m not a Hollywood film reviewer; but I’m a watcher and a reader. I know that books and films are two different genres, and their stories have to be presented in two different ways. Books are primarily cerebral; films are primarily visual. Books can focus on philosophy and conversation; films must focus on action. Books can take days or weeks to read; films must tell their story in a couple of hours. When adapting a book to film, streamlining is essential. Unfortunately, the words in this film are so dense that the ideas become lost.

Atlas Shrugged Part 1 contains some great quotations, but it is not a film that will convince anyone but the Rand faithful of the supremacy of the free market. It makes the same mistake that most libertarians do when espousing philosophy: it assumes that everyone already sees the problems in the way libertarians do. It does not sufficiently engage the non-business person in seeing the long-term effects for everyone when government intervenes in the market. I can hear my middle-class neighbors and colleagues saying "So what?" when Rearden (Grant Bowler) is forced to sell all but one of his businesses. "How is that going to hurt me?" they might wonder.

Even the conflict between Dagny's pure free-market economics and her brother James's (Matthew Marsden) collusion with government is insufficiently portrayed; Dagny seems to be simply getting around the stockholders when she takes over the John Galt Line. Moreover, she and Rearden can hardly be seen as icons of virtue when they violate a freely made and morally binding contract (his marriage vows) by jumping into bed together. Even more damning is Ellis Wyatt's decision to burn his oil fields rather than let anyone else enjoy the fruits of his labor. My middle-class neighbors would howl with outrage at this decision. In short, I don't see how this film will convince anyone that returning to free-market principles will improve our economy and our way of life. It seems like everyone in the film is cutting moral corners somewhere.

"Not bad" is faint praise for a movie that has been 50 years in the waiting. Unfortunately, business pressures caused it to be rushed through with only five weeks in the writing, and another five weeks in the filming. Business is often an exercise in compromise, and this film's production is a classic example. I think, however, that if The Fountainhead's Howard Roark had been the architect of this film, it would have been burned along with Ellis Wyatt's oil fields. It's good, but not good enough.


Editor's Note: Review of "Atlas Shrugged Part 1" (2011), directed by Paul Johannson. The Strike Productions, 2011, 97 minutes.





How to Unblock Your Writing


Wouldn't it be great to have limitless access to all the cells in your brain? To have a Google feature of sorts that would allow you to immediately call up just the right fact or memory that you need at any given moment, and the ability to synthesize and analyze all that information instantly?

That's what the drug NZT does for characters in the film Limitless, a mystery-thriller in theaters now. Eddie Morra (Bradley Cooper) is a sci-fi writer with a contract for a book, but a debilitating writer's block has prevented him from writing a single word. His life is a mess, his apartment is a mess, he's a mess, and his girlfriend Lindy (Abbie Cornish) has just broken up with him because of it.

Then he meets an old friend, Vernon Gant (Johnny Whitworth), who gives him a tab of NZT. Within minutes Eddie experiences a rush of insight and intelligence. He can remember books he thumbed through in college, television shows he saw in childhood, and math equations he learned in high school. Within hours he has cleaned his apartment, resolved his back rent, and written 50 scintillating pages of his novel. But the next day, he is back to being the same sloppy Eddie who can't write a single word. More NZT is in order.

If this story line sounds familiar, it is. Daniel Keyes explored this idea of artificially stimulated intelligence in his "Flowers for Algernon," which was later made into the movie Charly, starring Cliff Robertson. Phenomenon, a John Travolta film, uses the same premise. Even the spy spoof television show "Chuck" uses a similar premise when the title character is able to access information stored in "the intersect," as though his brain were a remote computer. What makes this film stand out, however, is its jazzy musical score, witty script, engaging mystery, and skillful cast, not to mention its unexpected ending.

The film begins with the resounding thump of a sledgehammer against a metal door that jars the audience out of its seat. The throbbing musical score (by Paul Leonard-Morgan) drives the story forward, lifting the audience into a feel-good mood.

Eddie's brain on NZT is simulated artfully for the audience through special camera effects that make Eddie's consciousness seem to speed not just down the road but also through cars, down sidewalks, into buildings, and out of them again at dizzying roller-coaster speeds. When he begins to write, letters appear from his ceiling and drop like rain into his room. Later, when he starts using his newfound skill to make money in the stock market, his ceiling tiles become a ticker tape, composing themselves into the stock symbols that he should buy. Intensified color is also used to portray his intensified awareness; even Cooper's intensely clear blue eyes add to his character's altered sense of reality. These techniques are very effective in creating a sense of what Eddie is experiencing.

The story's suspense is driven by Eddie's shady relationships with a drug dealer (Whitworth), a loan shark (Andrew Howard), a stalker (Tomas Arana), and an investment banker (Robert De Niro). Eddie cleverly draws on all the memories stored in his brain to thwart the bad guys, but when he unwittingly comes across a dead body, he reacts in the way a normal person would — completely terrified, knocking over a chair as he collapses, then hiding in a corner with a golf club for protection as he calls the police. It's refreshing to see a character react as we probably would, instead of displaying unrealistic aplomb.

Limitless is a surprisingly entertaining film, with its fast pace, skilled cast, creative camera work, and interesting plot. Well worth the price of a ticket.


Editor's Note: Review of "Limitless," directed by Neil Burger. Many Rivers Productions, 2011, 105 minutes.





Much More Than Moore


The hardest part of making a film is not writing the script, hiring the cast, choosing the locations, planning the shots, or editing the footage down to a moving and entertaining feature that tells the story in under two hours. The hardest part of filmmaking is finding the funding. It takes money to make a movie. Lots of money.

Ideally, the consumers (moviegoers) should pay for the product (the movie on the screen). And ultimately, they do, $10 at a time. But filmmakers need money upfront to make the product. Piles and piles of money. This is just Capitalism 101 for libertarians, and it makes me stare in disbelief when Americans glibly criticize the capitalist system for being corrupt and selfish. What could be less selfish than deciding to forego current consumption in order to invest in someone else's dream?

From the earliest days of filmmaking, films have been financed in several ways: using personal funds, either from one's own pocket or that of a rich friend or relative; applying for business loans; studio investment; and selling product placement. In recent years, product placement has become increasingly important as a way to fund the burgeoning cost of producing a movie, where a million dollars can be considered little more than chump change.

Morgan Spurlock, the new darling of the political-agenda documentary, exposes the process of selling embedded advertising in his new film, The Greatest Movie Ever Sold, which opens later this month. But, as I said, product placement is nothing new. From the start, radio programs and TV shows were "brought to you by" a particular sponsor; product placement was simply a way of getting the product into the show itself. Today product placement is a multibillion-dollar undertaking. Also called "co-promotion" and "brand partnering," this marriage of convenience provides money for the movie makers and brand recognition for the product. According to the documentary, businesses spent $412 billion last year on product placement ads, from the Coke glasses placed in front of the judges on American Idol, to the Chrysler 300s driven by Jack Bauer on 24 (after Ford withdrew its F-150s), to the kind of phones that Charlie's Angels carry.

The film is informative, intelligent, and laugh-out-loud funny, largely because of Spurlock's dry, self-deprecating humor as he goes about looking for sponsors for his film, which is simply a movie about Spurlock looking for sponsors for his film. Where Michael Moore made his mark in documentaries by humiliating his subjects through ambush journalism, Spurlock is gleefully upfront about what he is doing, treating his subjects with seriocomic respect and appreciation.


Spurlock doesn't just walk into business meetings unprepared, and beg for money. He does his homework, as good filmmakers (or any salesperson) should. He begins with a psychological evaluation to determine his "Brand Personality," which helps him identify what kinds of products would be a good fit for his film. Not surprisingly, his brand personality is "mindful/playful," so he looks for products whose makers think of themselves as appealing to consumers who are mindful and playful. He arrives at meetings with high quality storyboards and mockups to make his pitch. He listens carefully to the producers and accommodates their concerns. After all, if their needs aren't met, they won't fund the film. They are his consumers as much as the ticket buyers at the multiplex will be.

The film is liberally peppered with products, all of them described, worn, eaten, or presented with Spurlock's respectful glee. We all know we're being had, but he does it so openly that he makes us enjoy being had. Even his attorney is a product placed in the movie; after discussing a contract, Spurlock asks how much the consultation will cost him, and the attorney replies, "I charge $770 an hour. But the bigger question is, how much is it going to cost me to be in your movie?" (I wrote the attorney's name in my notes, but I'm not repeating it here. He hasn't paid Liberty anything to be mentioned in our magazine . . .)

Spurlock likens his movie to a NASCAR racer, and accordingly wears a suit covered in his sponsors' logos for interviews. The official poster shows his naked body tattooed with the logos, with a poster board of the film's title strategically placed across his crotch.  (Nudity sells, but I guess his manhood didn't pay for product placement.)

The film is funny but also informative. Despite Spurlock's gleeful presentation, he offers many serious ideas about product placement in movies and about advertising in general. For example, he discusses the potential loss of artistic control when the sponsoring company wants things done a certain way. This isn't new; Philip Morris reportedly told Lucy and Desi they had to be seen smoking more frequently on "I Love Lucy," the most popular show of the 1950s, and they complied. A filmmaker has to weigh the money against the control, and decide how much to compromise.

Truth in advertising is also discussed. Spurlock visits São Paulo, Brazil, where outdoor advertising has been completely banned by a new "Clean City Law." Now store owners rely more heavily on word-of-mouth referrals for new customers, which may indeed be a more honest form of testimonial, but highly inefficient — and inefficiency is generally passed along to consumers in the form of higher prices. In the film, local Brazilians glowingly praise their ability to "see nature" now that the billboards are gone, as Spurlock's cameras pan to the high-rise buildings that overpower the few shrubs and trees in the downtown area and block the view of the sky. Subtle, and effective.

Spurlock also interviews several people to get their opinions of truth in advertising. Ironically, one of the interviewees has bright magenta hair taken from a bottle, another has the unmistakable ridge of breast augmentation, another is wearing a sandwich board advertising a nearby store, while a fourth is pumping gas at the chain that has made a brand-partnering deal with Spurlock. Once again Spurlock is making gentle fun of his subjects, and we laugh gleefully along with him. (But I'm still not willing to reveal the name of the gas station until they pony up with some advertising money for Liberty.)

The Greatest Movie Ever Sold may not be the greatest documentary ever made, but it is mindful and playful, like its maker. If it comes to your town, don't miss it.


Editor's Note: Review of "The Greatest Movie Ever Sold," directed by Morgan Spurlock. Snoot Entertainment/Warrior Poet, 2011, 90 minutes.





Bones Crunch!


Liam Neeson fairly burst onto the big screen in 1993 with his compelling performance as Oskar Schindler, the man who saved over a thousand Jews from Nazi execution, in the Oscar-winning Schindler's List. It wasn't his first film by any means, but it was his first big film, and it garnered him an Oscar nomination for best actor. From there his career turned in the direction one would expect for an Irish-born, classically trained actor with a resonant voice and proclivity for accents. He played characters with stature: Rob Roy, Michael Collins, Alfred Kinsey, Jean Valjean, the god Zeus. He was the voice of Aslan. He also had fatherly, mentoring roles in such films as Batman Begins and Star Wars Episode I.

So how did this stately-but-slightly-sagging, now-middle-aged man suddenly morph into an action figure? A figure who has become a number one draw at the box office?

It started with Taken (2008), a film in which he plays a father determined to rescue his kidnapped daughter. Not an unlikely reach for a man his age — except that his character, Bryan Mills, is a retired CIA agent who is highly trained in combat and espionage. Taken was 10% distraught father's angst and 90% thrill ride, with enough hand-to-hand combat, gunfights, dead bodies, and car chases in a 93-minute thriller to satisfy the most avid video game player. (And that's what many of these new thrillers have become: video games without the controllers.)

From there Neeson has voiced a character in an actual video game (Fallout 3), and fought the bad guys with The A-Team. Now he is taking on a horde of assassins in the new psychological conspiracy thriller, Unknown.

With a more engaging plot than most action movies, Unknown offers a satisfying evening's entertainment. The story has numerous unexpected twists and subtle clues, with enough red herrings to keep even this staid reviewer off balance. Neeson plays Martin Harris, a biochemist arriving in Berlin with his beautiful wife (January Jones) to present a paper at a scientific conference. When his briefcase is accidentally left behind at the airport, he grabs a cab to retrieve it and ends up in the river when the driver swerves to avoid some falling debris. By the time Harris returns to the hotel, after spending four days in a coma, another man has taken his place, and his wife does not recognize him. Creepy men start following him. Cars race through traffic with guns blazing. Bodies crash through walls in the throes of battle. Bones crunch. Bullets fly. Blood flows. Necks snap. And in the middle of it all, Liam Neeson, looking more like Al Bundy than James Bond, is the unlikely action hero.

I don't get it. But I like it.


Editor's Note: Review of "Unknown," directed by Jaume Collet-Serra. Warner Brothers, 2010, 113 minutes.





More Than Just a Pretty Film


The Illusionist is a lovely animated movie by French filmmaker Sylvain Chomet — a movie that, despite its beauty, has a disturbing message.

Its leading characters are a kindly vaudeville magician and the young working girl whom he befriends. The story is sweet and full of pathos, as the older gentleman sacrifices his own comfort and well-being to please the girl. Appropriately, the film is drawn in the soft-edged, old-school style that predates Pixar. Its French pedigree is obvious, from its watercolor backgrounds and exaggerated, non-realistic faces to its impressionistic musical score. The characters communicate with each other through a combination of mime and an odd pseudo-language reminiscent of the way adults speak in the old "Peanuts" TV specials. This adds to the dreamlike quality of the story, although it can be off-putting to those who aren't fans of French animation.

Based on a story by Jacques Tati (1907–1982), the famous French filmmaker, The Illusionist is intended to show the deep father-daughter connection between a lonely old man and an equally lonely young girl. Metaphorically, however, the film offers a powerful, though certainly unintentional, warning look at the relationship between the working class and the welfare class. The magician's relationship with the young cleaning girl begins innocently and sweetly. When her bar of soap slips away from her while she is cleaning the floors, he picks it up and "magically" turns it into a fancy box of perfumed hand soap, offering it to her with a flourish. She is thrilled. The next day she washes his shirt to show her appreciation, and he "magically" produces a coin from her ear to thank her — the way kindly uncles do when they visit little nieces and nephews. Noticing that the sole of her shabby shoe is flapping wildly as she walks, he buys her a pair of bright red shoes.

Before long the magician's gig at the local vaudeville theater ends, and he must move on to the next town. Without being invited, the girl follows him. When the conductor asks for her ticket, she points to the old man, miming her expectation that he will produce a ticket for her out of thin air. Not wanting to disappoint her, the poor man complies, again with a magical flourish. Throughout the rest of the film the girl stays with the man, pointing to new goodies that she wants — a new coat, high-heeled shoes, a new dress, and a coin from her ear every time they part. The man takes on extra jobs to pay for her increasing demands. He sleeps on the couch so she can have the single bedroom in his tiny apartment. Sadly, the girl never catches on to what is happening to the man. You can probably guess where this leads. Small-time magicians, like golden geese, eventually give out.

The film offers a powerful demonstration of what has happened to a whole generation of people who have grown up under the welfare state. They have no idea where money comes from, or how to earn it. They turn to the government for housing, food stamps, education, medical care, and even entertainment in the form of parks and recreation. They seem to think that money can appear out of thin air, and that people who work owe them all the goodies they want. Like the man in the film, tax-paying Americans are becoming threadbare and exhausted. The demands on them are too many, and they're tired of not being appreciated for meeting those demands. At some point they are going to stop working — also like the man in the film. What then?

A friend who teaches middle school in the Bronx asked her students to write an essay about what they want to be when they grow up — pretty standard fare for a middle-school essay. One young man wrote about going to college, becoming a lawyer, and representing clients in court. "I'll make a lot of money, and I'll wear nice suits and carry a briefcase," he dreamed. But he ended his essay with this chilling observation: "If I do that, I'll probably earn too much money and I'll lose my housing and food stamps. So maybe that's not such a good idea." What a self-defeating decision! Yet I see that idea in practice every day as I work with people from Yonkers and the Bronx. They are so afraid of losing their tiny apartments in crumbling buildings on potholed streets in seedy neighborhoods that they won't even consider moving to a different state with a lower cost of living, where they could get a job and provide for their families themselves.

How surprising, that the demise of the American dream would be so skillfully and artistically presented in the form of a French animated film. It is well worth sharing with friends as a cautionary tale of pending disaster.


Editor's Note: Review of "The Illusionist," directed by Sylvain Chomet. Pathé-Django, 2010, 90 minutes.





Assessing the Bailouts


Here are some libertarian warnings about the government rescue of General Motors:

“The current restructuring plan calls for the U.S. Treasury Department to have controlling interest in General Motors, which amounts to absolute nationalization. In GM's headquarters in Detroit there is a cluster of bureaucrats from the government's task force telling GM how to run its business. . . . Furthermore, the White House fired General Motors Chairman and CEO Rick Wagoner. When the executive branch intervenes in a private business and ousts management, bailout or not, it is a staggering violation of the American ideal of free enterprise. This sets a precedent for unlimited government trampling over the private sector. On March 30th, Obama said, ‘Let me be clear. The United States government has no interest in running GM. We have no intention of running GM.’ If that's the case — and we know it's not — then why scoop up majority ownership?” (Karen DeCoster, LewRockwell.com, May 23, 2009)

And:

“Will GM be run as profitably and efficiently as Amtrak? Will GM be paid not to produce, like the agricultural sector? Will it feed into an economic bubble like Fannie Mae and Freddie Mac? Will it boast the negligible oversight and waste of the so-called ‘stimulus’ package? Will it feature the fiscal irresponsibility of Social Security? Or will we see the runaway costs of Medicaid?” So many options. (David Harsanyi, Reason.com, June 3, 2009)

And:

“GM’s bankruptcy announcement today . . . might well be remembered as the company’s last act of capitalism. If GM emerges from bankruptcy organized and governed by the plan created by the Obama administration, it is impossible to see how free markets will have anything to do with the U.S. auto industry. With taxpayers on the hook for $50 billion (at a minimum), the administration will do whatever it has to — including tilting the playing field with policies that induce consumers to buy GM or hamstring GM’s competition or subsidize its costs — in order for GM to succeed.” (Daniel Ikenson, Cato@Liberty, June 1, 2009)

And:

“The more the Treasury lends, the more GM will come to be Government Motors. In Obama's America, that will mean becoming a development project for electric cars, plug-in hybrids, or whatever it is the politicians want. The result might be wonderful, but the history of state-controlled companies suggests boondoggles are more likely. The best thing is to let GM do what failing companies have always done: reorganize if possible, liquidate if necessary. . . . Let it go, and let investors put their money where the odds are better.” (Bruce Ramsey, The Seattle Times, April 1, 2009)

That last excerpt is, of course, by me. I didn’t condemn the firing of Rick Wagoner, because I would have fired him, too. But on the main point — that so much government money risked turning GM into a money-losing government pet — I agreed.

It hasn’t happened that way. As I write, the new GM has had three quarters in the black, the latest one earning five and a half cents of operating profit on each dollar of sales. That’s quite respectable. In November 2010, GM became a public company again with a stock offering. GM planned to offer the stock at $26 to $29, depending on the market, but demand was so strong that it sold the shares at $33; and as I write, the stock is trading above that. The stock offering reduced the Treasury’s stake in GM from 60.8% to one-third. If the US government can sell its remaining stock at $53, it will break even.

Chrysler hasn’t done as well, but it is also in the black, and looks like it will make it.

Of course, disaster could yet strike and turn the pessimists into prophets. But let’s be honest: so far, the affair has turned out much better than we feared. We ought to know why, especially if we expect to oppose the next bailout.

One place to start is Steven Rattner’s book, Overhaul, subtitled An Insider’s Account of the Obama Administration’s Emergency Rescue of the Auto Industry. Rattner, 58, was President Obama’s “car czar” on the GM and Chrysler rescues. Rattner is a Wall Street guy; he was a founder and partner of the Quadrangle Group, a leveraged buyout firm. Before that he was a reporter for the New York Times, which explains why his book reads so well.

Of course, you are free not to believe his account. I haven’t seen it contradicted, so I take it at face value. Reasons may come along to change that. So far they have not.

I don’t share Rattner’s politics. He is an Obama man. He has raised money for Democrats, and his wife Maureen is a former national finance chairman of the Democratic Party. He says in the book that he is a Democrat because “the Republicans had favored the rich at a time of growing income inequality, abandoned fiscal responsibility, and held unfortunate positions on social issues such as a woman’s right to choose abortion.”


As “car czar,” Rattner was leader of Team Auto, a group of people, mostly from outside government, whom he recruited. His bosses were Larry Summers, President Obama’s chief economic adviser; and Tim Geithner, Obama’s treasury secretary. Rattner says his task “was not designed to further a particular economic theory,” but was strictly to save those parts of GM and Chrysler that could be made economically viable. He says that in his meeting with Obama on March 26, 2009, the president told him, “I want you to be tough and I want you to be commercial.”

Rattner focused on GM. When his work began, in January 2009, the company already had been given billions by the Bush administration and was burning through it at a bonfire rate. The economy was in panic. The Dow Jones Industrial Average was shriveling to 6,600, the figure it would touch in early March. GM was no longer a viable enterprise. On his judgment of GM, Rattner sounds much like the libertarian accountant Karen DeCoster, who had been saying on LewRockwell.com for several years that GM was an ongoing industrial disaster.

Rattner is no partisan of unions; he recalls his days at the New York Times when the Newspaper Guild defended featherbedding and incompetents. But he does respect unions, and he doesn’t blame them for what happened at GM. It’s management that runs a company. As a leveraged buyout guy, he had expertise in firing, hiring, and evaluating managers — and at GM he was appalled by what he saw and heard.

“From Wagoner on down,” he writes, “GM seemed to be living in a fantasy.” To executives, the problem was entirely the recession, not the GM product line or the way the company was run. GM management’s idea, Rattner says, “seemed to be to trim only what was easily achieved and absolutely necessary, and use taxpayer dollars to ride out the recession.”

But GM had been bleeding market share continually for 33 years. After rebates, its cars sold for thousands of dollars less than their European and Japanese counterparts — a market measure of the people’s esteem for the products. For management not to see that was a kind of sickness. GM had a culture of excuses, of institutionalized failure.

Rattner’s killing comment: “No one at GM thought like an owner.”

Team Auto’s idea was that government money would come only with big changes to make GM viable, including sacrifices by management, labor, bondholders and creditors, and with a major mind readjustment at the top. The place to begin was the CEO.


“After nearly a decade of experience as a private equity manager, I believe in a bedrock principle of that business: put money behind only a bankable management team. To my mind, no private equity firm on the planet would have backed Rick Wagoner or GM’s current board. . . . A CEO who leads a company into a state in which its only recourse is a government bailout shouldn’t be in his job.”

And that’s right. There was a hullabaloo in the press because the government fired Wagoner. So it had; but the government was the investor, and an investor may put conditions on his investment. He who pays the jukebox calls the tune.

The critics wanted GM to be put into bankruptcy — either Chapter 11, in which creditors are partly stiffed and the company survives, or Chapter 7, in which the creditors are given the carcass. The critics tended to gloss over the fact that each of these procedures involves the government, in the form of a federal bankruptcy court. Each involves the breaking of private contracts. The difference is that in a bankruptcy the decisions are made by a judge (appointed by other judges) rather than by a politician or direct political appointee. The other difference is that a bankruptcy judge does not have a pipeline to public money.

Those are important differences, but it’s still the government.

An ordinary Chapter 11 would have taken perhaps 18 months, and Rattner says there was no way GM could get debtor-in-possession financing from a bank for such a period. Wagoner had waited too long, and GM was too weak. Wagoner told Team Auto that once the company was in bankruptcy, he believed nobody would buy its cars, because customers would be worried about warranty coverage. Because of that, Wagoner said, he had done no preparation for a Chapter 11. The warranty problem was real, but this was also a self-excusing argument from a guy lining up for a bailout.

There was a different kind of Chapter 11 called a Section 363. It had never been tried on an industrial company the size of GM or Chrysler. A Section 363 allowed Team Auto, the new investor, to fashion a new GM from the assets of the old, leaving creditors to fight over the bones of Old GM. In that way, Team Auto brought out what Rattner calls “Shiny New GM” in a little more than a month.

Maybe that tells something about bankruptcy. Critics write as if bankruptcy were a fixed thing, like the Ten Commandments. But it is a human institution, and may be changed so that it works better or worse.

In the GM case, the stockholders would get nothing, which is what they would have gotten anyway. The bondholders would get a share of New GM worth about 35 cents on the dollar — an amount Rattner says was more than they would have received in a straight liquidation and therefore too much, because the government wasn’t intervening on account of them. The workers’ medical benefit trust would get a piece of the new company in exchange for releasing the new GM from all of its retiree medical obligations, which, unlike pensions, are not guaranteed by the Pension Benefit Guaranty Corp. The United Auto Workers would get a seat on the GM board. The Canadian government, also investing capital, would get a 12% stake.


And the US Treasury would get almost 61%. That was a problem. Contrary to all the muttering about Obama being a socialist, the president didn’t want the government to own GM, nor did any substantial constituency in the Democratic Party. And yet, when the decision came about taking shares of stock, the choice, according to Rattner, was this: “We can either get nothing for something or we can get something for something.”

Given that choice, taking the equity was the right decision. If you’re going to pay the money, you may as well get something for it. The Treasury did, and the result of that decision a year and a half later is that the taxpayers have been made one-third whole, and have a chance of coming out entirely whole.

And compared with things two years ago, that is an economic success.

Rattner wants to say there were no politics involved in these decisions, but he can’t quite do it. The choice to undertake the project in the first place was political. And always there was what Rattner calls “the Washington Post Test”: don’t do anything that you don’t want to see written about in the Washington Post.

But the bounds of the politically possible left a large space, and within this space Team Auto had discretion. Rattner writes, “No one in the Obama administration ever asked us to favor labor for political reasons.” Indeed, one of his chapters is called, “F**k the UAW,” a quotation he attributes to White House Chief of Staff Rahm Emanuel.

Organized labor took its lumps in the rescue — at GM it lost almost one-third of its North American jobs. But it came out ahead of where it would have been with no rescue, and it preserved its pensions and high wages for older workers by pushing pay far down for new workers.

With GM, Team Auto assumed from the start that there was a commercially viable core. This wasn’t obvious with Chrysler. It was smaller than GM, and Rattner says the economic case for saving it was much weaker. Obama’s economist, Austan Goolsbee, argued that Chrysler ought to be liquidated: the net loss of US jobs wouldn’t be so bad, because many of Chrysler’s customers would buy their cars from GM and Ford. Better to have two strong companies than three weaker ones. At one point Team Auto considered a plan to transfer Chrysler’s top brands — Dodge Ram, Jeep, and Chrysler minivans — to GM, and shut down the rest.


Team Auto was divided on whether to subsidize a deal with Fiat to save Chrysler. Rattner voted to do it, but he sounds as if he wished he hadn’t. The matter finally went to Obama, who said to do it. That is when Obama told Rattner to be “tough” and “commercial” — advice that Obama himself was not exactly following.

The main theme of Rattner’s book is that the GM deal — the big one — was done much as a private investor would, if the investor had decided already to do something. Team Auto’s aim was to make the company profitable. That meant a new CEO, a new chief financial officer, a new chairman, and several new members of the board. It meant shedding Pontiac, Saturn, Hummer, and Saab, and narrowing the product line to Chevrolet, Buick, Cadillac, and GMC Trucks. It meant shedding almost 30,000 jobs. And it meant cutting out 1,124 GM dealers.

The political interference after the deal was announced came from Congress, on behalf of dealers. Each member had imperiled dealers in his district. Sen. Kay Bailey Hutchison, Republican, held up a war-funding bill on behalf of Texas car dealers. House Majority Leader Steny Hoyer, Democrat of Maryland, went to bat for a dealer who had sold only three Chryslers in all of 2008 — and later told Rattner he’d let the companies keep too many dealers alive. Or so Rattner says. Rep. Gabrielle Giffords, Democrat of Arizona, went to bat for a Chrysler dealer, Rattner writes, and “repeated her talking points over and over.”

The intervention by Congress, Rattner writes, was “an enormous, pointless distraction for the two companies at a critical time. Its interference left me wondering what in the auto rescue Congress might like to micromanage next — choosing factory locations or deciding which executives and workers stayed and which had to go?”

That’s politics. The incentives facing elected officials are alien to the imperatives of business. Of course, members of Congress would not have had the ability to intervene so easily if there had been no intervention by the executive branch.

Which gets back to the original question. Obama is an elected official. Why did he let this be done in a mostly businesslike way?

Rattner doesn’t ask the question, but the answer that floats to the surface is that Obama is responsible for the whole country, not just congressional districts with GM and Chrysler assembly plants.

But it was more than that. Probably the strongest reason why Obama didn’t politicize the bailouts to the extent that libertarians feared — and a reason not in Rattner’s book — is that the American people, Republicans and Democrats included, hated the bailout. The polls were clear about that. The first demos of the Tea Party showed it. So did the rise in the reputation of Ford, which became America’s most popular car company because it never took a bailout. There simply was little political gain for Obama in creating Government Motors, and much political hazard.

So he didn’t do it. By the standards of capitalism — the standards of the market — the bailouts have turned out well. The new General Motors and the new Chrysler are viable companies. Libertarians have to admit that.

When they do so, they should also point out that much of this was due to luck, and not anchored in the nature of things. This time it worked out; but once you allow politicians to run about fixing broken industries with public money, all sorts of bad incentives are created. There is the bad incentive to be a Rick Wagoner, and wait to be saved. There is the bad incentive for the bailers-out not to be tough and commercial, but to be political instead, in order to get votes. In the spring of 2009 Obama wasn’t over a barrel for the electoral votes of Michigan and Ohio, but a future president might be. Rattner alludes to these problems, but his book is about the specific case, not a general case.

A car company rescue also gives the politician a chance to be tinkerer-in-chief. Obama isn’t a car guy, but I can think of politicians who would use the opportunity to push the Chevy Volt, or to go beyond it. Rattner is blunt about the Volt: it costs $40,000 to make, and it competes with cars that cost half as much. He writes, “There is no scenario under which the Volt, estimable as it may be, will make any material contribution to GM’s fortunes for many years.”

In his epilogue, Rattner compares the auto rescues, which resulted in two viable companies, with Obama’s “stimulus” package, which cost 10 times as much and spent the money “without anything like the rigor that private equity or venture capital investors apply.” He says that “we will be disappointed by how little lasting benefit we got for those dollars.”

That is so. It is also a low standard for measuring government spending.

Libertarians still have strong reasons to oppose corporate rescues generally. But each case has its own facts, and sometimes a bad idea delivers good results.

I’m glad it came out well — I guess.


Editor's Note: Review of "Overhaul: An Insider's Account of the Obama Administration's Emergency Rescue of the Auto Industry," by Steven Rattner. Houghton Mifflin Harcourt, 2010, 320 pages.





Missing the Economic Boat

 | 

The Company Men is a highly relevant film about the financial meltdown of 2008, when hundreds of thousands of workers suddenly lost their jobs to downsizing and company closings. As a work of entertainment, it's overly didactic and preachy, but as a social commentary it provides many insights and offers great fodder for conversation and debate about the economy and how to fix it.

The film begins with a montage of the homes and toys of the super-rich. The camera pans through wealthy neighborhoods of elegant mansions with their gleaming kitchens, high-priced antiques, luxury cars, fancy swimming pools, and garages full of tony sports equipment. These are people who know how to enjoy their incomes. In the background we hear a voiceover of news reports about the financial meltdown, when, for example, 53,000 Citigroup workers lost their jobs in a single day.

The film highlights three mid- to high-level employees of a fictional conglomerate who lose their jobs in the blink of an eye. Bobby Walker (Ben Affleck) is a local boy who worked his way through college to earn an MBA and a well-paying job as a sales representative. Gene McClary (Tommy Lee Jones) is the head of a division that was once his own shipbuilding company, now owned by the conglomerate. Phil Woodward (Chris Cooper) is a 30-year veteran of the company who started as a welder in the factory and worked his way up to management.

All three lose their jobs to downsizing. One minute they're high-fiving each other about their golf scores; the next minute they're packing their boxes and heading out the door with twelve weeks' severance pay and the address of the Outplacement Center. There they sit in cubicles much like the ones where they worked in the company, only now what they're selling is themselves as they hunt for jobs that are increasingly scarce. As the weeks wear on, they become more discouraged, resentful, and scared. As Bobby complains, "I'm 37. How can I compete against MBAs fresh out of college who are willing to work for half what I was earning?"

The men's wives also become discouraged and resentful. At first, they all appear to be snippy housewives whose lives revolve around shopping and decorating, but each reacts differently to the news of her husband's layoff. One gets busy tightening the family budget and going back to work as a nurse. Another refuses to let anyone know, insisting that her husband leave the house every morning with his briefcase and not return until evening. A third marriage breaks up. In many ways, losing a job is as stressful as experiencing a death, and the reactions are as profound and as varied. Strong marriages become stronger; weak ones fall apart.

Writer-director John Wells has a definite point of view, and it is decidedly not in favor of people who make money by buying and selling. Borrowing a page from Marx's attitude toward money, he sees salesmen as useless middlemen who produce nothing and use nothing; they just handle the money. By contrast, Wells presents Bobby's blue-collar brother-in-law Jack (Kevin Costner) as the film’s true hero. Even Jack's modest home, where everyone drinks beer instead of wine and plays football in the backyard, is presented as being more fun than Bobby's fancy home.

Jack is a homebuilder by trade, and he hires people to help him. Unlike the corporate CEOs, who earn their bonuses no matter how poorly the company has performed and then fire employees to reduce costs, Jack actually underbids a job that will barely break even for him, just so he can keep his workers employed. What a swell guy! Wouldn't it be nice if everyone could work permanently for nothing? The film’s economic assumptions are hardly plausible, especially its Pollyanna ending (which I won't reveal here). And it conveniently overlooks the fact that the construction industry has taken a hard hit in the recession as well.

As I watched this film I thought about the plight of the people with whom it is most concerned, the upper-middle class who in 2008 were working hard and following a path laid out for them by the preceding generation. They are people caught in the middle — they aren't important enough to have golden parachutes, like the CEOs, but they aren't poor enough to be able to maintain their lifestyles through unemployment checks, Section 8 housing, and government welfare. They don't qualify for Medicaid or food stamps, but they can't afford to pay their mortgages, make their car payments, or keep up with tuition and insurance when they lose their jobs. People who build their dreams on debt and long-term contracts just can't cut back when times get hard. They have no way to weather a storm.

Another thought that came to mind is that private businesses have borne the brunt of this recession. They are the ones making the difficult decisions to cut back or close down. A select few corporations have been bailed out by the government, but at the expense of others that are forced to contribute to their own demise through onerous taxation that funds the privileges awarded to their competitors. The government’s intervention has created an artificial imbalance in the marketplace. It's time for government employees and union workers to start feeling the pinch as well.

The Company Men makes some excellent points about how hard it is to deal with long-term unemployment. Affleck, Jones, and Cooper play their parts with a dignity, pathos, and poignancy befitting their characters and typical of their own talented careers. But Wells is woefully underqualified to offer solutions to the economic problems we continue to face.

 


Editor's Note: Review of "The Company Men," directed by John Wells. Weinstein Productions, 2010, 104 minutes.





Breaking Out of the Box

 | 

It is a priori unlikely that a movie would enlighten viewers on subjects as different as autism and animal welfare, but there is such a film — a fine bioflick that aired on HBO last year and is now available on DVD. It’s the moving story of Professor Temple Grandin, its eponymous heroine.

Even today, autism is not well understood. It is a neurodevelopmental disorder that usually manifests itself in the first two to three years of a child’s life. It is typically characterized by severe difficulty in communication and social interaction, and by limited, repetitive behavior (such as endlessly eating the same kind of food or watching the same TV show). While the cause is still unknown, it appears to be a genetic defect afflicting about 1 or 2 children per thousand. Most autistic children are never able to live independently as adults, but some — often called “high functioning” — are.

Temple Grandin is arguably the most famous high-functioning autistic person in the world. She was born in 1947, and was diagnosed with autism when she was three. With the help of speech therapists, she was able to learn to talk, and with the help of her extremely high intelligence, she went to an elite boarding school, where a gifted teacher mentored her.

She went on to get a bachelor’s degree in psychology from Franklin Pierce College in New Hampshire, a master’s degree from Arizona State University, and a doctoral degree from the University of Illinois in the field of animal science. She is now a professor at Colorado State University.

The focus of her work has been on the means by which animals, especially cattle, perceive and communicate, through the sounds they make and the way they move. Her autism seems to give her a distinct advantage here, because (as she explains it) she thinks the way cattle do, visually and concretely, rather than verbally and abstractly, as ordinary people do.

Her career has focused on designing cattle lots, storage pens, and slaughterhouses that are far more humane than those of times past. Over half the slaughterhouses in America now incorporate her designs. The movie conveys her work beautifully. In one scene, we see her figuring out what is spooking cattle as they move through a passageway: she gets down on all fours and walks it herself, trying to capture visually exactly what is frightening them.

The movie also effectively conveys her view of our obligations towards animals, one that could be summarized as “Respect what you eat!” In her words, “I think that using animals for food is an ethical thing to do, but we’ve got to do it right. We’ve got to give those animals a decent life and we’ve got to give them a painless death. We owe the animals respect!”

The movie also deftly expresses certain aspects of Grandin’s autistic personality, such as her repetitive eating habits (she loves Jell-O) and her indifference to movies about emotional relationships (such as love!). In one charming scene, she is channel surfing and happens on the famous kiss-on-the-beach scene from the movie From Here to Eternity. She grimaces and quickly moves on to an action flick.

Temple Grandin is superbly done. Mick Jackson’s direction is sure and steady. He elicits perfectly pitched performances from the actors, performances that are emotionally true without being sentimental. And despite the serious nature of the story, the film is infused with humor. Julia Ormond is marvelous as Grandin’s mother Eustacia, evincing a combination of vulnerability and resilient strength. Catherine O’Hara is solid as Grandin’s Aunt Ann, whose ranch Grandin often visited. David Strathairn gives another fine supporting performance as Professor Carlock, a key mentor who helped develop Grandin’s understanding of science.

Especially wonderful is Claire Danes as Grandin. She shows us in sometimes painful detail what autism entails, and the suffering it brings, but she also shows us Grandin’s unique genius. Danes apparently studied her subject intensely, and it shows in her performance. She well deserved her Emmy for Outstanding Lead Actress.

The film was a clear success with both the critics and the audience. It was nominated for 15 Emmys, winning seven; besides Danes’ award for Outstanding Lead Actress, they included an Emmy for Outstanding Made for Television Movie.

This is a treat that should not be missed.


Editor's Note: Review of "Temple Grandin," directed by Mick Jackson. HBO Films, 2010, 103 minutes.



