Liberty Does the LNC!


Starting Wednesday, May 2, Liberty will be covering the Libertarian National Convention live from the floor of the Red Rock Resort in Las Vegas.

Can Gary Johnson take the nomination, or will the LP throw up another Badnarikian curveball? Who will emerge as VP? What clearly insane person will claim his or her 15 minutes of fame?

Join us here for daily (or sooner) reports on happenings at the LNC, and follow us on Twitter at @libertyunbound for up-to-the-minute bulletins on the stories and myriad oddities of this premier gathering of libertarian luminaries.





Lies, Damn Lies and the Bureau of Labor Statistics


Wise men going back at least as far as the great George Orwell have pointed out that statists have little but contempt for objective facts, truth, or reality. In the present day, this contempt is proved most clearly by the parade of false reports and statistical manipulations issued by the Department of Labor. Headed by former congresswoman and current troglodyte Hilda L. Solis, the Labor Department has become — primarily through its Bureau of Labor Statistics (BLS) and Employment and Training Administration (ETA) — the real-world version of the Ministry of Truth. Every report is revised; every revision changes numbers to favor the state.

And reality goes down the memory hole.

Since the first of March alone, the Labor Department has revised three major reports on U.S. labor productivity, job creation and unemployment. There are several official explanations for these revisions; the main one is Solis’ intellectually dishonest commitment to making employment data “seasonally adjusted.”

In theory, seasonal adjustment is supposed to smooth statistical bumps like temporary retail hiring around the Christmas or summer holidays and make it easier to chart trends. Fair enough. But, in practice, Solis’ staff of statistical manipulators uses perpetual revision to create the illusion that newer numbers are always an improvement over older ones.

When Labor released these cooked numbers, the mainstream financial press parroted the line that the number was the “lowest in four years.”

I’ve done enough statistical analysis to know that even (or especially) the most honest analyst will occasionally revise results. And, to be sure, Labor isn’t the only federal bureaucracy that does so. But few other agencies revise so systematically . . . or with such partisan willfulness.

Here’s an example of the revision game from the ETA’s most recent (April 19) report on new unemployment insurance claims:

In the week ending April 14, the advance figure for seasonally adjusted initial claims was 386,000, a decrease of 2,000 from the previous week's revised figure of 388,000. The 4-week moving average was 374,750, an increase of 5,500 from the previous week's revised average of 369,250.

See what Solis’ hacks did there? They claimed that the number of initial claims for unemployment benefits was a “decrease” from the previous week’s “revised” number. What they don’t say is that the revision to the previous week’s number was to increase it from the 380,000 they reported at the time to the new number of 388,000. So, they raised the previous week’s number in order to claim the new number was a decrease.
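To make the trick concrete, here is a minimal arithmetic sketch using the figures quoted above (the variable names and framing are mine, not the ETA's): measured against the prior week as originally reported, claims rose; measured against the revised figure, they "fell."

```python
# Figures from the April 19 ETA report quoted above.
prior_week_as_first_reported = 380_000  # initial claims for the prior week, as originally announced
prior_week_revised = 388_000            # the same week after revision
new_week_advance = 386_000              # advance figure for the latest week

print(new_week_advance - prior_week_as_first_reported)  # 6000: claims rose against the original figure
print(new_week_advance - prior_week_revised)            # -2000: the reported "decrease" against the revised figure
```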

Keeping the four-week moving average for unemployment claims below 375,000 is important because that’s a common inflection point between a rising and falling overall unemployment number. Given the accounting trickery in which Labor engages, a number so close to the inflection point has to be taken with a large grain of salt. The real number may be higher — and overall unemployment may be rising.

Solis’ subordinates play the same sorts of games with the so-called “jobs created” numbers. According to the BLS, “total nonfarm payroll employment” rose by 120,000 jobs in March. This was perceived as a bad number — significantly lower than the jobs created in the previous few months. If then-current numbers held, the trend for the first few months of 2012 was bad, indeed: January, 284,000; February, 227,000; March 122,000.

Focusing only on people who’ve recently been laid off, are actively looking for work, or are applying for unemployment benefits hides the backlog of unemployed people who’ve stopped looking for jobs.

But the Labor staff had a plan for making the downward line on a graph of those numbers less steep. They revised the January and February numbers, moving some of the January jobs into February. The “revised” decline: January, 275,000; February, 240,000; March 122,000.

The net effect isn’t much different; but the optics are better. The graph looks more like a plane gliding in for a landing than one crashing in flames.

Here’s an even more extreme example, from the April 19 ETA report I quoted above:

The advance number for seasonally adjusted insured unemployment during the week ending April 7 was 3,297,000, an increase of 26,000 from the preceding week’s revised level of 3,271,000. The 4-week moving average was 3,317,750, a decrease of 21,500 from the preceding week’s revised average of 3,339,250.

So, the current number of Americans receiving unemployment benefits was up. But the revised four-week trend was down. How could that be? Solis and crew used revisions to inflate the numbers at the back of the four-week chart.
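Here is a hypothetical sketch of that mechanism. The weekly figures below are invented for illustration (only the 26,000 week-over-week increase mirrors the report): revising the older weeks in the window upward raises the previous four-week average, so the new average can be reported as a decline even while the latest weekly number rises.

```python
# Hypothetical insured-unemployment figures (invented for illustration, not actual BLS data).
prior_window = [3_360_000, 3_350_000, 3_336_000, 3_271_000]  # weeks 1-4; weeks 1-3 shown at revised (raised) levels
latest_week = 3_297_000                                      # week 5, higher than week 4

prior_average = sum(prior_window) / 4                    # 3,329,250
new_average = sum(prior_window[1:] + [latest_week]) / 4  # 3,313,500

print(latest_week - prior_window[-1])   # 26000: the current weekly number rose
print(new_average - prior_average)      # -15750.0: yet the four-week average "fell"
```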

The agency had used revisions to similar effect a few weeks earlier, on March 29, when it trumpeted a “decline” in initial unemployment insurance claims. It announced:

Initial jobless claims fell 5,000 in the week ended March 24 to 359,000, the lowest since April 2008 . . .

Later, in the fine print, the agency admitted that it had revised the previous week’s figure to 364,000 from an initially reported 348,000. So, if not for the revision, the newer number would have increased by more than 10,000 new Americans on the dole.

Also in small type: the Department announced that it had revised weekly data on unemployment claims (and other key indicators) going back to 2007 — in part, to reflect seasonal adjustments. This caused all numbers in 2012 to rise about 4%, and it cast the “since April 2008” part of the big announcement into doubt, too.

The guys from Enron went to jail or committed hara kiri because of tricks like this. Instead, when Labor released these cooked numbers, the mainstream financial press — including Bloomberg, Reuters, and the Associated Press — parroted the line that the number was the “lowest in four years.”

The numbers I’ve been considering so far aren’t the government’s formal unemployment numbers. Economists rightly consider the weekly unemployment insurance reports “additional data” and of less reliability than the BLS’ more formal unemployment surveys.

As you might already know, the BLS tracks six different measures of unemployment (a rough numerical sketch of how they nest follows the list below). These six statistics are:

  • U1: the percentage of the U.S. labor force unemployed 15 weeks or longer;
  • U2: the percentage of the labor force composed of people who lost jobs or completed temporary work;
  • U3: the “official unemployment rate” — people without jobs who have actively looked for work within the past four weeks;
  • U4: U3 plus “discouraged workers,” or those who have stopped looking for work because economic conditions make them believe that no work is available for them;
  • U5: U4 plus other “marginally attached workers,” or “loosely attached workers,” or those who “would like” and are able to work, but have not looked for work recently; and
  • U6: U5 plus part-time workers who want to work full time, but cannot due to economic reasons (underemployment).
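For readers who want that nesting made concrete, here is a rough numerical sketch based on my reading of the published BLS definitions; the head counts (in millions) are invented round numbers, not survey data:

```python
# Invented head counts, in millions, chosen only to illustrate how the U-measures nest.
labor_force = 154.0
unemployed = 12.7          # jobless and actively looked for work in the past 4 weeks (U3 numerator)
discouraged = 0.9          # stopped looking because they believe no work is available
other_marginal = 1.5       # would like to work and are available, but haven't looked recently
part_time_economic = 7.7   # working part time only because full-time work isn't available

u3 = unemployed / labor_force
u4 = (unemployed + discouraged) / (labor_force + discouraged)
u5 = (unemployed + discouraged + other_marginal) / (labor_force + discouraged + other_marginal)
u6 = (unemployed + discouraged + other_marginal + part_time_economic) / (labor_force + discouraged + other_marginal)

for name, rate in [("U3", u3), ("U4", u4), ("U5", u5), ("U6", u6)]:
    print(f"{name}: {rate:.1%}")  # each broader measure comes out higher than the last
```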

These numbers give economists several different perspectives on the issue. Since all of the numbers are estimates, the combined perspectives are meant to offer a “three-dimensional” view of unemployment, more accurate than any single number. But even these more formal surveys are subject to manipulation and revision. During the Clinton administration, the BLS revised the formal stats — changing the “official” rate to U3 from a predecessor version of U5, which is always a higher number.

There were several problems with this cynical move.

The U3 statistic, with a methodology closest to the “additional data” numbers that Solis’ hacks manipulate these days, is the least reliable of the formal measures.

But there’s a more philosophical problem with the Clinton-era revision: focusing only on people who’ve recently been laid off, are actively looking for work, or are applying for unemployment benefits tends to downplay the number of able-bodied adults who are out of work. It hides the backlog of unemployed people who’ve stopped looking for jobs. A nearly-permanent underclass that includes the nation’s most incorrigibly unproductive people simply doesn’t appear in the U3 number.

Older unemployment numbers are revised upward; new jobs numbers are revised downward. Everything’s always getting better in this workers’ paradise!

So, Solis’ BLS can boast (as it recently did) that “the unemployment rate” fell to 8.2% in March 2012 from 9.1% in August 2011. But that headline ignores the fact that U5 unemployment was about 2 percentage points higher during that period — and U6 hovered above 15%.

This may be what statists want, though: to hide the economic effects . . . and very presence . . . of the hardcore unemployed. Who are, of course, the most enthusiastic supporters of big-government social welfare programs.

Throughout, the operatives continue to use seasonal adjustment to justify their manipulation of employment numbers. Here’s one example of a weasel-worded footnote explaining the spin:

Data in this release reflect the annual benchmark revision of BLS Current Employment Statistics program data on nonfarm employee hours, and revised seasonal adjustment of those data. . . . Quarterly and annual measures . . . for all sectors were revised back to 2007 to incorporate the annual benchmark adjustment and updated information on seasonal trends.

So much jargon, so little truth.

Solis’ hacks cling to seasonal adjustment because it serves as a blank check, an open-ended excuse for revising every employment report to show an illustrious victory for our valiant leaders. Under this administration, essentially every Labor Department employment survey or report that’s been revised has been so in a way that makes newer numbers look like improvements. Older unemployment numbers are revised upward; new jobs numbers are revised downward. So everything’s always getting better in this workers’ paradise!

Or, as one internet commenter noted: “We’ve got the Christmas season, summer season, and — most of all — we’ve got election season.”

Many politicians talk about abolishing the Department of Education; that’s become a kind of shorthand for commitment to limited government. But it might do more good to abolish the Department of Labor, whose truth-twisting under Hilda Solis has become so blatant.






Lost Lessons of Climate Science


Prior to the 1990s, most of us had never heard of climate science. Yet in a few short years, it was catapulted from obscurity to global prominence. As with many scientific disciplines today, climate science relies on fear as the basis for support. But it distinguishes itself from other branches in its use of unscientific means to achieve its largely political ends, and of easy acclaim to reward its unscientific promoters. Such an arrangement has, to the dismay of legitimate climate scientists, fostered an unbridled arrogance that permits sketchy, surrogate temperature data to revise the past and sketchy, surrogate Nobel Prize winners to shape the future. The "new" history of the Medieval Warm Period (MWP) illustrates the former, and has led to the wanton green agenda of Barack Obama, which illustrates the latter.

The MWP occurred between AD 800 and AD 1300. According to an old college geology book, it was a climatically gentler time, 2–4° C warmer than today. Europeans prowled over parts of the northern world that are now completely inhospitable. The Viking Age roughly corresponded to this period, and in it, Viking explorers flourished. Erik the Red founded the first Norse colonies in Greenland, where Viking settlers enjoyed a sedentary lifestyle sustained by agriculture, livestock, fishing, and trade with Scandinavia. Erik's son, Leif, was a Norse explorer who reached America by a northerly route (about AD 1000) that would have been unavailable to other explorers only a few hundred years later.

From the beginning of the MWP, northwestern Europe was subjected to brutal and unrelenting Viking aggression. This ended when England was finally conquered in 1016 and Knut the Great became king — the first to rule successfully over a united and peaceful England. Knut's greatness was such that his courtiers believed he could control the tides. To demonstrate their folly, he sat in a throne placed upon the shore and commanded the oncoming waves to halt. As the water rushed over his feet, splashing his royal garb, he stood and spoke, "Let all men know how empty and worthless is the power of kings, for there is none worthy of the name, but He whom heaven, earth, and sea obey by eternal laws." A wise and humble king, Knut understood that man could not control nature.

The warm period was warmer than the cold period. No wonder NOAA scientists make the big bucks.

Then came the Intergovernmental Panel on Climate Change (IPCC). To promote its anthropogenic global warming (AGW) hypothesis, the IPCC abolished the MWP, in fine Orwellian fashion. The first IPCC climate assessment report (1992) contained a temperature history chart designed to illustrate the threat posed by recent warming. But this warming was dwarfed by MWP temperatures — on the same chart. Crack IPCC climate scientists soon recognized the "doublethink." The MWP warming rendered the recent warming neither unprecedented nor anthropogenic. As "Climategate" emails would later reveal, Jonathan Overpeck, a leading IPCC author, sought to "deal a mortal blow" to the MWP portrayed in the 1992 report. The IPCC notification process began in earnest. For example, US climate researcher David Deming was told, in a now famous 1995 email, that "we must get rid of the Medieval Warm Period." Ultimately, the mortal blow was delivered by the infamous Mann Hockey Stick chart. Based on cherry-picked tree-ring data (a surrogate for instrumental temperature data), the hockey stick curve flattened away the entire MWP and became the centerpiece of the 2001 IPCC assessment report.

Already a staunch shill of the AGW movement, the mainstream media announced the new history with alacrity. To establish the revision permanently, objective and trustworthy websites were recruited to "the cause," provided they abandoned, well, their objectivity and trustworthiness. For example, Wikipedia now tells us that the MWP was a time when "temperatures were probably between 0.1 °C and 0.2 °C below the 1961 to 1990 mean and significantly below the level shown by instrumental data after 1980." Even more disgraceful is the online version of Encyclopedia Britannica, which compliantly describes the MWP as a "brief climatic interval that is hypothesized to have occurred." That is, the MWP is but a theory. As another example, our taxpayer-funded National Oceanic and Atmospheric Administration (NOAA) states that the MWP was "warmer over the Northern Hemisphere than during the subsequent Little Ice Age." The warm period was warmer than the cold period. No wonder NOAA scientists make the big bucks.

Such is history in the world of political climate science. But there is a large body of uncensored scientific evidence confirming the existence and magnitude of the MWP; according to the Center for the Study of Carbon Dioxide and Global Change, it is published in peer-reviewed scientific journals by 1068 individual scientists from 615 research institutions in 45 different countries. The Medieval Warm Period existed, it was global and, with no help from industrialized humans, it was warmer than today. By suppressing this legitimate scientific information, climate alarmists deceitfully pronounce recent warming to be unprecedented and therefore worthy of onerous taxes, intrusive regulations, and wealth-stifling decarbonization.

Recent warming is worthy of objective scientific deliberation, but scientific inquiry corrupted by political ideology and rewritten climate history has led to little more than the foolish claims and emotional alarms of scientific dilettantes. No good can come from hastily spending staggering sums of money to avert a warming trend that is certainly exaggerated by manipulated temperature data, has an anthropogenic contribution inflated by unreliable climate models, and, ultimately, could be driven predominantly by natural climate variability. As MIT's Alfred P. Sloan Professor of Meteorology Richard Lindzen has said, "The fact that the developed world went into hysterics over changes in global mean temperature anomaly of a few tenths of a degree will astound future generations."

Much of the hysteria began with Nobel Prize winner Al Gore, who used the specious, MWP-less hockey stick graph to herald catastrophic manmade warming. (That the Nobel Prize could be awarded for "disseminating greater knowledge about man-made climate change" based on manmade climate data is itself a catastrophe.) Mr. Gore's arrogant assertions that mankind could cause such damage provoked an even more arrogant Nobel Prize winner into asserting his power to reverse it. In his 2008 nomination victory speech, Barack Obama proclaimed that he was "absolutely certain that generations from now, we will be able to look back and tell our children that this was the moment . . . when the rise of the oceans began to slow and our planet began to heal . . ." Evidently, it is Barack Obama who will astound future generations.

This is much more than grandiose campaign gibberish. Mr. Obama prefaced the statement by asserting, "I face this challenge with profound humility, and knowledge of my own limitations." It is with breathtaking narcissism, not profound humility, that one claims he can reverse a planet-wide catastrophe. This breathtaking narcissism has been the hallmark of Obama's political career. In his 2004 Senate campaign, then-candidate Obama doodled during an interview for a fawning article in The Atlantic; the sketch he drew was a self-portrait. Even if he knew that he would win the election, who (except a pretentious twit possessed to write a 405-page autobiography, not four years after graduating from law school, at the sagacious age of 33) draws pictures of himself? Apparently he also knew that he would go on to become president and, during a fawning "60 Minutes" interview, be able to announce that, after only two years in office, his achievements placed him among the greatest presidents in American history.

Breathtaking narcissism has been the hallmark of Obama's political career.

Alas, planet healing was not among his feats. For all his shamelessly self-aggrandizing patter and all his boneheaded green largesse ($100 billion from the stimulus program alone), President Obama has been unable to pick a single winner. The list of bankrupt and failing green energy companies continues to grow. The green jobs created and green energy produced are paltry at best and sustainable only through feckless subsidies, grants, loans, exemptions, and rebates. After more than three years in office, his planet healing achievement is a bombastic zero.

Flattery from his courtiers no doubt led Obama to believe that green technology would reward him with a rejuvenated economy and a soothed planet, which — like his Nobel Prize — would be cheaply attained and would thrust him into the ether of greatness. The din of cheering deafened him to the sound of tax dollars flushing into the rising oceans. He has learned little from his failures, even less than his European counterparts have learned. Admitting some of their failures, they are drastically scaling back green energy programs. Meanwhile, undaunted by the forces of nature, the laws of economics, and the limitations of green technology, the audacious Mr. Obama plots to buy his dream with even more government spending — an "investment" whose rate of return, he seems to believe, can be enhanced simply by vainglorious rhetoric uttered from his throne. Facing record-breaking debt and deficits, and without a single green success to inspire further hopes, Obama is trying even harder to secure his lofty place in the annals of planet-saving history, possibly in the void left by the purge of the Medieval Warm Period.

As the 2012 election approaches, President Obama will certainly encounter many more fawning interviewers. He should consider a sketch of himself, standing on the shore, holding back the tide, while picking the pocket of the American taxpayer.






Rich, White, Mormon . . .


In a recent piece, I suggested that Romney would likely win the Republican nomination, and that since he has no history of scandals and is obviously intelligent, the Obama campaign (which cannot run on Obama’s risible record in office) would attack him in the only way it can: for being rich, white, and Mormon. That is, it will try to arouse envy of the rich among the poor, hatred of whites among blacks and Latinos, and anti-Mormon prejudice among the populace at large (and especially among evangelicals).

So I am not surprised that Obama is now running this sort of campaign. I am, however, surprised — and pleasantly so — at how quickly the Romney campaign and the countermedia are hitting back.

I have already described in some detail the mainstream media’s concerted attack on Romney’s faith. Two illustrations have occurred just recently. MSNBC host Martin Bashir read on air from the Book of Mormon, somehow linking it to his condemnation of Romney to hell for allegedly lying about Obama’s stellar record of job creation (a record that, by the way, should condemn Obama to hell). And Montana Gov. Brian Schweitzer couldn’t resist reminding us all that Romney descends from a polygamist great-grandfather.

The countermedia hit back, pointing out that Obama’s grandfather and even his father were polygamists. So the Schweitzer attack quickly fizzled.

Then there were the attacks on Romney’s wealth. Obama felt obliged to remind us that he was “not born with a silver spoon in his mouth,” and his campaign advisor and pet harpy Hilary Rosen opined that Ann Romney had never worked a day in her life.

The pushback was rapid. If not a silver spoon, Obama clearly had some feet in his mouth. The blogs had a field day pointing out that his white grandmother was wealthy; she paid for him to go to the toniest prep school in Hawaii, then sent him to Occidental, a high-priced private college. The sharp-tongued Ann Coulter caused a stir when she observed that Obama was a clear beneficiary of affirmative action, which is certainly a case of being privileged.

As to Rosen’s slur, the backlash was quick and strong enough to make Obama disavow Rosen’s views and make her apologize rather quickly. Ann Romney had, as the countermedia reminded us, raised five boys, which every parent will attest must have involved endless work. The countermedia also showed that Mitt and Ann Romney started out with essentially nothing, and built their fortune together.

Of course, Rosen was just pushing a line, one found in a strain of feminism that goes back to Simone de Beauvoir. It’s the idea that women who choose to be stay-at-home mothers are being turncoats to The Cause. This may play well with Women’s Studies profs, but it clearly doesn’t resonate with most women.

Finally, it became all too clear how Obama aims to fire up his base among minority voters: look for any racial incident to exploit. One appeared, providentially, in the killing of Trayvon Martin. Knowing the game plan, the mainstream media quickly hyped the story: big, armed white man shoots innocent black child, and gets away with it in the Deep South. Obama quickly pushed racial grievance exploiters like Al Sharpton and Jesse Jackson aside (not an easy thing to do!) to get in front of the issue. “If I had a son, he’d look like Trayvon,” the prez solemnly intoned.

Obama’s plan was as obnoxious as it was obvious: find and dramatize a racial incident, get minority supporters to rally behind him, then turn to the rest of the country and say, “See? You need me to keep racial tensions under control! My opponent can’t do that, because he’s white!”

However, the countermedia rapidly discovered facts that shook the Narrative. Trayvon’s shooter, George Zimmerman, was half-Hispanic; he was smaller than Martin; Martin had had trouble with the authorities; Zimmerman had wounds on the back of his head, consistent with his story of self-defense; and so on.

The president and his henchmen in the MSM will continue to assail Romney along these obvious tracks. But the countermedia seem to be pushing back, and Romney seems to be catching on to the vicious assault headed his way. He has had only a tiny appetizer — the main course awaits him.

In the meantime: entitlement programs keep heading towards collapse; war clouds appear once again in the Middle East; we remain totally dependent on volatile oil supplies (Obama having canceled the Keystone pipeline and prevented oil and gas exploitation on public lands); and the pathetically tepid “recovery” seems headed for a stall.

A country gets the government it deserves. And it gets the economy it deserves as well.






Wage War on Dependence


Recently, I heard a school administrator promoting the importance of making all of the parents at our schools aware of the existence of government programs for the homeless. “Lots of people don’t even know that they qualify for these programs,” she enthused. “If they are living with family members and not paying rent, they can qualify as homeless!”

What would that do for them, I wondered?

According to the website of the Oregon Homelessness Prevention and Rapid Re-housing Program (HPRP), its “re-housing program” can provide these kinds of services to the "homeless":

Re-housing programs work with people who are already homeless to help them quickly move into rental housing. Re-housing programs can provide housing location, financial assistance including security deposits, rent assistance and payment of arrearages and case management. Both homeless prevention and rapid re-housing programs coordinate with other community resources to ensure that participants are linked to ongoing assistance, such as housing vouchers, intensive case management, or assertive community treatment.

So if a family (in this community often a new immigrant family) is managing their finances by living with relatives until they can get on their feet, government agencies can arrange to give them financial assistance in the form of security deposits to rent a place they otherwise couldn't afford to rent, and participate in a program of government “rent assistance” or “housing vouchers.” The person recommending this seems to think it would be a good thing to move someone into a situation where he was dependent on government for a place to live. Implied, but not stated, is the assumption that it is kind of stupid to prefer to take care of yourself when you can get something for free instead.

Connected to that assumption is the proposition that any well-meaning person, such as a teacher or school administrator, has an obligation to convince stiff-necked individuals that their pride is hurting their children, and they really should accept the government’s largesse. This assumes, however, that one’s quality of life is measured simply by the dollar amount of the things one receives, without regard to how one obtained them.

Implied, but not stated, is the assumption that it is kind of stupid to prefer to take care of yourself when you can get something for free instead.

Not so many decades ago it was commonly understood that there was something demeaning about being on “the dole.” People did not want to accept charity if they could make their own way in life. There were the pejorative terms “kept woman” and worse still, “kept man,” meaning a person who did not have a job but was supported by a sex partner. Many of the social programs we have today were sold with difficulty to an American public for whom public assistance and dependency carried a stigma.

According to Andrew Biggs of the American Enterprise Institute, Social Security was presented not as a needs-based program of charity in which today’s workers pay for the benefits of today’s elderly, but as “a system of social insurance under which workers (and their employers) contribute a part of their earnings in order to provide protection for themselves and their families if certain events occur. As a result of this 'earned benefit' status, collection of Social Security benefits has never carried the stigma associated with food stamps, Supplemental Security Income, or other welfare programs.”

That has been the pattern with a number of “entitlement” programs. Instead of being needs-based charities, which show one’s dependence, programs such as Medicare and Social Security are made for everyone. Therefore there is no stigma and everyone should be happy to receive benefits from the government. Of course, the effect is that these programs have ballooned in size and are currently unsustainable. (Odd that sustainable houses and buildings are all the rage, but sustainable social programs, not so much.) We have a huge financial burden looming ahead of us as these entitlement programs become ever more costly as more of us baby-boomers retire and expect to collect benefits. Because there is no stigma associated with these programs, we all intend to capitalize on them.

Here lies the problem — and also the solution to the problem. Instead of a War on Poverty, we should have a War on Dependence. All our social programs should have as their goal helping people become independent of government assistance. They would still require considerable effort and would still employ many social workers for years to come, but the war could be won! We could get to the point where everyone had a way to support himself.

How would that look different from today’s social programs?

For one thing, we’d begin by applauding all those who already take care of themselves. We would hold them up and give them recognition. We would put them on talk shows and news programs to tell their story of how they manage in life without government assistance. They would become our role models. We would applaud and appreciate the fact that they do not need to collect on the various social programs to which they are “entitled.”

For example, people over 65 who were working at a job or who could afford their own medical insurance would be honored for their ability to be independent of Medicare. Right now of course, you virtually have to take it, because no one will insure you at age 65 unless you collect all the Medicare benefits you can. So right now we are forcing dependency — but the War on Dependence would change that.

We should encourage everyone to avoid having to depend on Social Security as well. Anyone over 65 who doesn’t need to collect “benefits” from the payroll tax in order to survive in old age would be a hero in everyone’s eyes. If people keep working, that would be super, because they can thereby remain independent. If people save enough to retire with dignity, that would be even better, because they would be permanently independent. What’s more, their children would be well on their way to permanent financial independence, when they inherited the principal of their parents' retirement fund. As part of the War on Dependence, social workers would help younger people set up various retirement savings plans. Each person who had a workable retirement savings plan could stand tall in the knowledge that he would not become dependent on Social Security.

All our social programs should have as their goal helping people become independent of government assistance.

One of the sad byproducts of the endless and hopeless War on Poverty is that self-sufficiency is no longer valued as it once was. Someone is considered a fool to turn down government benefits if he can “qualify” for them. What’s more, someone who gets a first-rung-on-the-career-ladder job at a low wage still feels bad about himself. Instead of being proud of being independent, he sees that he is still in relative poverty, and that is what’s bad. People who are supporting themselves, no matter how meager their circumstances, should be encouraged to take pride in not being dependent. We should make self-sufficiency the goal, the prize, the honor.

Social workers could help farmers who accept government subsidies find ways to become self-sufficient so they can be respected for making an “honest living” without help. Businesses that sold products abroad without help from the government would be recognized and patronized. Similarly, industries that did not ask for protectionist tariffs imposed by the government, but could stand on their own, would be new American heroes. Students who found a college they could afford without government help would be seen as more resourceful and valuable future employees. Colleges that keep themselves in business without whining for more government money would be seen as more competent than those that couldn’t manage on their own. This turn of events might even drive down the cost of college. Primary and secondary schools that focus on helping their graduates prepare for the real world would also be recognized and respected; the ability of their graduates to avoid dependence would be the final measure of the schools' own worth.

Success would no longer be a nebulous and ill-defined chimera, but would be identified as the ability to support oneself and one’s family. Families that took care of their own (whether the young or the elderly) without government assistance would be honored. People with disabilities would be helped to develop as much independence as possible, and honored for every bit they could obtain — instead of scorned for their efforts to contribute to their own support.

Industries that did not ask for protectionist tariffs imposed by the government, but could stand on their own, would be new American heroes.

Oddly, poverty could, in a sense, be eliminated overnight by simply writing checks of the proper amount to all the poor. It would help if all our programs of assistance were rolled into one program, so we could keep track of how much we were giving to each person. We might find that we had already eliminated poverty — that the cash value of all the various forms of assistance we provide to the needy would total enough to give them an income over the poverty line. But few people really believe, deep in their hearts, that mere dollars will eliminate the problems of the poor.

Independence is the solution — and we need to return to the habit of valuing it. There is still truth in the old proverb, “Give a man a fish, and you feed him for a day. Teach a man to fish, and you feed him for a lifetime.” That means focusing our efforts on reducing dependence instead of fostering it. A War on Dependence would be infinitely better than the old, unwinnable War on Poverty.






Libertarian Aphorisms


Running your own life is difficult. Running someone else’s is impossible.

There is no such thing as safety, but there is such a thing as courage.

The job of business is to make life livable. The job of government is to make business impossible.

If war is hell, then pacifism must be heaven.

Assuming that to want something you must also want to pay the price to get it, everyone always gets what he wants in a free market.

Taxes are the price we pay for living in a society that has not yet become truly civilized.

Wealth is what society gives to the owners to compensate them for bearing the risk of large-scale failure.

Democrats sacrifice the healthy to save the sick.

Government: ambitious thugs who proclaim themselves saviors — which is precisely what you would expect ambitious thugs to say.

The difference between libertarians and conservatives? Libertarians have more fun.






What Not to Do About Bullying


The Weinstein Brothers are champions at using controversy to garner publicity for their films. Recently they used that skill to the hilt to stir up public interest in their documentary Bully, an intimate look at the problem of bullying in public schools. First they inserted enough explicit language to earn an R rating. Then they complained vociferously to the rating board that the R would prevent the most important audience from seeing the film. The controversy was reported in the media, and reviewers like me sat up and took notice. I viewed Bully the day it opened, fully expecting it to knock Hunger Games off its throne as the most talked-about film of the season.

Katniss Everdeen need not worry; she still owns the throne. Bully is a good documentary. It might even be an important documentary. Those who have experienced any form of bullying will probably find it a very moving documentary. But I don't think anyone is going to be flocking to see it, regardless of the rating. It will probably end up in many school libraries, however, and many students will see it in their social studies classes for years to come.

The film follows the stories of five students from four states (Georgia, Iowa, Oklahoma, and Mississippi) who have experienced bullying for a variety of reasons. One is small for his age. Another suffers the physical effects of a premature birth. Yet another is a lesbian. Two have committed suicide, and are represented by interviews with their parents and in clips from home movies. The filmmakers were able to get surprisingly candid scenes of students abusing these kids on the bus, in the cafeteria, and in the hallways, as well as candid scenes of administrators, teachers, and parents — people who often make fools of themselves as they discuss the problem. I often thought to myself, "Didn't they realize they were being filmed?" I applaud the filmmakers' ability to elicit such open, unabashed realism. No one in this film was on his best behavior.

Except, perhaps, for the school administrators. They fairly glow in their obvious attempt to put their best feet forward and shine for the cameras. The principal at Alex's school stands in the hallway as the students arrive from the bus. One boy comes to her with his hand on his head and reports, "A kid slammed my head into a nail!" The principal puts on her sweetest, most understanding voice and says, "I'll bet you didn't like that, did you?" Another boy walks past, visibly upset, and she says to the camera, "He's such an unhappy child." In another scene she forces two older boys to shake hands as a way to resolve what seems to be a longstanding battle. The bully extends his hand, but when the victim of his bullying is reluctant to do so, she chastises him, saying that he is now the bully and that it's his fault. "Why don't you just get along?" she coos after the bully leaves. "He was willing to shake your hand. I think the two of you could be friends." The boy, nervous and confused, responds, "The cops told us to stay apart." This principal hasn't a clue. Not a clue.

When are parents going to wake up and realize that the public school system itself is broken beyond repair?

Another school administrator tries to whitewash the issue. "Is it an ongoing problem in our school?" she asks rhetorically. "No it is not," she answers herself, vigorously nodding her head as she says it. I would love to have an expert explain that body language! Another administrator offers a similarly Pollyannish response when parents complain about the abuse their son is experiencing on the bus. First she says, "Buses are notorious for abuse," and offers, "I can put him on another bus" — rather than trying to solve the problem. The mother suggests (wisely, I might add), "When I was a kid the bus driver pulled over and wouldn't take anyone home until everyone settled down." Isn't that kind of an obvious policy? But the principal simply says, "I've ridden Bus 54, and they were good as gold." Several of us in the audience laughed out loud at that idiotic response. Well, duh! You were on the bus! Of course they behaved!

The film does a fine job of revealing the problems experienced in the communities it covers, but it offers few satisfying solutions. Devon, who was bullied for four years, says in an interview, "I stood up for myself, and they leave me alone now." But when Ja'Maya tries this technique, she ends up in juvenile jail for several months. Alex simply gives in to the abuse, acknowledging sadly, "At least it's attention. I don't mind it that much." His story is perhaps the saddest, because he feels so lonely in addition to being bullied. He just wants a friend. Telling school authorities and the police also accomplishes little. When a vice principal asks one of the bullied boys why he didn't speak up sooner, he responds, "Because you didn't do anything about it last year when I told you that X sat on my head."

Only one family makes what I consider the right choice: they take their daughter out of school and teach her at home. Why would any thinking, caring parent subject a child to this kind of torture day after day? When are parents going to wake up and realize that the public school system itself is broken beyond repair?

While Bully is a good movie, it is hardly a great one. My biggest complaint is that its scope is so limited. These children all lived in similar small towns in the South or Midwest, and all seemed to come from similar poor socioeconomic backgrounds. That is hardly a representative sampling. An estimated 13 million children experience bullying every year, representing every region of the country, every size of community, and every socioeconomic group. Moreover, children used to find a safe haven after school hours. But bullying has left the schoolground and now occurs increasingly at home, especially through the internet. Children simply can't get away from the painful words and public gossip. None of this is highlighted in the film. Nevertheless, I consider Bully must-see viewing for anyone who has a child attending a public school.

What made me saddest as I watched this film was not the funeral of little Ty, one of the boys who committed suicide, or even watching a boy ram Alex's head against the back of a bus seat. It wasn't hearing Alex's principal say, "Boys will be boys." It was the sight of Alex's own mother browbeating and chastising him for not telling her that he was being bullied at school, followed by his sister joking that she's embarrassed at school because none of her friends like him. He's surrounded by bullies at school, and by well-intentioned bullies at home. I just wanted to wrap my arms around that little boy and whisk him away from all of them. All of them.



Editor's Note: Review of "Bully," directed by Lee Hirsch. Weinstein Bros., 2011, 99 minutes.





The Scorekeeping Society


The aftermath of Hilary Rosen’s statement that Ann Romney “hasn’t worked a day in her life” has focused mainly on whether or not “mothering” is considered “work.” The Obama administration has fallen all over itself in an attempt to gain distance from Rosen’s statement, and Rosen herself has issued an apology. In fact, it would be hard to find anyone who would seriously assert that raising children and keeping house doesn’t require effort.

But the commentators are missing the real issue here. It isn’t how Ann Romney spent her time that bothers Rosen and others like her — it is the fact that Romney wasn’t paid by an outside source for her services. If she had operated a daycare center from her home, taking care of someone else’s five children for pay, or if she had gone into other people’s homes to clean and organize and drive carpool, no one would have suggested that she “hasn’t worked a day in her life.” It isn’t the nature of the work that angers them. The true, underlying objection to stay-at-home moms is that there is no way to measure the worth of their labor. We are a society that likes to keep score, and the way we keep score of an adult’s value is through dollars.

The truth is, most stay-at-home moms don’t stay at home. They are extremely active and productive. I was hoping Ann Romney would talk about some of the work she has done outside her home as well as how hard she worked inside her home raising her boys. She has worked as a teacher and as an administrator in many charitable organizations, particularly within the Church of Jesus Christ of Latter-day Saints. The Mormons are a lay church, meaning they have no paid ministry. As president of a congregation’s Relief Society, for example, a Mormon woman is responsible for ministering to the spiritual, social, and welfare needs of hundreds of families. She oversees weekly classes, coordinates compassionate service projects, counsels with women who are struggling with various problems, and delegates duties to an army of women who watch over the flock, all through voluntary service. In many ways, her job is similar to that of the director of a Red Cross or Salvation Army unit in a neighborhood that experiences the equivalent of a home fire every week. But because she is not paid for her services, there seems to be no acceptable way to measure the value of her work. And without a unit of measurement, the “score” is assumed to be zero.

For many years Ann Romney served as the teacher of a rigorous daily scripture-study course for high school students. The program is administered by the worldwide Church Educational System, which requires teachers to attend monthly faculty meetings and in-service training sessions. It also requires intensive daily study and preparation on the part of the teacher. True, a “real teacher” (i.e., “salaried” teacher) would spend the entire day leading perhaps five sections of the same course, instead of just one hour-long session. But the preparation required to teach a class is the same for one section or multiple sections. Ann Romney worked just as hard at just as respectable a job as any employed teacher. But she received no credit in the eyes of the world because she wasn’t financially remunerated. There was no way to keep score.

Romney is also an athlete. Despite being diagnosed with multiple sclerosis, she competes as an adult amateur in equestrian dressage at the national level. I suppose if she were a paid athlete, we would consider this a “job.” Certainly she puts in as much practice and effort to reach the national level as a professional athlete might. But since she is an unsalaried amateur, this is considered just one more example of Ann’s little hobbies as a wealthy stay-at-home mom. She has dedication and success, but it isn’t really “work,” is it?

This obsession with scorekeeping has invaded our school system as well, where it threatens to stultify the naturally creative minds of the young. Bush’s “No Child Left Behind” program has turned many of America’s children into mush-headed test-takers. “Teach to the test,” once the hallmark of the worst kind of teaching, has become the new mantra of public school education. With jobs and funding at stake, school administrators chastise teachers who introduce art, music, or even spelling (which isn’t on the standardized tests) to their students. “Get those scores up!” administrators fairly bellow, and that means focusing only on the tasks that are tested. It’s all about keeping score and bringing in the money.

In an advanced economic system, where money and exchange form the basis of measuring work, it is very easy for the capitalist to start viewing the world narrowly in terms of “making money” instead of “making useful goods and services.” But value is determined by much more than money. Interestingly, the people who characterize stay-at-home moms as “not working” because they don’t get paid are often the same ones who try to eliminate scorekeeping in Little League and other youth sports. “Children should play for the love of the game!” they proclaim.

I think they have this backwards. Games require scorekeeping. Goods and services require a medium of exchange. But caring for family, friends, and community can be done for the rich reward of merely a hug. Women who rear families and care for their homes do not need a paycheck for validation. Let’s put scorekeeping back on the soccer field, and take it out of our homes.






The Craze to Canonize Mike Wallace


Day after day, the big news — according to Yahoo, CNN, the old TV networks, and all the self-important newspapers — has been the sad demise of Mike Wallace, at the young age of 93.

Always he is called the “legendary” Mike Wallace, as if “legendary” were a title like Grand Exalted First Convener of the Muskrats’ Fraternal and Benevolent Association — an honorary title that for form’s sake must always be prefixed to the real name. Sounds good, means nothing — except, I’m afraid, to the fraternity of “newspeople” who put out this stuff.

Fools like Wallace were, and are, appointed to parrot the common opinions of a small circle of “opinion leaders.”

The man was a legend? He was a television “correspondent.” How can a man who read the news (or purported news) be the subject of some mysterious legend? If “legendary” is taken literally, it can only mean that to most Americans, his very being was mysterious; until now, most living people didn’t even know he existed. But to read the “news,” one would think that American peasants sat around the hearth fire every night, spinning yarns about Mike the Great.

Fortunately for the nation’s sanity, one of Wallace’s real, uh, um, accomplishments has now resurfaced. It’s a documentary about homosexuality that he concocted for CBS in 1967. In it, he opined:

“The average homosexual, if there be such, is promiscuous. He is not interested in nor capable of a lasting relationship like that of heterosexual marriage. His sex life — his love life — consists of chance encounters at the clubs and bars he inhabits, and even on the streets of the city. The pickup — the one night stand — these are the characteristics of the homosexual relationship.”

That was grossly and offensively untrue. It is untrue now; it was untrue then. It was known to be untrue to people who actually knew any gay people (which, however, Wallace did), or had read a book that went deeper into the mysteries of sexuality than, say, the collected sermons of Pope Pius XII. Wallace later insisted:

“That is — God help us — what our understanding of the homosexual lifestyle was a mere twenty-five years ago because nobody was out of the closet and because that’s what we heard from doctors.”

But supposing it’s true that “nobody was out of the closet” (and some of the people whom Wallace encountered were out of the closet), and also that this is what everyone understood (which it wasn’t): why, if Wallace knew nothing more than the falsehoods that everybody else — i.e., “we” — appeared to know, did he appoint himself to announce those falsehoods from the papal chair of CBS, as if they were news?

Here’s what’s going on. Fools like Wallace were, and are, appointed to parrot the common opinions — that is, the callow beliefs, misinformed notions, strange hunches, weird superstitions, and flat-out hysterias — of a small circle of “opinion leaders.” That is the “we.” These people are not particularly well educated. They know little or nothing about history. They aren’t curious about it, either. Their knowledge of science (“what we heard from doctors”) is extremely limited and completely unskeptical. If they read a book, they look for something that will confirm their pre-existing views, especially their assumption that everything important or “legendary” will naturally pertain to people like themselves. They know little of normal human life, by which I mean not only the vast field of sexuality but also such commonplace and obvious matters as how money is made, the nature of governmental power, normal forms and practices of religious belief, the fact that people actually enjoy getting drunk (I have an academic book on my table that attempts to explain why people drink), the various reasons why people value guns and other means of self-defense, the reasons why young families might want to live in the suburbs instead of some “green” high-rise, the resistance of parents to new-fangled notions of education, and above all, the sullen resistance of normal people to being “educated” by dopes like Wallace.

Yet these people — the “we” — are advertised (by themselves!) as heroes and “legends.”

You can say this about every broadcasting or newspaper “legend” you find. Cronkite. Murrow. And the next one who dies. He’ll get no eulogies from me.






Beyond Race


A hundred years from now, historians will probably cite Barack Obama’s election to the presidency as a momentous victory in America’s struggle against racism. The arrival of the first Africans in Jamestown in 1619 and the election of an African-American to the presidency in 2008 make a nice pair of bookends. But of course, only in Hollywood is a story line so tidy. Neither history nor the struggle against racism has reached an end. So: What will those future historians look back upon as the victory against racism that followed the election of Barack Obama? Here’s a thought.

The election of President Obama was caused, at least in part, by the fact that many of his white supporters felt good about voting for him because he is black. This feeling was not a secret. It was much discussed at the time. And there was nothing mysterious about it. Many white Americans felt, and still feel, a pang of guilt and shame about the historical treatment of black people in the United States. To cast a vote for a black candidate was a way of alleviating the sting, a means of atoning for the collective sins of the past. To support a black candidate was a way of saying, “I am not a racist.” The wave of euphoria created by this public affirmation swept over millions of enthusiastic white voters.

Caution is called for here. The subject of race is sensitive. To be fair, there were probably thousands of white voters who did not take into account Mr. Obama’s ethnicity when they voted for him. There were probably thousands more who voted for him in spite of his race. However many voters there were in these two groups, the following is not about them.

To accuse someone of harboring racist motives is a serious matter. Nevertheless, the truth must be told, and it is this: the extent to which a voter selects a candidate based on race is the extent to which that voter is guilty of racial bias. Racism is, after all, discrimination on the basis of race. To vote for or against any candidate because of that candidate’s race is, simply put, racist. For example, to vote for John McCain would be one thing, but to vote for him because he is white is quite another. Sorry for being so blunt, but there it is.

Now imagine how the current presidential campaign season would look if the Democratic incumbent were white and all other variables were unchanged. Bear with me.

The national debt is many trillions and rising fast. The annual deficit is a trillion-point-whatever. The Fed is creating money (not wealth) like there’s no tomorrow. The unemployment rate is at some painfully high level and falling oh-so-slowly. Record numbers of twenty- and thirty-somethings live with their parents. Millions of mortgages are under water. College debts are astronomical. Federal entitlement programs are fiscal time bombs. Food grows more expensive weekly. Gas prices are through the roof. The EU is slowly fracturing. There is no sign of peace in the Middle East. The Arab Spring is a mess. Iraq is sliding back into authoritarianism. The Taliban is running out the clock. Iran is nuking up like North Korea. Russian democracy is dying in the cradle. Mexico is overrun by drug gangs. China is our manufactured-goods crack dealer and T-bill loan shark. There’s more, but you get the picture.

Now ask yourself: in this set of circumstances, would a white Democratic incumbent have a prayer of reelection? (Hint: No.) Why, then, is the president doing so well in the polls? I’ll say it: President Obama’s fairly rosy reelection prospects are the result of the same racial bias that put him in office in the first place. For most African-Americans, not to vote for him is out of the question. For many guilt-ridden white voters, not to vote for him would just be too painful.

Still, there is hope. A hundred years from now, historians may look back at the election of 2012 and say that the American electorate closed its eyes to race, endured the pangs of guilt, chose to do without the second wave of euphoria, and won a major victory in the struggle against racism by making Barack Obama a one-term president.






Obamacare and Judicial Activism

 | 

Last week the Supreme Court of the United States heard oral argument on whether to overturn Obamacare. I had written previously in Liberty that I suspected Obamacare would stand, estimating a mere 1% chance that this vile, disgusting step toward socialized medicine would be struck down.

But amazingly, Obama’s Solicitor General, who argued the case, was, by most accounts, totally incompetent. He got so tongue-tied that he had to be verbally bailed out by Justice Ginsburg — several times. He could not articulate a limiting principle for where the powers of government would stop if Obamacare stands. This frightened some of the justices (although, in fairness, no such limit can be articulated, because Obamacare is a slippery slope toward socialism).

Most importantly, Justice Kennedy said things suggesting that he would probably vote to strike Obamacare down. Kennedy is the moderate justice who holds the crucial swing vote between four liberals (all of whom are thought to support Obama’s health care bill) and four conservatives (who are believed to oppose it). So the legal community now suspects that Obamacare is doomed. The so-called “individual mandate” is most likely going to die, and the entire convoluted, ungodly abomination might get dragged down with it, thus ending America’s nightmarish experiment with socialized medicine.

This is great news for libertarians and bad news for President Obama.

How did Obama respond? This is how: by holding a press conference in which he bullied the justices, threatening them with the charge that overturning his law would be “judicial activism” and noting that the Supreme Court is not elected whereas Obama’s Congress, which narrowly passed his healthcare plan, was elected. His statement contains two glaring flaws.

1. Yes, Congress is elected and the Supreme Court isn’t. That is the beauty of the Founding Fathers’ scheme, that the rights of individuals are safeguarded by courts which do not answer to the whims and emotions of the hysterical and easily manipulated masses. Yet voters had sent a clear message that they did not want Obamacare passed, when they elected Senator Brown of Massachusetts. The Brown election was widely viewed as a referendum on Obamacare. It was an election in which a Tea Party candidate won in a strongly left-leaning state. The bill only passed because of procedural maneuvering by the then-Democratic House. The 2010 election of the Tea Party House was a resounding rejection of Obamacare by the American people. Once again, Obama has a mass of facts wrong.

2. The practice of “judicial review,” the name for courts overturning unconstitutional laws, dates back to the famous case of Marbury v. Madison (1803). Since that case was decided, it has been well established that the courts have the power to overturn laws that violate the Constitution.

It is true, of course, that conservatives often bemoan “judicial activism,” and now Obama is bemoaning it. So what is the difference between judicial activism and judicial review? Is it merely that if you like it you call it judicial review and if you dislike it you call it judicial activism?

I do not believe that’s the truth. I would offer a deeper libertarian analysis: the Constitution of the United States was designed to limit the powers of government and protect citizens from the state, as a reaction by the American Revolutionaries to the tyranny of the British Empire, which they had recently defeated. Democrats love to say that the Constitution is a “living document,” which means that the Constitution changes to reflect the desires of the public (which, they believe, have become ever more leftist since the American Revolution). But the meaning of the Constitution is clear, and it does not change. The argument to overturn Obamacare comes from the fact that Congress has only the enumerated powers given it by the Constitution. Obamacare sought to use the Commerce Clause, which gives Congress the power to regulate “interstate commerce,” in order to effect a partial nationalization of the healthcare industry. But as I argued before, and as Justice Kennedy implied at oral argument, this is far beyond what the Commerce Clause and the cases interpreting it explicitly permit.

So if the Supreme Court overturns Obama’s health care plan, it will be not judicial activism but judicial review, which consists of faithfully conforming the law to what the Constitution allows. It is judicial activism when leftist judges follow the philosophy embodied in the legal theories called “legal realism” and “critical theory.” These theories hold that there is no such thing as an objectively correct or incorrect interpretation of the law, and therefore a judge is free to rule as his or her subjective feelings on morality and justice dictate (and note that somehow these feelings are almost always Marxist or leftist feelings).

Critical theory, which explicitly attacks the legitimacy of “legal reasoning,” is hugely popular on many law school campuses. Many of the lawyers and judges of the future may buy into it. But when the Supreme Court rules on Obamacare in June of this year, I hope it will be clear to the Marxists that they don’t run America quite yet.






State Tests vs. School Choice

 | 

A few months ago, Richard Phelps attracted attention with an article in The Wilson Quarterly called “Teach to the Test?” Its argument is that “most of the problems with [school] testing have one surprising source: cheating by school administrators and teachers.”

Last week, an investigative report published in the Sunday edition of the Atlanta Journal-Constitution found indications of standardized test cheating in school systems throughout the US.

Certainly cheating of various types is a big problem in education. But it is not really that surprising. Where else are the highest-stakes evaluations left up to the very individuals or groups being evaluated? Yet these articles proceed from the unquestioned assumption that state tests are an appropriate way to hold schools accountable for quality. For instance, Richard Phelps wrote, “Without standardized tests, there would be no means for members of the public to reliably gauge learning in their schools.”


I agree that the purpose of education is to increase academic skills. I agree that tests ought to be used to determine what students have learned. I agree that more learning is better. I do not agree with folks who say that testing is bad and that schools should not give tests because that stifles teacher creativity. I do not agree with the proposition that tests can’t measure what is important in education.

Neither do I agree, however, with the use of state-constructed tests to attempt to hold schools accountable for quality. It has taken me several years to come to this position. I have three main reasons.

First there is the issue of alignment. Whatever the state chooses to put on the test becomes, in essence, the required curriculum of all the schools in the state, even if it is wrong. The state tests are wrong both for what they leave out and for what they include. For example, state tests for elementary age students in reading and math ignore fundamental areas of the curriculum. I refer to accuracy and fluency in decoding the meanings of words, in the statement (memorization) of mathematical facts, in mathematical calculations, and in spelling. State tests simply don’t bother to measure these pillars of an elementary education, even though they are critical to future educational success.

I run six charter schools, which, thanks to our use of a trend-bucking curriculum called Direct Instruction (DI), mostly achieve better test scores than the school districts in which they reside. DI is a specific, scripted, sequential elementary curriculum (grades K through 5) that takes much of the guesswork out of teaching. The lessons are carefully crafted to be easily understood, build only on what has been taught in earlier lessons, and prepare students precisely for what is to come. There are programs for reading, math, spelling, and writing. All but the very lowest special education students can learn from these programs and emerge from elementary school with average or above-average skills. DI is hated by the progressive educators at universities, but we love it, and so do our students and parents.

Curricula such as DI that focus on bringing all the fundamental student skills to mastery (including the ones not tested) must do so on top of teaching the things that are measured on the test — while other schools focus all their efforts on the test material. A majority of American elementary schools no longer teach spelling, for example, simply because it is not measured on the state tests. While learning how to spell is an essential skill, the state tests have pushed it out of the curriculum. Not to mention all the other critical content not tested and no longer taught.

Conversely, state tests focus strongly on a number of things that, although they sound good, are not skills to be taught but attributes of intelligence that we desire. These attributes are such things as the ability of bright elementary students to make inferences from unfamiliar texts, to write interesting imaginative stories, and to find creative solutions to unique word problems in mathematics.

These attributes, and their application, are not an emphasis of the very strong DI elementary curriculum. But if schools that use DI, such as my own, taught only what is in our curriculum (what kids need) and ignored the less relevant test material, they would get lower state test scores and be branded as poor schools. Schools ought to be able to use their own tests to measure what their own curriculum plans to teach, and be evaluated on how well the school does what it claims it will do. Parents, of course, could select schools according to the nature of their claims as well as their performance.

Second, people forget important facts about state tests. One is that the results have no consequences for the children. Another is that these are children taking these tests. Children are subject to wide swings in their performance, often depending on testing circumstances. In our schools we have found children who have been well taught but who for years have failed the test. Yet they can reach not only “proficient” but “exceeds proficient” if their teacher sits next to them and makes them read the test aloud and gives them breaks when they get tired. Essentially, we are making certain that they actually do their best on what to them is a very long test. This is not cheating. These practices are specifically allowed by the state rules for students who need them; they are called an “accommodation.” And it is an appropriate accommodation. It just shows the best that the student can do. Guess what? Children don’t always do their best. Sometimes they just guess their way through the test to get it over with. If those children go to another school, where no one they know or care about is monitoring their test performance or where they are allowed to do fun stuff when they are “done,” they will probably turn in a failing score the next year.


Third is the issue of students’ ability. Obviously, the more able students are, the easier it is for them to learn. The less able they are, the harder the teacher and the school must work to teach them. Scores on state tests are as much a measure of how smart the student body is as they are a measure of how well the teachers teach. It is ridiculously unfair to ignore this fact and proclaim that high test scores mean a school is good and low test scores mean it is bad. That would be true only if the student bodies of the schools were evenly matched in IQ — which is never the case. It is a heavier lift to raise test scores in a school that enrolls many students with low ability or learning difficulties, and until we begin to measure the weight of the load, we cannot claim to know who is stronger. If we expect teachers and administrators to want to work with populations that are below average in some way, we have to stop proclaiming that those who teach the smarter students are better teachers, just because their students get higher test scores.

We would be far better off if the states stopped giving their tests, instituted more school choice, and left it up to schools to find a way to prove they were doing a good job for the consumers — just as happens in every other service industry. We could do it easily in our schools, without a state test. If we gave aligned end-of-year final exams for each of our DI programs and shared the results with parents, they would be blown away by what we teach. Few students outside our schools could match that performance. That’s how you prove quality, not with bogus, we’ll-decide-what’s-important-to-learn state tests.






The Three R's: Reading, Reading, and Reading

 | 

“People asked, Should directors also be able to write? I said it was more important that they be able to read.” — Billy Wilder

If you know any teachers, you’ve probably had many opportunities to participate in a certain kind of conversation. It’s the conversation that begins with the teacher saying, “What I can’t understand is . . .” This is followed by one example, then another example, then a long list of examples of incomprehensible things that students do, or fully comprehensible things that they cannot seem to comprehend.

Yesterday I heard from a friend of mine, a teacher, and I made the mistake of asking how her class was going. This is what she wrote in reply: “I like my class. I like my students. In fact, I like them even better than the class. They’re smart; they’re unprejudiced; they’re charming and energetic. What I can’t understand is why, since virtually all of them are native speakers of English and have taken college prep courses, they still think that ‘most’ is the same as ‘almost.’ They think that to ‘service’ someone is the same as to ‘serve’ him. The difference between ‘prophecy’ and ‘prophesy’ is invisible to them, as is the difference between 'lose' and 'loose.'

“And I don’t understand why, since many of them are the children of immigrants and have no difficulty ‘relating to’ at least two languages on the level of daily conversation, much of what they read in English strikes them as an embarrassing foreign language. When they read aloud, they stop and wonder about every ‘foreign’ word, most of which aren’t foreign at all. ‘Interrogate’ turns into ‘introcate’; ‘Jonathan’ is ‘Johnnythan’ (except for their own friends named Jonathan, whose names present no difficulty, because they don’t have to be read). And heaven help you if you’re teaching about Herodotus.

“I teach a senior college prep class, and we have readings from the Bible. One of my paper assignments asks them to discuss a topic in Genesis. The assignments say clearly, ‘Do not underline or italicize the word ‘Bible’ or the names of its various books.’ As a result, their papers start in this way: ‘In The Book of Genesis, which is part of The Holy Bible, god says to Abruham . . . .’

“In short, my students are bright young people who are incapable of reading, in the sense of noticing what they read.”

I’ve quoted my friend’s message at length, because I think it identifies a general problem. This isn’t basically a writing problem or a speaking problem; it’s a reading, or should I say a not-reading, problem. By “reading” I don’t mean “reading messages on the internet, or your telephone.” I mean reading something that makes you comfortable with complex linguistic resources, employed in complex rhetorical contexts. And I mean focusing your attention on it, not just looking at it without noticing any more than its general current and tendency.


Here’s an example. I happen to be reading a book about American religious beliefs and customs. It’s written by two statistically fascinated social scientists, but in this case, statistical fascination doesn’t imply bad writing. They are perfectly competent. They have a good idea of the linguistic resources at their disposal, including the many means that good readers learn to simplify or complicate the sentences they write.

So, at one point in their book American Grace: How Religion Divides and Unites Us, Robert D. Putnam and David E. Campbell say the following about the results of their statistical research on marriage and religion: “The propensity for marriage within the faith is much higher among more religious people — not surprisingly. We can’t say for sure whether low religious commitment is a cause or a consequence of intermarriage — or more likely, both cause and consequence — but intermarriage is strongly associated with lower religious observance.”

I’m concerned with that phrase “both cause and consequence.” It’s a shorthand way of saying “each is both a cause and a consequence of the other.” Long ago, writers of English learned from writers of Latin that you don’t need to say all that. As long as you’ve set up a sequence of parallels, “cause . . . consequence,” which are both complements of “is,” you can follow it with another complement, “both cause and consequence,” and that’s it — you’ve cut to the chase.

This is an almost trivial example of linguistic competence. But I would fall off my chair if I found one of my sophomore college students using this technique. It’s not that they’re dumb; they’re very bright indeed. But “both cause and consequence” is the kind of thing you don’t find when you’re listening to television, or reading messages online; and that, I believe, is most of the reading and “serious” hearing that students do.

As the president is always saying, let me be clear about this. I read online messages myself, sometimes all day and all night. Some of the most interesting, individual, and incisive writing I’ve ever read has appeared in online commentary. Several years ago, I paid enthusiastic tribute to writing done in online communities ("The Truth vs. the Truth," Liberty, Sept.-Oct. 2003). But it appears that online reading (except, of course, when enjoyed on Liberty’s website) doesn’t give people adequate experience of what can be done with the sophisticated written language of the normal serious book. The decline of complex verbal experience began with television, which like the internet ordinarily uses a very restricted variety of words and sentence patterns. The internet accelerated the decline by rewarding simplified communication with simplified but usually agreeable and often immediate response.

When people focus on complex sentences and arguments, they learn to pay close attention to verbal cues — closer attention than they need to pay to words that come naturally in conversation. I know a highly intelligent person who cannot remember, and does not care, whether her best friend’s name is “Katherine” or “Catherine.” Apparently the first name doesn’t appear in the person’s email address, or my friend doesn’t pick up on the cue — and why should she? It makes no difference in either email or direct conversation. I know many intelligent people who have everyday fluency in two languages but haven’t the faintest idea of how to construe a really complicated sentence, in either one.

The problem is, if you don’t notice what you read, or if you don’t read anything much, you may not notice what you say, either. While the civilized world, and even the world of television news, is saying, “I saw it yesterday,” you may keep saying, “I seen it yesterday.” One of the brightest people I know keeps saying, “I coulda swore that program woulda worked.” And while everyone who reads words, notices them, and reflects on them tries to avoid trite statements, inaccurate diction, and unintended implications, the nonreader will keep saying, “I was literally blown away by the judge’s disinterest.”

A long time ago, before the internet was fully in use, by everyone, all the time, I was visiting my friends Muriel Hall and Mary Jane Hodges, who lived in a small town in New England. We were going out to eat, and on the way we discussed the bad things that happen in restaurants. I mentioned the repulsive way in which waiters ask, “You still workin’ on that?” while they’re trying to grab your plate. After all, who wants to think of dinner, especially a dinner you pay for, as some kind of work you have to do?

Muriel and MJ looked at me in horror. “My God!” they said. “That’s what they say in California?”

Oh yes.

“Well, let’s hope it never happens here.”

A year later, when I returned to their little town, I found them sunk in gloomy meditations. “It’s here,” they said. “You still workin’ on that?”

Yes, it was there. And here, there, and everywhere, it has remained. “You still workin’ on that?” is now what nonreaders call an integral part (as if there were another kind of part) of the ritual of dining. It’s chanted everywhere — as difficult to avoid as “How Great Thou Art” in a Methodist church. Every waiter I meet is a college student or a college graduate, so we’re not talking about linguistic behavior that is innocently illiterate. No, this is illiteracy practiced as a faith, a faith that, having resulted from a long process of education and social reinforcement, has become second nature to its devotees.


Apparently, authors no longer wait tables while they’re trying for their first big break. Or if they’re still doing that, you won’t want to read anything they eventually get published. Like other Americans, they’ve lost their ability to pick up cues — with one exception. They are sensitive to the subtlest possibilities of offending anyone “politically.” The standards of political correctness have them in awe. But normal linguistic cues, the kind that come from reading, not social suasion — that’s another matter.

One cue that most people used to notice was the red light that comes on in your brain just before you say something that’s just naturally offensive. Used to. When “that really sucks” is taken as acceptable in all social circumstances, you know that the alarms have shorted out. Those who know me understand that I have warm respect for profanity, appropriately applied; but otherwise-useful terms are merely disturbing when they’re employed without anyone’s appearing to care what they suggest.

As I write, a man is wandering down the street outside, discussing in a loud voice (why not?) his intimate problems with his girlfriend. There are two possibilities. (1) He is using a hands-free cellphone. (2) He is insane. The salient fact is that these days, you can’t tell the difference. In the ranter’s head, no warning bells can ever ring. He might avoid the path of a passing car, but language presents no risks. Not in his perception, and not in reality, either. How often will anyone accost him and explain how unfavorably he affects the neighborhood? And if anyone did, he probably wouldn’t pay any attention. He wouldn’t pick up the cue. He’d think that the other person was crazy. Or he’d just ignore the whole thing.

That’s what happens when you make some instructive response to “You still workin’ on that?” I used to try, “Yes, I’m still eating.” Then I tried, “Yes, I’m still eating.” Then, “Do you mean, am I still eating?” On and on. The answer was always, “Right — I see you’re still workin’.” You see what I mean about missing cues.

So far, I’ve been picking on waiters, nameless persons in the street, commonplace users of the internet, normal victims of high-school education, and so on. Now I’m going to pick on the president.

We can’t blame President Obama’s linguistic failures on the internet, although we might go after his high school, or the doting folks who brought him up to think that whatever he said or did would establish some new standard of excellence. But whatever the cause, his verbal warning system never got turned on. I suspect that he would be ten points higher in the polls if he’d had the sense not to say that his nomination for the presidency would stop the oceans from rising, or that people who weren’t inclined to vote for him were clinging pathetically to their guns and their religion, or that his grandmother made racist comments, or any of a hundred other offensive things he’s said, which he simply had no clue were offensive.

There was a new one, just the other day. It was the president’s statement — no doubt carefully planned, but certainly not well meditated — about the killing of Trayvon Martin, a young man who happened, like the president, to be of African descent. Obama expressed his sympathy by saying, “You know, if I had a son, he'd look like Trayvon.” In other words, all black men look alike to President Obama — because the only thing that made Trayvon Martin look like him was his race and gender. This is patently offensive, but the president didn’t have a clue that it was.

Only a person who doesn’t read many words — real words, written by real writers, not reports from campaign agents or White House officials — or think about the meanings and implications of words, could possibly have made such an offensive and silly statement.

Why did he make it? Because “someone who looks like me” is a cliché used to elicit votes from people who share your ethnicity. Its cheapness and foolishness never dawned on President Obama.

Oh, but the president is an author himself! He must know language, and be thinking deep thoughts about it! Of course, it doesn’t take much reading to write a book, as any casual perusal of an airport “bookstore” will inform you. But let’s consider the possibility. Maybe playing golf and watching basketball aren’t really the president’s favorite pursuits. Maybe that’s just a lie, put out to cover his real though unmanly interest in reading good books. Maybe, during his speeches, what he’s really thinking is, “I wish I could get back to Tolstoy.” Maybe in his private restroom there’s a copy of Montaigne sitting on the toilet. Maybe he goes to bed early so he can curl up reading Shakespeare in the original Klingon.


Barack Obama wrote two books about himself and his opinions. But tell me, how many authors does he quote or mention in his speeches (which are long and frequent) or his interviews and known conversations? To which good authors does he allude? Madison? Voltaire? Thucydides? Elbert Hubbard? Mother Goose? Tell me, that I may be instructed.

None? Can’t think of any? The president’s intellectual context is television talk, stray bits of college texts, the oracles of dead “progressives” . . . even his legal education (witness his recent comments about the Supreme Court) didn’t seem to command his full attention. But context is all. If the context of your intellectual life wasn’t formed by reading books — serious, complex books — your signal system just isn’t going to work. Sorry. And the only thing you can do to fix it is (horrifying thought) to read a book.







Arab Spring, Winter for Christians?

 | 

In a recent piece, I suggested that the fall of a number of Middle Eastern dictators — most notably Hosni Mubarak of Egypt — a process actively pushed by the Obama administration and collectively dubbed “the Arab Spring,” has shown a remarkably ugly side.

One of the ugly features I noted was the removal, in the case of Egypt, of a regime that had been actively fighting the practice of female genital mutilation (the removal of most or all of the clitoris from adolescent girls). Some of our readers were offended by my piece, either thinking, somehow, that I advocated going to war with Egypt, or else shocked that I would dare to criticize the practice at all.

Of course, I was merely commenting on a dubious Obama foreign policy initiative — replacing a disreputable US ally by an unknown force, and hoping for the best.

Well, the situation has developed a more ominous aspect. The Arab Spring is turning out to be not only a winter for women, but also a winter for Christians. Several recent stories bring this to light.

Let’s begin by reviewing the results of the first round of elections for Egypt’s parliament. In a turn eerily reminiscent of what happened in Iran decades ago — when Jimmy Carter, a president as feckless as Obama, withdrew support from the Shah so that “democratic forces” could take over — the resulting elections were victories for hardcore Islamist parties. In Iran, once the Islamists consolidated their power, they created a state far more repressive and authoritarian than the Shah could ever have imagined. The consequence was the mass murder of political dissidents, people deemed “deviant,” and worshipers of religions other than Islam (Baha’is, Christians, Jews, and Zoroastrians). It was also a state quite supportive of terrorism abroad.


In the recent Egyptian elections, Islamists won two-thirds of the seats. And by “Islamist” I am not exaggerating. The Muslim Brotherhood, an extreme organization from which sprang Al Qaeda, won about 39% of the seats. But the even more extreme Salafists won an astounding 29%. Together, the two liberal parties (the Wafd Party and the Egyptian Bloc) won a pathetic 17% of the vote.

So much for the idea that waves of freedom and modernization are sweeping over the largest Arab country.

This should have come as no surprise, since earlier elections in Tunisia and Morocco saw Islamist parties win by large majorities. The results for Christians are ominous. The largest group of Christians in the Arab world — the Coptic Orthodox Church — resides in Egypt, where it constitutes 10% of the population. Mubarak, dictatorial bastard that he was, provided protection for them. He is now gone, and the Copts are at the mercy of the Islamists. Mercy, indeed!

Already reports have come in of the killing of Copts, such as the slaughter of 25 or more during a protest they staged in downtown Cairo recently.

The Copts are now deeply demoralized. If they do as the Muslim Brotherhood does — load supporters on buses and drive them to the polls to vote en masse (Chicago-style voting — maybe that’s why Obama supports the Brotherhood!) — they risk civil war. But if they do nothing, the Islamists will target them and slowly turn up the heat. As an American-based Coptic Christian put it, “They [the Copts] are a cowed population in terms of politics. They are afraid and marginalized.”

This is such a familiar pattern. The Islamists kill off or expel the Jews (if any are left by the time the Islamists take over); then they target other religious minorities (Baha’is, Zoroastrians, pagans, or whatever). The pressure then mounts on Christians.

This is no less than religious ethnic cleansing.

The Egyptian government has recently taken the necessary first step in setting up the apparatus to carry out religious cleansing. It has raided 17 nongovernmental organizations, including three American groups that are supposed to monitor the “progress” of “democracy” in Egypt — specifically, Freedom House, the International Republican Institute, and the National Democratic Institute. One witness to the raid on the Future House for Legal Studies said that a policeman taking part in it held up an Arabic-Hebrew dictionary he found and said it proved the organization was engaged in sabotage against Egypt.

One predictable result of the Egyptian war against minorities is happening already: an exodus of Copts to America. One story reports that thousands of Copts have come to America since Obama’s chosen “democracy” swept Egypt. The emigrants report growing levels of overt persecution and violence. One recent émigré, Kirola Andraws, fled to America on a tourist visa and applied for asylum. He was an engineer, but now works as a cook and a deliveryman in Queens. His story, unfortunately, is likely to prove typical.

The report also notes that already this year a number of Coptic churches have been burned down. Islamist-spawned mobs have rampaged against Coptic homes, stores, and church schools. Think of it as the Muslim Brotherhood’s take on Kristallnacht. Yet the US Commission on International Religious Freedom was recently rebuffed by the Obama administration’s State Department when it asked State to put Egypt on its list of countries that violate religious freedom.

This is only the beginning. Right now, the Muslim Brotherhood only controls the legislature, and it is still held in check by the military. But a very recent article reports that the Brotherhood is planning to run some of its chosen “leaders” for the presidency — something it had earlier promised not to do. Should the Islamists take over the executive branch, the military’s influence will rapidly wane, and Egypt will likely go the way of Iran.

The report observes that the military and the Muslim Brotherhood have been in a struggle for 60 years, with the military coming out on top, until now. The military controls about a third of the manufacturing industry in Egypt, for example, and so is not likely to surrender power easily. The Egyptian liberals, now seen to be a small minority, seem to be rethinking whether the military is at this point the main threat to them.


Whether the military will back down and let the Brotherhood take control is unclear. If the military reacts by dismissing the legislature, Egypt could be in for a protracted and internecine civil war. In either case, however, Christians can expect to be demonized and targeted by the Islamists.

Christians are also being targeted by Islamists in other countries besides Egypt. Nigeria — to cite one such place — recently experienced a wave of terror attacks against Christians, with at least 39 killed. Most of them died when Muslim radicals blew up St. Theresa Catholic Church last Christmas. Shortly thereafter a Protestant church was bombed as well.

Christians in Iraq and Syria have been fleeing, as violence directed at them increases. Since the US toppled Saddam in 2003, 54 Christian churches have been bombed in Iraq, and over 8,900 Christians have been murdered. The number of Christians remaining has of course dwindled, from somewhere between 800,000 and 1.4 million in 2003 to about 500,000 today. With American troops now gone, one suspects that this trend will only accelerate. In an interesting twist, Christians are fleeing other areas of Iraq and moving to the Kurdish-controlled region, because the Kurds have offered them protection. Yet there are Islamists even among the generally pro-Western Kurds, and Christians have faced some attacks in their territory.

There is in the end the law of unintended consequences, in foreign policy no less than in domestic policy. Progressive liberals — and even conservatives — should start paying attention to it. It is all well and good to desire an “outbreak of freedom,” but one ought to be careful about what one desires, as one might just get it. Many on the Left and the Right welcomed the “Arab Spring,” but it may not turn out to be an explosion of tolerant democracy, as it first seemed to them.

Lest any reader mistake this story for some kind of call to arms, let me make my view explicit: I do not advocate going to war against anyone. But should the Muslim Brotherhood complete its takeover of Egypt and continue its vicious religious persecution of the Copts, our high level of foreign aid to Egypt — $1.3 billion in military aid alone — should certainly be stopped. And this should be made clear to the Egyptians in advance.






Swimming Against the Tide

 | 

At the beginning of the First World War, Robert Frost wrote in “Mending Wall” (1914), “Good fences make good neighbors” — suggesting metaphorically that borders and boundaries help to prevent war and aggression. But in that same poem he acknowledged,

Something there is that doesn’t like a wall,
That sends the frozen-ground-swell under it,
And spills the upper boulders in the sun;
And makes gaps even two can pass abreast.

Nature herself, he said, works to break down manmade walls through the simple power of water finding cracks and breaking rocks. Nature doesn’t like boundaries.

Borders are good when borders are necessary. They are preferable to war. But more than a century earlier, weary from the destruction and expense of war, Benjamin Franklin recommended a wise foreign policy when he wrote: “The system of America is commerce with all and war with none.”

Business brings people together. I may not like your politics, your religion, your clothing, or your neighborhood, but if you produce something I want and I produce something you want, and if we have a justice system that protects our right to property, we will manage to get along, if only for the benefit of mutual exchange. War and aggression may provide short-term solutions to shortages, and walls may keep aggressors at bay. But commerce and free trade promote lasting relationships that increase prosperity and living standards for all. Understanding this simple fact could solve many of the current problems in the Middle East.

Commerce vs. war. That’s what I thought my review would emphasize when I headed out to see Salmon Fishing in the Yemen, an indie flick about a British fisheries expert (Ewan McGregor) who is hired to create a salmon fishery in Yemen. What a great new industry for an emerging nation, I thought. This is the way to be good neighbors and promote peace and prosperity — through commerce! Who needs war?

Sigh. This is what happens when I start writing my review before seeing the movie. Salmon Fishing in the Yemen is not about commercial fishing at all, but about a wealthy sheik (Amr Waked) who loves salmon fishing at his massive estate in Scotland and is willing to spend £50 million or more to be able to fish in Yemen. He isn’t interested in creating jobs and industry; he just wants to fish in his own backyard. Sheesh!

Nevertheless, Salmon Fishing in the Yemen is a wonderful little film, one that is well worth seeing. Salmon fishing is actually a metaphor for the uphill relationships presented in this funny romantic drama, which follows two couples who become unintentionally entwined. The fisheries expert, Dr. Alfred Jones, is the very prim and proper husband of Mary Jones (Rachael Stirling), a financial analyst who seems more committed to her job than to her marriage. Meanwhile, the sheik’s consultant in the project, Harriet Chetwode-Talbot (Emily Blunt), who entices Dr. Jones with a money-is-no-object offer, is in love with a soldier (Tom Mison) who has suddenly been deployed to the Middle East.

Encouraging them in the fishing project is Patricia Maxwell (Kristin Scott Thomas), the Prime Minister’s press secretary, who seizes this “goodwill” story as an opportunity to counteract some bad war-related publicity coming out of the Middle East. (Chillingly, Maxwell uses her access to high-security government search engines to scan through private emails for references to the Middle East. And they pop right up on her screen, including the “private” communication to Jones from the sheik’s representative, requesting a salmon fishery in Yemen. Yikes!) Normally so drab and serious in her roles, Thomas displays an unexpected talent for humor in this film. With her delightfully droll delivery, she effortlessly steals every scene. And that is no easy theft in a film in which every actor is so adept at displaying that bemused, self-effacing kind of British humor that always seems to say, “Oh, did I do something funny?” The film is simply charming, through and through.

One of the sheik's chief concerns when Jones and Co. are ready to transfer the salmon to Yemen is whether farm-bred fish that have never seen a river will run, or whether they will just swim passively in circles. This underscores the film’s theme: do people who have been domesticated to the point of emasculation still have the instinct to know when they have been set free? Alfred Jones proclaims, “It’s in the very core of their being to run. Even if they never have. Even if their parents never did!” He’s talking about himself, of course, although he doesn’t know it. Juxtaposed against this hopeful declaration is his wife Mary’s cutting remark, “It’s in your DNA to return to a dry, dull, pedestrian life.” Where is his true home? And how much effort will it take to find it? That's what the film asks its viewers.

Ultimately this is a film about swimming against the tide. As Dr. Jones deliberates on whether to accept the whimsical challenge of bringing salmon to a desert, we see an overhead shot of him hurrying along a crowded sidewalk within a school of gray-suited businessmen. Suddenly he turns and makes his way through the crowd in the opposite direction, to tell Harriet that he accepts the offer. He is swimming upstream (yes, to spawn!), and we know that he will be caught before the film is over.


Editor's Note: Review of "Salmon Fishing in the Yemen," directed by Lasse Hallström. Lionsgate, 2011, 107 minutes.





Somebody’s Gonna Win

 | 

Our cleaning lady was smiling widely as she polished our dining room table. She was in such a good mood she didn’t drop or break a single goblet or plate on the granite kitchen counter — as was her custom, usually two a visit.

Had she finally found her Prince Charming? I impulsively asked Betty (let’s call her that), “Why so happy?”

“I’m gonna win the Tennessee State Lottery,” she giggled, then uttered the declaration immortalized by losers: “Somebody’s gonna win — might as well be me.” The booty amounted to hundreds of millions, because counter to the mantra, nobody had won for eons. Today they’d pluck the lucky number out of a barrel. And Betty had 25 tickets. How could she lose? They only sold 100,000 or so.

Lotteries? A verification of Bastiat’s contention that the state’s one form of expertise is “plunder.” Whatta racket! Of course, you’ll find better odds at your local racetrack or casino. And to heighten the state’s hypocrisy: gambling is illegal in your own home. The letter of the law says you and your Wednesday night poker club can end up in jail, subsisting on grits and bread crusts.

It wouldn’t grate so badly on my sense of justice if the state solicited gambling gangs in a formal competition in which the main variable was payout vs. ticket income. As an 87% libertarian, I believe people should be allowed financial ruin if they so desire.

But it should not be sponsored by the same state that prohibits gambling. It prohibits theft, too, of course, but its main source of revenue is taxes — a euphemism for theft. Theft is theft, so what if they patch my street with tar every three years? They provide schools, too, but the real value of roads and schools and such services never quite equals the take from taxes. And you can’t sell hooch, either, but the state can.

By the way, I’m quite proud that my state, Alabama, amid flagrant corruption from the gambling lobby, rejected a lottery. It’s Tennessee that caved in. Coincidentally, the purchase of mansions, yachts, barrels of caviar, and Rolexes by state legislators skyrocketed. Supply and demand, you know. I’m waiting for the states to open up a string of escort services. Quite legal, if run by the state.





© Copyright 2016 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.