Explaining the US Debt Crisis — To My Teenagers


Hey kids, pause American Idol for a moment and let me "lay some wisdom on you," as your generation likes to say.

It's about the US debt and what it means to you. If you pretend convincingly that you're fascinated by what I have to say, we'll go out for pizza, OK? Great.

Now, it’s hard to wrap our heads around billions and trillions of dollars, so let’s bring these numbers down to a human level so we can get some perspective. Let's scale it down to the size of a family budget. We'll divide the actual numbers by one billion, and talk about months instead of years.
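(For anyone who wants to check my arithmetic, here is a minimal sketch of the scaling, written in Python. The national figures are simply the family-budget numbers below multiplied back up by one billion; they are round approximations for illustration, not official statistics.)

    # A minimal sketch of the scale-down described above: divide dollars
    # by one billion, and read each "year" as a "month". The national
    # figures are rough, rounded 2011 approximations for illustration only.

    BILLION = 1_000_000_000

    figures = {
        "annual deficit":       1_500_000_000_000,    # ~$1.5 trillion per year
        "national debt":        14_000_000_000_000,   # ~$14 trillion
        "unfunded liabilities": 100_000_000_000_000,  # ~$100 trillion
        "2011 budget deal cut": 30_000_000_000,       # ~$30 billion
    }

    for name, dollars in figures.items():
        print(f"{name}: ${dollars / BILLION:,.0f} in family-budget terms")

    # Prints:
    #   annual deficit: $1,500 in family-budget terms
    #   national debt: $14,000 in family-budget terms
    #   unfunded liabilities: $100,000 in family-budget terms
    #   2011 budget deal cut: $30 in family-budget terms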

So let's pretend that we are a family of criminals [the US government] living in a pretty rough neighborhood [the world].

Our family takes in a lot of money [government revenues] by picking pockets, mugging, and running shakedowns and con games. Heck, we even do some counterfeiting. And we spend it all as soon as we get it [government expenditures]. But every month we need $1,500 more than we take in. So every month I borrow that shortfall [the deficit] from the neighborhood loan shark, and we spend it.

The funny thing is, I already owe him $14,000 so far [the debt]. But he’s been pretty patient, for a convicted homicidal maniac [the Chinese government]. I guess he’s been cleaning up his image.

What’s worse, I’ve been running a pyramid scheme that's gotten a little out of control, and people are beginning to catch on. I've already promised about $100,000 [total unfunded liabilities], spread out over the next few months, to pretty much everyone in our extended family [voters with entitlement programs]. Some are well-off and won't miss their share, but some are desperate and could really use the money I scammed. Either way, they’re all going to sue me when they find out — but too bad, I already blew the money long ago.

Anyhoo, I sat down with Mom to figure a way out of this mess. I [a Republican] bravely proposed spending $100 less this month, but she [a Democrat] screamed, “What about the children!” and said it would be an unacceptable cut in our standard of living. She said we should open more credit cards in your names [borrowing], mug more people [taxation], and try some more counterfeiting [“quantitative easing”].

Finally we agreed that things will work out somehow. We just need to spend $30 less this month [the 2011 budget deal].

So I called up the loan shark and explained that he’ll eventually get all his money back — we crossed our hearts and promised on our kids' lives — if we can just borrow another $1,470 this month, while we figure things out.

I figure we can always just give him counterfeit money, so we're good!

By the way, he said to say "Hi" and that he's really, really looking forward to meeting you kids soon.

So that's what the debt crisis is basically all about. Now, who's up for pizza?

What? Whaddya mean you're not hungry?

French Intellectuals: Telling Stories


There is a French intellectual class. It's better defined — around the figure of the public intellectual — than it is in other countries that I know. French men and women from a handful (that's one handful) of schools dominate public discourse in France to an extent difficult to imagine in this country.

Here is an illustration: One evening, I found my septuagenarian mother transfixed before her television with the pious look on her face she reserved for moments when she thought she was listening to elevating material. She was following an exposition about Claude Lévi-Strauss' “Structuralism.” My mother only had a high-school education (but she did read quite a bit). Of course, no one understands Lévi-Strauss' scholarly work. I am not referring to his young man's slim travelogue, Tristes tropiques. That is a charming and intelligent book. I refer to the impenetrable Le cru et le cuit, for example. And no, dear former colleagues, the translation is not to blame; the original French is rather worse! In fact, Professor Lévi-Strauss himself once famously confessed that there was little chance that anyone who had not taken his seminar would understand any of his writings. It's difficult to imagine an American author combining so much opacity with so much public favor, the latter entirely on the basis of trust. I mean, the trust the general public places in other intellectuals who declared Lévi-Strauss great and essential.

I am not arguing that American scholars and American journalists don't try to become public intellectuals in the French model but that they meet with little success. Different historical conditions, different outcomes. Nevertheless, the French intellectual class is well worth studying because it seems to offer a ready, constant, and tempting model to many in this country, mostly on the liberal end of the spectrum.

In my experience, Americans with higher degrees who are also politically of the Left are almost to a man and woman ardent Francophiles. Of course, I am well aware that my personal encounters may not add up to a representative sample. Or maybe they do, or maybe they are representative enough to allow for a fair degree of generalization. (We worry about samples only because we want to be able to generalize, from “pismire to parliament,” so to speak.) In any case, I suspect that American liberals' Francophilia does not spring from admiration for French cuisine, an admiration that would be amply justified, or from aesthetic appreciation of the beautiful, civilized French countryside, or from their thorough assessment of French culture in general.

The latter explanation is especially unlikely to be correct because American Francophiles — except some language teachers — seldom know the French language well enough to listen to it or to read it with ease. A small remark about the last assertion: American journalists, scholarly authors, and novelists often spice up their narrative with a few words of French, or with a single word. Their batting average in getting both the usage and the spelling right at the same time hovers just above zero, in my experience. In this case, I operate from a reasonably good quantitative basis: I read in English one newspaper a day, one newsweekly, three magazines a month, around fifty books of all kinds a year. About half the books are fiction. I have been doing this for 40 years. Until six years ago, I read a little less fiction, but I perused a medley of scholarly journals every month. And yes, I also know that a large sample is not always representative. See the disclaimer above about samples in general.

So, by a process of reasonable elimination strengthened by numerous concrete examples, I speculate that many left-leaning American intellectuals admire France because that country seems to give pride of place to its intellectuals, and recognition, and honor, and often, considerable financial rewards. They wish America would become more like France, in part so they will have a better crack at becoming public intellectuals themselves, with all the attendant material advantages, or even for the glory alone.

I think the French model is poisonous but ultimately not fatal. More credentials are needed here, since I present a strong opinion unsupported by systematic data: I lived in France until I was 21. French is my native language. I read in French all the time. I have published a couple of things in that language. My proficiency in the language, I am confident, is superior to that of most well-educated people living in France. That is true, although I flunked out of French high-school. (This is a confession, not bragging. It's needed because the issue of sour grapes is sure to surface below.) I follow the French media on the internet and through cable television. (I subscribe to TV 5, the French language international television channel. I watch it nearly every day.) I go to France and to other French-speaking countries often enough to make me confident that I have not lost touch with my culture of origin. More importantly, I spy daily on my French nieces and nephews through Facebook. I have a doctorate in sociology from a good American university. I believe it helps me gaze detachedly at the society in which I grew up. I taught — in disciplines other than French — in American universities for about 30 years. Below, I use several tiny anecdotes and three more substantial true stories to try to show why the French model of the public intellectual is poisonous. Those are just anecdotes and stories, of course; they don't prove anything, of course. I just think they might give pause and feed faculty-club conversations, even if only on a modest scale.

Incidentally, and in spite of the acidic tone of this introduction, I do not despise all French intellectual workers who are also public intellectuals, or who aim to be. I have nothing but admiration for the social philosopher Raymond Aron (1905–1983), for political commentator Jean-François Revel (1924–2006), and for my high-school buddy, the sociologist Jean-Loup Amselle. Those are individuals who have escaped the general curse, for reasons I don't quite grasp. It might just be character. And of course, if Frédéric Bastiat (1801–1850) did not exist, one would have to invent him. The French may just be reinventing him right now, because they have to, after ignoring or reviling him for more than 160 years.

First, what I am not talking about: I do not refer here to the general pokeyness of much contemporary French culture that manages to come through small, haphazardly selected cultural imports into the US. I mean some movies and a very few singers. (Translator's note: “Pokey” would be a 55-year-old divorced accountant who has his ear pierced for a big gold ring although he does not even own a motorcycle.) Pokeyness and the poisonous nature of the French intellectual model are probably related, but the discussion of such a relationship would take me too far onto thin ice.

The general idea I am promoting is that to switch to French is to enter a world of uninformation, of disinformation, and of involuntary poetry. I recall fondly an hour and a half I spent driving through the admirable Loire Valley, intrigued by an animated discussion of the damage being inflicted on Cuban society by the “American blockade” (le blocus). No, the year was not 1962, when the US imposed a partial blockade for a couple of weeks. I was listening to that discussion nearly 40 years later. The correct descriptive term, “embargo,” is the same in French as in English. So, here again, don't blame translation. The discussion took place on a specialized radio channel called France-Culture. Would anyone make this up?

That was the aperitif. By way of appetizers, here are three pearls I collected from TV5’s Le Journal (“The News”) in a span of two weeks (February-March 2011). I admit, they were all gleaned from the same news anchor:

“The Rosenbergs, a couple who were executed because they were Communists. . . .” The speaker was announcing that the soldier accused of leaking military secrets to Wikileaks theoretically risks the death penalty. He added, gratuitously and of his own accord, that Wikileaks’ Julian Assange “knows what to expect if he is extradited to the US.”

Reminder: at the time of this report, no authority, civilian or military, had called for the soldier to face capital punishment. Assange was facing extradition from the UK to Sweden to respond there to accusations of sex crimes. The US has not asked for his extradition here, something that would be easy to expedite with the UK. Whatever penalty, if any, Assange would face in the US, the editor of the New York Times would face the same. There is solid legal opinion that the dissemination of illegally obtained information breaks no federal law.

Only a few days later, the announcement comes on the same Le Journal that the “North American cougar is officially extinct.” (Translator's note: “cougar” is another name for “mountain lion” and “puma.”) The same perverse announcer makes his melancholy comment while standing before an old Western poster.

This time, I am stunned instead of merely annoyed. Mountain lions, cougars, are stealing pets and carrying goats across eight-foot fences five miles from my house. Oversize alley cats are not committing the misdeeds, I am sure. I’ve even written a story in French about the animal's ubiquity in California. (“Les pumas de Bécons-les-Bruyères.” It's on my blog at factsmatter.wordpress.com.) I am so astonished that I take the trouble to follow through online, where I find a big title in an environmentalist magazine (Planète, March 4, 2011):

“Les cougars ont officiellement disparu.”

But the text beneath the title does not say that at all. The real news, three paragraphs down, is about as newsworthy as “Madagascar dodo officially extinct.” The announcement is of the official extinction of the eastern “sub-species” of the cougar that no one had seen for a century anyway. It matters not; there are dozens of readers' comments about how the demise of the mountain lion is just another horrible case of humans raping the earth.

I know, it's just one zealous announcer blindly following one misleading title. But the news anchor is not a nobody, and he is not a kid. He has an important job, one much prized. I was unable to track him down, but I would bet good money that he is a graduate of one of the elite Paris schools. You don't get his kind of position of influence and power without the right credentials.

Finally, here are the main stories, the several main dishes, if you will.

It's 2005. I am living in Morocco for a little while, pretending to gather material for a scholarly book on ethnic identities I never wrote. I go to the central railroad station news kiosk to pick up several Moroccan weeklies in French, because it's convenient. A delivery man dumps a pack of newspapers practically on my feet. I look at them, of course. It's a bundle of Le Monde, fresh from Paris. Le Monde is the French sociological equivalent of the New York Times, except more so. It's resolutely highbrow and unashamedly elitist. Working for Le Monde in any capacity elevates one's social standing. Often, being employed by Le Monde is the only thing that stands between a person and precipitous downward mobility. Le Monde is not exactly the informational equivalent of the NYT, however. It's thin in content, although it's an evening paper. Maybe it's because it defines itself strictly as a political periodical. (No entertaining “Life Style” pages.) For years, it has been run as a kind of “collective” with the unavoidable consequence that its journalists are co-opted from the same small circle. Invited guests play a minimal role, in part because they also come from the same small circle.

By that day, in Morocco, I have already forsworn reading Le Monde for several years. It annoys me without enlightening me, so why bother, I say? But the unexpected appearance of the French paper in my field of vision induces an involuntary reflex to glance at the front page. A big title catches my eye. It's on the revival of the Baghdad theater. The subject is right down my alley. It's about Iraq and it’s about a performance art. I like Arab culture and I am endlessly curious about it. My guard is also down because the French often do that kind of reporting well. My fugitive favorable prejudice is multiplied in an instant by the realization that a French reporter in an Arab country is about five times more likely to be accompanied by an actual Arabic speaker than is an American reporter. (Hundreds of thousands of French people speak Arabic as their native tongue.)

Favor for Le Monde suddenly returns to my addled heart. I pay for the paper and read the first few lines of the article, standing in the train station.

The story is by a woman who seems young. (Don't ask me how I know; I know, and that's another story.) Here is her opening sentence, remembered, I think, word for word, except possibly for the relevant month (it's not important):

“It's early April 2003; Iraqis are dragging armchairs from the National Theater into the street outside, encouraged by an American G.I.”

Stop right there! I think. I was not in Baghdad, but I would bet thousands (again) that there was no such encouragement. The US military was under orders not to interfere with looters. That's what it did, not because it's perfect but because Americans under military discipline don't encourage looting by others. They don't, period. (Some members of the American military may engage in looting themselves when the occasion arises. That's another story entirely; greed differs from vandalism by proxy.) And there was no motivation for any single one of them to proffer such encouragement. There was no “encouragement,” I am sure.

With her opening sentence, the French reporter was accomplishing two things. First, she was setting the scene, as any good narrator does: we are in Baghdad. It's shortly after the US invasion. Disorder reigns. It looks like the Iraqi theater is irreversibly damaged. Second, she was establishing her credentials with her crowd at Le Monde and with the general Le Monde readership: I am one of you. That Americans are barbaric, uncouth bullies is a given. Whatever I am about to tell you about my visit to the Baghdad theater circles under American occupation, please remember that I am as anti-American as the next guy or gal. Nothing can change this basic fact. I am me. “Me” is sophisticated and therefore expects nothing but stupidity and gross behavior from Americans.

You must be thinking what I was thinking at the time, because the generous American habit of fairness is difficult to break: this is just one reporter, probably a young, inexperienced one. She has not had time to think things through. Besides, her anti-Americanism at this point, in Iraq, is no greater than that felt by many Americans, including leftists and quite a few libertarians. Hers is primitive anti-Americanism.

Reel forward 9 years.

I am watching something wonderful on TV5 that has no equivalent in the US media. It's the show “On n'est pas couché” (“We Are Not Asleep”). It's the kind of show I mark on my calendar to make sure I don't miss it, although it lasts for two or even three hours. It's frankly an intellectual show but not uniformly highbrow. And it's not mincing, if you know what I mean. I like its format and I like its content. The format is original: there is a general host who acts as an MC and also as a referee or a judge in an American court. He is an affable man, a comedian of no great intellectual weight but endowed with an excellent sense of à propos. (This expression is spelled right and used right here; take note!) He enjoys a good rapport with the general French television audience because of his fluency in the childish, empty, mindless slang that is the ordinary French language of the mainstream today.

The show is shot before a live audience; in France, it's also broadcast live. It presents authors of newly published books, small and big, low and high, film directors and stage directors, and often, film and stage actors, singers, dancers, some athletes, and pantomime artists. There was even a clown once, though one with a famous circus family name. Sometimes, the show bags a national-level politician for a no-holds-barred interview.

Two reasonably well-groomed professional intellectuals who know how to act on television are at the heart of the show. They preview the films and the stage shows, and above all, they really read the books whose authors will sit on their hot seat. They actually give the books, deserving or not, a thorough reading. The authors, directors, and actors are grilled about their work and their lives, but all in a non-scandalous way. There is no attempt to find out who bedded whom, except 20 years or more in the past, where the answer may have historical interest. The interviewing is cordial for young actresses, but it is fierce grilling for famous people. The jury of two suspects many of them of using their celebrity to palm off mediocre works (memoirs in particular) on a busy and possibly naïve public. The jurors show no indulgence toward the powerful. One recently told a popular television personality that his autobiography was “nulle” (“hopelessly void”). Then he asked the author why he had written the book at all. The author did not cry but I would have, in his place.

It's to the great honor of the show that the same juror was himself called “nul” a week later by a popular soap-opera director whose work he had criticized, perhaps on the wrong grounds. The lady director, a beautiful Muslim woman with striking black hair, and her two main actresses, also attractive Muslim women originating in North Africa, came close to lynching the critic right in the studio. (I know the tension was not staged because I have watched the Jerry Springer trash-show dozens of times, which makes me an expert on rigged fights.) In the end, the argument over a television soap generated so much heat that the irate director called the critic out. What I heard sounded ambiguous precisely because there was so much heat. It was not completely clear in the end whether she was offering to beat him to a pulp outside the studio or if she was threatening to have her way with his body.

This juror who was thus placed under bodily threat is called Eric Zemmour. He is a man I would have for dinner any day, and I would also be glad to drink either coffee or beer with him any time. I might even share my oldest Calvados with him, because I like him so much. He is a journalist and also the author of ten books. He is refined, immensely cultured yet humble, and devastatingly witty. Zemmour often demonstrates a solidly conservative temperament in his comments about other people's work. One of his own best books is about the feminization of society, which he deplores, of course. Another book was the initial public signal in France that multiculturalism had taken stupid and self-destructive forms. Zemmour's is one of the few public voices in the country to undermine the ambient misérabilisme by pointing out that the French welfare net is quite adequate. (“Misérabilisme” is a word that does not exist in English but should, in my estimation. It refers to the emotional and intellectual propensity to feast on the misery, real or imagined, of others in one's society or, often, in the world. It's one of the emotional foundations of leftism.)

Zemmour is quite able to say good things about American movies and about the American cinema in general. It's obvious he has read American authors. No primitive anti-Americanism for him. Yet, very often, and most often when confronted with any manifestation of American power at all, he displays reactions of intense dislike and dismissal. In such circumstances, he seems to become a little stupid. He sounds as if his IQ had dropped 30 points. Maybe “stupid” is too strong a word. He is still an intelligent man at those times, but just intelligent in an ordinary way, not brilliant. I mean by this that his comments do not differ in such moments from what you might hear in any café close enough to the Sorbonne.

Zemmour is Jewish. He and his parents are from Algeria. Had it not been for the massive exercise of American power in North Africa in November 1942, there is a fair chance the Vichy regime would have completed the job already started of turning over its Jews to the Nazis for disposal. Later, his family moved to France. This man is also old enough to know that, absent American power, the Soviet Union would have eaten Western Europe alive, as it did Eastern Europe. It's impossible to believe that he does not know that the US threatened the Soviet Union into staying put and that the threat was credible thanks to America's obvious and abundantly displayed power. And there is no doubt that he knows and recognizes that there was no critical show like his under any Communist regime. And he knows full well that to the extent that there were public intellectuals in Communist-ruled countries, their place was most often in prison or in the gulag. I am sure he has signed several letters in support of jailed Cuban intellectuals. I am guessing, only guessing, that one could even make Zemmour admit that American might is the last rampart against the violent jihadism he openly fears.

So, here we have it: this intelligent, cultured man, endowed with a superior critical sense, this man who daily demonstrates his independence of mind in his newspaper columns, often in his books, and weekly on television, cannot put two and two together: American power created and continues to create a sphere of freedom where the likes of him can perform their good work. There might be an alternative to American power, but there is not. There has not been one ever since he was born.

Zemmour appears to suffer from self-inflicted blindness, a kind of hysterical reaction to his macro-environment. The French intellectual context is so mindlessly leftist and anti-US that there are few people with enough resources of mind and character to confront it head on. We are social beings. When enveloped in a warm blanket of homogeneity, most of us will succumb, unless we are trained not to. We all need rocky seas, every so often, just to calibrate our compasses.

Some French people are well enough aware of the political dimension of intellectual homogeneity that there is a French expression for it: “la pensée unique.” There are exceptions to the rule of French intellectuals yielding to the attraction of homogeneity, but they don't last. The philosopher Bernard-Henri Lévy, originally a little prince of the French Left, broke publicly with leftism and with the anti-Americanism of his milieu in the 80s. Then he became a publicity hound and a bad writer. I recommend warmly that you don't buy his 2005 tour of the US, In the Footsteps of De Tocqueville, where he grandiosely claimed to follow in De Tocqueville's intellectual footsteps, but definitely did not. French intellectuals who throw off the warm blanket of homogeneity just seem to get lost.

But here is my third and last story. It's so outlandish that I wouldn’t dare tell it if I were not reasonably sure there were archives that could back it up if necessary.

I am guilty of many sins but failing to make myself comfortable is not one of them. Everywhere I am, I nearly always manage to begin the day in the same way — a cup of coffee in one hand and a newspaper in the other. This comforting habit sometimes has to be reduced all the way down to instant coffee and reading up on the exploits of a local high-school basketball team in a language I can barely decipher, but the form, the ritual, at least, is respected.

So, it's late June 1998. I am in Brittany to write something on the largely phony origins of Breton Celticism. (No merit there; I am just following in the footsteps of the English historian Trevor-Roper in his iconoclastic The Invention of Scotland.) I am beginning the day sitting outside a good café in a pleasant coastal town. There is a cool breeze from the sea. A grand crème warms my hand. (There is no such thing as café-au-lait in French cafés. It's another case of phony Frenchism, among many others.) This is before I rejected Le Monde as a morning sacrament. So, I am unfolding yesterday evening's issue.

A front page item immediately claims my attention. It's a long report on India's nuclear policy by Le Monde's “envoyé spécial” in New Delhi.

It's been a little over a month since India detonated nuclear devices with obvious weaponization potential. I start reading with unwavering attention. After about two or three minutes, a small warning bell starts tinkling in my brain. Something is wrong in what I am reading. Now, I don't know about other people's warning bells, but mine is not sophisticated. It does not have special timbres for “lies,” “inaccuracies,” or “involuntary misinformation.” It just says, “Something is wrong here.” So, I go back to the head of the article and I realize with stupor that I have been reading all along that India's national policy regarding nuclear weapons had been until then one of “nuclear disappointment.”

The words did not describe a collective feeling among Indians or anything close to such an emotion, as you might be tempted to guess. The envoyé spécial clearly coupled the word “déception” to the word for “nuclear” in connection with an official policy, a series of connected statements designed to guide state action. The juxtaposition of those two words makes absolutely no sense, of course. A brief explanation is in order (and, by the way, this was one of those times when it is useful to know English in order to understand French).

The French word déception means “disappointment.” The distinguished French journalist sent especially to India at significant expense to investigate momentous events went on and on for thousands of words developing the thesis that India's policy regarding the touchy topic of nuclear weapons was guided by disappointment! There was no trace in the article of the fact that the French correspondent had almost certainly been briefed by Indian colleagues. I can guess what they had told him, in English — for years, India's informational strategy regarding nuclear weapons was one of “deception”: “We have them; we don't have them; we may have more than you think; keep wondering.” Finally, with its public explosions, India had come out of the era of nuclear deception.

I bought Le Monde faithfully for three weeks afterwards and examined it carefully, expecting an author's self-correction or a Letter to the Editor denouncing the nonsense. Nothing, absolutely nothing!

The least interesting observation about this informational debacle is that the correspondent did not know English, although he was judged competent enough by his editors to be put in charge of an important investigation that would have to take place in English. Periodically, I find the same kind of impairment of perceptiveness about language proficiency in American newspapers. In fact, I could name names if I were as mean as I sometimes hope I am. There is a permanent Paris correspondent for several respected American periodicals who, I am certain, does not know much more than high-school French. I mean, enough to shop and to travel by the Metro but definitely not enough to read Le Monde, to take an example at random. I think she often sounds credible merely because she has developed good French sources. Occasionally, not often, her analyses appear both larded with clichés and way off the mark, as if she had tried to go it alone for once.

The main implication of the Indian story is that a highbrow French intellectual from Le Monde spouted utter nonsense for two long pages and never caught on to the evidence that he was spouting nonsense. Worse, in the vast, distinguished readership of Le Monde, there was no one to notice, or to be motivated enough to ask for a correction — or else the management of the paper received corrective communications and decided to bury them, because they knew they could get away with it.

Here is the inescapable conclusion, it seems to me: the French intellectual class, and those who follow it, have been listening to abstruse, absurd logic for so long that they assume there are parallel philosophical universes. In those parallel universes, words can be linked together any which way, logic is variable, upside may be downside, and black is as likely to be green, or even white, as black. And above all, clarity of expression is vulgar.

This can only happen when a group prizes collective peace above all, when dissent about ideas is so severely punished that otherwise intelligent people practice unconscious self-censorship. I refer to unconscious self-censorship because France is undeniably a democratic country with no government censorship of any kind. In fact, French intellectuals sometimes point with glee to government censorship in this country. (They refer to local censorship in matters of sexual display, guided by the vague-sounding doctrine of “community standards.”)

The logical consequence of the French state of affairs for envious American intellectuals seems to me inescapable. France is a fairly well-run society where people don't normally die in the street. It maintains high standards of civility most of the time, in spite of the fact that a normal weekend sees several hundred cars burned by vandals. The French quality of life is high. In many respects it compares favorably to the American quality of life. And the demonstrated superior longevity of Frenchmen may just be due to their ancestral habit of drinking a little wine with each meal. The French system is fine but it's not sustainable, as we, I, used to say. After the 2008 US-originated financial, then economic, meltdown, it's more difficult to maintain the same view without grimacing, at least inwardly.

Similarly, there is much to object to in French foreign policy, but it's no more hypocritical, mendacious, supine, or irrational than the policies of its neighbors. Terrorists never won French elections and they never redirected national foreign policy as al Qaeda did in Spain in 2004. The French army is alongside US forces in Afghanistan, where it's more likely to be doing actual fighting than its German counterparts, for example. France is not as reliable an ally as the United Kingdom or Australia, but it acquits itself better than many others.

So, we see in France a reasonably good society, and a society that does not act especially erratically or shamefully on the international stage but whose cherished intellectuals are often blind and sometimes collectively insane, as my stories show. The inescapable conclusion: The French example proves that intellectuals don't matter much as far as quality of life, or much of anything else, is concerned.

Of course, there are some downsides to this blindness and insanity although they don't add up to poison. One is the propensity to invent madcap intellectual adventures under a serious mien, such as “déconstructionisme.” Another is to produce with huge government support movies that no one understands and in which the action is too slow to be worth following. Yet another consequence is awarding the highest decoration in the land to Jerry Lewis. But it all amounts to nothing tragic.

P.S. There is little in this essay to dissuade anyone, even intellectuals, from spending enjoyable time in France. You just need to be aware that if you are a rational person, the more French you know, the less restful, the less enjoyable your stay will be. My travel advice: See the sights; read guidebooks in English; don't listen to the radio; don't watch television; and, for Pete's sake, don't talk to anyone who is not working in a restaurant.

War and Peace


I’ve said it before, and I’ll say it again: good movies begin with good stories. By that criterion, Incendies is not just a good movie, it is a great movie. Set against a backdrop of bitter hatred and torturous war, it is nevertheless a brilliant film about love for family and finding a personal peace.

The story begins with the classic Romeo and Juliet conflict: Nawal Marwan (Lubna Azabal), a young Christian Arab woman, is in love with a young Palestinian man, and her family disapproves. What happens next — retribution, abandonment, shunning, and revenge — sets the stage for an alternate story line, 35 years in the future, after the woman has died. In her will she asks her young adult children, Jeanne (Mélissa Désormeaux-Poulin) and Simon (Maxim Gaudette), fraternal twins, to find the brother they did not know existed and the father they thought was dead. This will require them to leave their home in Canada and return to the land of their ancestors in the Middle East.

As Jeanne heads to Lebanon to begin the search for her father in her mother’s hometown, the film flashes back to the young Nawal and her lover, Wahad. The film continues to switch between the two stories as the brother and sister follow the cold dark trail of the mother they only thought they knew. These alternating points of view allow the audience to know Nawal’s story more intimately and completely than the young siblings do, enhancing our compassion for the protagonist and our growing sense of horror as the two slowly discover the truth.

As war breaks out, young Nawal tries to escape the fighting while searching orphanages for the son her grandmother forced her to give up. Along the way she observes the bitterness and retaliation of both religion-based factions. Two scenes stand out as representative of the senselessness and atrocity of this kind of conflict. In the first, Nawal quickly removes the cross from around her neck and rearranges her scarf to cover her hair, so she can avoid the wrath of Muslims. In the next scene, she quickly doffs the scarf and pulls out her cross to show rebel guerillas that she is a Christian. But she is still the same person, inside and out; only the label has changed. Changing the label saves her life — but the death and destruction she observes destroy her soul.

Incendies is a thrilling mystery about a family’s quest to reunite itself. But it also has a powerful symbolic message, revealing the bitterness that comes from assigning divisive political and religious labels. What does it mean to be a Christian, a Muslim, or a Jew? Beneath the labels, all in the Middle East claim the same ancestry. Arabs (Christian or Muslim) may hate Jews because Ishmael is their ancestor; Jews may hate Arabs because Isaac is their ancestor. But trace their roots back just one more generation, and all honor Abraham as their father. All are cousins under the labels. All are of the same lineage and family.

Incendies is the most engrossing film I have seen since last year's The Secret in Their Eyes (also a foreign film). Yes, you will have to read the subtitles at the bottom of the screen — unless you speak French, Arabic, and another dialect I didn’t recognize. But it will be well worth the effort. Don’t miss this outstanding film if it comes to your town.

Editor's Note: Review of "Incendies," directed by Denis Villeneuve. Sony Pictures Classics, 2010, 130 minutes.

Taxing the Ether


Here’s the instinctive mindset of the Democratic Party: “If it moves, tax it. If it doesn’t, tax it even more.” If you need proof, consider the frantic attempts by desperate Democrat governors in high-tax states to tax commerce conducted on the internet.

One story about this comes out of California, notoriously one of the most economically ignorant and fiscally incontinent states in the nation. It appears in a Los Angeles Times editorial lauding the efforts of Democrats in the state legislature to try to apply California’s outrageously high sales taxes — nearly 11%, counting state and localities together — to purchases on the internet, targeting especially the dominant internet retail giant Amazon.com. The LAT (always an affirming voice for redistributionist tax-and-spend government) argues that the state is “owed” millions in tax dollars for sales over the net. The paper, natch, supports a bill by Berkeley Dem Nancy Skinner to require internet retailers to collect sales taxes.

The LAT views this as fair — what is the difference, it asks, between buying your shoes at the local store and purchasing them at a store based in Nevada? And, the rag pompously avers, this is the law.

It cites the 1992 Supreme Court ruling (Quill Corp. v. North Dakota) that held that out-of-state mail-order companies (and presumably, by inference, internet retailers) with no physical presence (i.e., no actual stores or warehouses) in a state could not be compelled to collect sales taxes from customers in that state — although the court allowed states to try to collect taxes from such customers directly. So this is the law.

According to the LA Times, people who buy over the internet are both legally and morally (morally?) obligated to pay sales taxes on their purchases. It argues that Amazon and other online stores deliberately encourage consumers to evade their legal and moral obligation by failing to inform them of that obligation on their websites. Not only must the internet help to suck in taxes; it must also lecture people about their ethics.

The LAT not only endorses legislation that would require any internet company to collect sales taxes from purchases by Californian customers if that company has any affiliates (suppliers) in the state; it also recommends a national bill that would explicitly require all internet companies to collect sales taxes on behalf of all states that want their citizens’ purchases taxed — and which of them wouldn’t? The LAT concedes that so long as the Republicans have a check in Congress, such a bill won’t ever be passed, but the grand vision is of every vendor of five-dollar trinkets becoming an IRS agency, assiduously divvying up its surplus value in accordance with the 50 tax codes of the 50 states, plus Puerto Rico, Guam, and the District of Columbia.

At the time the LAT piece was published, rumors were circulating out of Sacramento that the state Board of Equalization — the agency responsible for collecting California state taxes — would be hiring computer geeks to find out ways of looking at internet traffic to discover which criminal Californians are daring to buy on the web. This, needless to say, caused considerable consternation — not to mention considerable concern about the morals of internet aficionados who would thus be involved in killing the internet.

But the LAT’s case is patently defective. Why the devil should a business like Amazon, which uses none of California’s police or fire services (since it has no bricks-and-mortar locations in the state), much less its educational enterprises, have to pay the state a nickel? And why should Amazon customers within the state have to pay any more than they do right now? They already support the schools with their property taxes. Their sales taxes, collected at the stores that actually exist within the state, support the police, the fire department, and the other agencies that protect those stores. Where does one’s moral and legal obligation stop?

And the consequences from trying to tax the internet are likely to be counterproductive to the states that do it, as a piece in the Wall Street Journal reports. The WSJ — which understands economics approximately a thousand times better than the LAT understands it — points out the obvious: if a state (like California) tries to saddle (say) Amazon with collecting sales taxes for that state because Amazon has affiliates within it, then Amazon will just drop those affiliates.

Indeed, as the WSJ piece recounts, this is just what happened recently in Illinois (a state in even worse fiscal shape than California, if that be possible). The tax-happy Democratic Governor Pat Quinn signed a law applying the state sales tax to internet purchases in Illinois, and it took Amazon only a few hours to announce that it was immediately halting purchases from and affiliation with the 9,000 small Illinois businesses with which it had been doing business — business profitable for Illinois as well as for Amazon.

So, in an effort to grab more taxes — as opposed to cutting spending — Quinn cost his state jobs. Either a discontinued affiliate will stay in Illinois and see its sales plummet (which will then necessitate cutting its workforce), or it will — as some are already doing — move to an adjacent state (such as Indiana) that manifests less tax madness.

Rhode Island, which, like Illinois and a few other states (Colorado, New York, and North Carolina), had earlier passed an “Amazon tax bill,” has collected only peanuts in extra sales tax revenues. A study by the Tax Foundation shows that when you factor in the lost jobs from affiliates cutting back, closing down, or moving away, the state probably lost revenue.

The LAT editorial suggested that to prevent internet companies from dumping affiliates in a state that imposes an Amazon tax, what we need is a federal law forcing all internet companies, wherever located, to collect taxes from all customers, wherever located, and remit those funds to the customers’ respective states.

That insipid argument is based on the absurd premise that if we pass a national Amazon tax, Amazon couldn’t drop all of its national affiliates. But it sure as hell could, and just move its central operations to (say) Mexico and all its affiliations to businesses in other countries. That would be yet another example of government greed, triumphant.

Poverty and Crime


If you’re like me, you have been instructed, from your youth on up, that crime is caused by “conditions” — meaning economic conditions, meaning poverty. You’ve also been told that crime can be reduced, or even eliminated, by the abolition of poverty — which, of course, can come only by means of massive government action.

These were some of the ruling theses of President Lyndon Johnson’s War on Poverty, which continues in its thousand institutional forms unto this day, more than four decades after he declared it.

The theses were always vulnerable, on their face. What do you mean by “crime”? Do you mean a Hollywood producer’s rape of a female staffer, who is too afraid to report the crime? Do you mean a party in West LA, where rich people snort a barrel of coke, and are never caught? Or do you mean a guy who’s pushing weed in Fresno, or a girl who’s arrested for prostitution on the streets of Grand Rapids? Do you mean crime that’s successfully prosecuted, or do you include crime that never gets recorded?

And the theses have long been vulnerable to the evidence. Johnson’s War on Poverty immediately preceded an enormous wave of crime. The hundreds of billions of dollars that American communities spend on welfare have not demonstrably reduced the incidence of crime, however you want to define that term. Most serious analysts believe those dollars have increased it, by fostering a culture in which principles of individual responsibility are no longer considered necessary.

But wait a minute: what is “poverty,” anyhow? What’s the standard? What’s the definition? Was “poverty” the welfareless condition of virtually all Americans in the 1920s? If so, was it the lack of government welfare that induced millions of people to violate the Prohibition laws, and some thousands of them to kill and maim their fellow-citizens in pursuit of profits from that violation? In a larger sense: isn’t poverty relative? The poor of the 1950s were much richer in absolute terms than the poor of the 1920s, yet fewer people were sent to prison in the 1950s. The poor of the 1980s and 1990s were richer still; yet a much larger proportion of the populace went up the river in the 1980s and 1990s than in the 1950s.

Now comes the following announcement from a website in my town (voiceofsandiego.org), about the FBI’s new report on crime in America during 2010, the year of a great depression, especially here in far southern California:

“In San Diego, the number of violent crimes — murder, rape, robbery and aggravated assault — dropped 5.3% from the previous year and the number of property crimes — burglary, theft and vehicle theft — dropped 4.6%. (Nationwide, violent crime dropped 5.5% and property crimes were down 2.8%.)”

The author adds a reference to the prevailing wisdom:

“Nationwide crime declines in recent years have continued to puzzle criminologists, who expected worsening economic conditions to lead to more crime.”

Experts might be puzzled, but no one with any sense, or historical perspective, would suffer their fate. Officially recorded crime went down during the 1930s — the time of the Great Depression. Why shouldn’t it go down in 2010?

We can’t quantify the sources of crime, but we should know this: crime, and the definition of crime, has less to do with “economic conditions” than with community mores, individual opportunities, and (in a reverse sense) government action. When the government declares alcohol or drugs to be illegal, “crime” automatically results. But when individuals and communities hunker down in order to get through a period of relative poverty, crime may well diminish. Only God knows the exact linkages, but it’s not puzzling that people whose families are making less money than before may respond by doing something legitimately and dependably profitable rather than something criminal.

Indeed, as William Blake commented two centuries ago, the idea that poverty causes crime is a slander on poor people. It ought to be resisted by every person of generous mind, and especially by all of us — and there are many, many of us — who have struggled to come out on the other side of bad “economic conditions.”

Keeping an Eye on the Iceberg


I have often reflected on the looming fiscal disasters that are Social Security and Medicare. As several recent articles suggest, our economy is still heading toward the financial iceberg.

The first report comes from the trustees of these two Ponzi programs. It shows that the financial outlook of both has worsened dramatically over just the last year. It confesses that the so-called Medicare trust fund will be empty by 2024, five years earlier than predicted by the trustees just a year ago. Assuming current revenues, and no unusual increases in the costs of medical care (a dubious assumption indeed), Medicare will only be able to pay 90% of its promised benefits after its trust fund is depleted.

Social Security is also in deep trouble. Last year, it started running a deficit between taxes paid in and benefits paid out. The trustees’ report indicates that from here on out, this deficit will be permanent, and will rapidly increase in magnitude. As another report notes, Social Security will exhaust its “trust fund” by 2036 — one year earlier than last year’s estimate — at which point it will have to slash its benefits by 23%.

This second report confirms a prior reflection of mine that the separate trust fund for the Social Security Disability Insurance program will actually be gone by 2018, if not sooner.

Yet another recent piece sheds light on why SSDI is going off the cliff so quickly: at least some of the 1,500 judges who administer the program just hand out “disability” awards without any real scrutiny of the merits of the claims being made.

This fascinating story centers on Judge David B. Daugherty, who routinely rubber-stamps all requests for disability that come before him. Last year, of the 1,284 cases he saw, 1,280 were greeted with benefits. This year, he is being even more generous with our money, awarding benefits in all 729 cases received. Judge Daugherty has often listened to 20 cases a day, spaced 15 minutes apart, many of them from one local lawyer to whom he appears to be particularly sympathetic. His near-100% award record contrasts with the average of 60% for the system as a whole.

Daugherty is just a particularly egregious example of an entitlement system run amok. Many of the SSDI judges award taxpayer money to claimants without any hearing at all, or merely by looking at the medical evidence submitted by claimants’ attorneys. The SSDI paid out an amazing $124 billion in benefits last year — and the sum will only balloon, making it the first of the entitlement programs to go bust (in a technical sense: naturally the taxpayers will be forced to pick up the tab).

Of course, the point needs to be re-emphasized that all these so-called “trust funds” are just government IOUs issued to cover the surpluses the programs ran while the Baby Boomers were at their peak earnings. The surplus funds were spent running the government. The “trust funds” are thus the moral equivalent of the “special purpose entities” set up by the crooks at Enron to hide its debt. Their function is to push debt onto entities that are apparently separate from the main organization, which actually owes it. Alas, since Congress exempted itself from the provisions of the Sarbanes-Oxley Act, the feds can get away with this fraud (a point I have explored elsewhere).

The accelerating deficits have moved no less a luminary than Tim Geithner, our incorruptible Treasury Secretary and head of the Social Security and Medicare trustees, to tell us that the report makes clear that we need to act “sooner rather than later.” Yeah, but Geithner didn’t indicate how his call for action squares with the views of his boss Obama, who has steadfastly refused to state where he would cut entitlement programs — even as he created a huge new one in the healthcare field.

The few brave souls who have called for reforming the entitlement programs — most heroically Republican Congressman Paul Ryan — have gotten scant public or media support. (Even Newt Gingrich has attacked Ryan’s call for reform, calling it “right-wing social engineering.”)

The estimable economist Veronique de Rugy, commenting on the recent trustees’ report on the Medicare-Social Security mess, makes the point that these Ponzi schemes transfer money from the young to the elderly population. What she doesn’t note is that the reason they are still so generally supported is that the elderly vote religiously, while younger people vote only sporadically — and children, who must in the end pay for all these deficits one way or another, have no vote whatsoever.

Doomsday Update


May 21 has come and gone, and so far as I can tell, Judgment Day has not occurred. Whether that’s good or bad is a debatable question.

What is not debatable is the discrediting of one of the world’s best publicized prophecies, and one of the few prophecies specific enough to be fully disconfirmable. I refer, of course, to the Family Radio network’s prediction that the Rapture and the beginning of Judgment would occur on May 21, 2011.

Harold Camping, Family Radio’s “Bible teacher,” went down with his flag nailed to the mast. Throughout last week, he told callers on his daily call-in program that he wouldn’t even consider questions about the possibility that the Rapture wouldn’t happen on the 21st. On May 17, for instance, he remarked, “If you were talking to me three or four years ago, I would have said, well, there’s a high likelihood [of climactic events in 2011]. . . . But beginning about three years ago, God has shown us proof after proof and given us sign after sign. . . . I know absolutely, without any shadow of a doubt whatsoever, that it is going to happen on May 21. . . . It is absolutely going to be May 21. The Bible guarantees it, without any question. So we cannot countenance any other idea. It is absolutely going to happen.”

Liberty provided one of the first nationally published heads-ups and explanations about this matter in its December issue, in the article entitled "An Experiment in Apocalypse." Early this month I continued the discussion. The topic has proven surprisingly interesting to our readers, as it has to millions of other people, worldwide. I am receiving a lot of requests for updates, and I will provide them.

The really interesting thing, of course, isn’t the fact that Family Radio has been proven wrong. The interesting thing is seeing how individuals and institutions respond to the disconfirmation of ideas that they regarded as fully justified by reason and authority. Full evidence about Family Radio’s response will take a while to come in. Its offices were closed over the weekend (starting on Friday, May 20), and all or almost all programming from then till now has been prerecorded. (I write in the early evening of May 22.) The swarm of media attention simply washed over the recumbent form of Family Radio, occasionally sweeping out one or another follower who discussed his disappointment in vague, colorless terms.

But we can expect to learn more, and I have already learned some things. One interesting thing to me is the fact that throughout May 20 and 21 — even as late as 6:30 p.m. on the latter day — the station was still broadcasting invitations to call up and order “Judgment Day, May 21” pamphlets and bumper stickers — thus making itself even more ridiculous than it would have become, had it simply ceased all ads for mail-order material several days before.

Nevertheless, even the most ridiculous things in life happen because somebody decides to make them happen. Somebody — and a number of people would have to be involved — decided to keep running those ads. Somebody scheduled them. Somebody provided them to local stations. Somebody at the local stations ran them. In only one instance (at 1:25 a.m., PST, on the purported day of Rapture) did I hear evidence that an ad might have been spiked by the national network or my local station, with four minutes of music substituted. As I write, Family Radio’s website still declares in bold letters: “Judgment Day, May 21, 2011: The Bible Guarantees It!”, and its clock says there are “00 Days Left.” Yet someone at Family Radio switched Camping’s prerecorded lecture, which runs on Saturday evening and Sunday afternoon, from one of his constant Judgment Day diatribes to a discussion of divorce (he’s against it) that was broadcast “25 or 26 years ago,” according to the prerecorded announcement — which also says that copies of the divorce lecture will be “available this week” if you call or write for them. Divorce is exactly what would interest you, on Judgment Day or immediately afterward, right? It should be noted that on May 10, a woman called in to ask Camping about that very topic, divorce, and he told her that “it’s all academic,” because the world was ending and her husband wouldn’t have time to divorce her anyway.

Absurdity upon absurdity. But what do these absurd contradictions mean?

They may show the depth of institutional inertia, even within a relatively small, voluntary organization, an organization, mind you, that is operated by zealots, not by the pension-pursuers at the DMV. The thinking may have been, "We'll just keep running whatever we've been running, whether it makes sense or not. That's what we do" — even if it made their own cause ridiculous. If Family Radio can achieve inertia like this, imagine what a government can do, in the face of all the evidence against its theories and programs.

The contradictions may, however, indicate something exactly opposite to inertia, but equally significant. They may indicate that dissenters within Family Radio, of whose existence there has already been a good deal of evidence, decided to assist the organization in rendering itself absurd, thus making the ousting of its current leadership more likely. These people could have halted the post-Doomsday ads for Doomsday literature; they could have snaked out some lecture that wasn’t about (of all things) divorce. But they used their individual initiative to do something more complicated.

That’s a guess. But here’s the idea, in brief: what happens within organizations and individuals is a contest between inertia and initiative, each with its own set of rewards: security, stability, and conservation of energy on the one hand; new opportunities (for power, for revenge, for simple rightness) on the other. If enough data emerge, the next stage of Family Radio’s existence will constitute a fascinating experiment in conflict, institutional and individual.

I will keep on this beat. My own prediction is that Mr. Camping will be ousted from leadership during the coming week by irresistible forces of change in the organization he founded. But this prediction is disconfirmable. Stay tuned.


The Audacity of a 1% Hope


Federal Judge Roger Vinson of the Northern District of Florida has declared Obamacare unconstitutional in a lawsuit joined by numerous states seeking to escape from the onerous burdens that the law places upon them. But when I read that his rationale is that the mandate requiring people to buy health insurance exceeds the power of Congress under the commerce clause, I became a little bit sad — sad not because Judge Vinson is just dangling the dream of a new attempt to use commerce clause jurisprudence to limit the government’s meddling in the economy, but because there is only about a one in 100 chance that the United States Supreme Court will agree with his reasoning.

There was a time when Congress could pass only laws authorized by the specific enumerated powers of the Constitution. (This is ostensibly still true, but most congressmen don't take it seriously.) One of those powers, indicated by the commerce clause, is the major one that Congress uses to destroy — I mean, to regulate — the economy. The commerce clause states: "Congress shall have power… to regulate commerce with foreign nations, and among the several states, and with the Indian tribes." On my constitutional law final exam two years ago, I argued that the commerce clause gives Congress precisely the same power to regulate economic activity within a state that it has to regulate economic activity within foreign nations. Since we obviously cannot march into France and order the French to adopt a minimum wage or a health insurance mandate (we can really only regulate French goods that cross the Atlantic), the clause must mean that Congress can regulate only goods that physically cross state lines. I got an A in the class.

However, most lawyers don’t think that way. Even back in the 1800s the commerce clause case law said that Congress could regulate anything that did not take place entirely within one state. There was a time before the New Deal when the Supreme Court still took the commerce clause seriously and tried its hardest to allow Congress only to regulate interstate commerce. But this interfered with the New Deal. So, after FDR threatened the court with his infamous court-packing scheme, the Court made a “switch in time that saved nine,” in the landmark case of NLRB v. Jones & Laughlin Steel Corp. (1937); it began to undermine the commerce clause by permitting the commerce power to extend wherever commerce within one state had effects across state lines.

Later, US v. Darby (1941) made it clear that the commerce clause was now a joke and Congress could do anything it wanted. To throw a little paint on the feces, the Court decided Wickard v. Filburn (1942), which said that a private individual's private behavior within one state could be "interstate commerce" under an aggregation theory that posed the questions: "What if everybody did this? Would the aggregate cumulative effects have an impact on interstate commerce?" This meant that a person growing tomatoes on his own farm for his personal consumption could be engaged in interstate commerce, even though the tomatoes never left his farm, because he was somehow magically connected to all the other tomato farmers out there — a conclusion that is ridiculous only if one does not understand that it is a mere pretext for socialism. FDR's supporters argued that economics had somehow fundamentally changed since America's founding, and the law needed to change with it.

Commerce clause jurisprudence remained buried for decades, but in a shallow grave. In the 1990s, Chief Justice Rehnquist, with help from Justice Thomas and the Court's other conservatives, tried to revive the distinction between economic and noneconomic, and national and local, in cases such as US v. Lopez (1995), which struck down a federal school-zone gun ban as exceeding the commerce power, and US v. Morrison (2000), which struck down a law against gender-motivated violence as having nothing to do with interstate commerce. Then Justice Scalia murdered Rehnquist's commerce clause revival in Gonzales v. Raich (2005), saying that the commerce power authorized the criminalization of medical marijuana because of the necessary and proper clause: "Congress shall have power . . . to make all laws which shall be necessary and proper for carrying into execution the foregoing powers." For some reason, Scalia failed to understand that the necessary and proper clause is irrelevant if Congress cannot pass a law under the commerce clause in the first place.

The fate of America’s healthcare system now turns on how the Supreme Court will interpret the commerce clause, whether the justices will hold that the aggregate effects of choosing not to buy health insurance constitute interstate commerce. I predict that the vote will go along political ideological lines, as the choice of whether to take the commerce clause seriously as a limit on Congress’s power is as much a political as a legal decision. But there is no way to tell how Justice Kennedy will vote, and it is still too early to tell how Obama’s appointees will vote. Still, I estimate that there is only a 1% chance that Obamacare will be struck down. It has been many decades since the commerce clause limitation had teeth; and the Court will be afraid of accusations that it is frustrating the will of the American voters if it strikes Obamacare down, even though it is the role of the judiciary to check the tyranny of the majority, and most Americans don’t like Obamacare, anyway.

Nonetheless, if we libertarians are to make ourselves known in the legal world, then the commerce clause is one of the main weapons that will need to be in our arsenal. We can hope that one day our constitutional jurisprudence will return America to the Founding Fathers' vision of a federal government that cannot do whatever it wants. As my constitutional law professor was fond of saying, constitutional doctrines fade away, but they can always come back; there are libertarian legal doctrines in American history that we can revive if and when we develop the power to make our voices heard within the legal system.


Alan Bock, R.I.P.


Alan Bock, contributing editor of this journal, died on May 18 at his home in Lake Elsinore, California, after a heroic fight against cancer. He was 67.

A fine account of his life has been written by Greg Hardesty, his colleague at the Orange County Register, where Alan worked for over 30 years as an editorial writer and columnist. A picture comes with Hardesty's article, and I think it says something about why so many people liked Alan. We at Liberty remember him as an engaging, jovial man — good company — and a writer whose contributions we always looked forward to getting.

If you'd like a sample of Alan's work, pull up our July 1999 issue in the Liberty Archive. On page 23, you'll find Alan's article, "Gateway to Oppression." Our Contents page for that issue characterizes the article in this way: "Alan Bock examines the latest scientific study of marijuana, and wonders: if marijuana kills, why hasn't anyone ever died of it?" Some of Alan's wit comes through in that blurb, but when you read the article, you'll see many other things about him: his steel-trap logic, his mastery of fact, his sympathy for the oppressed, his noble indignation against oppression. By the time he finishes, he's made the definitive analysis of his subject — and the piece is only about 800 words long.

One of the most lovable figures in the American folk imagination is the iconoclastic newspaperman — learned yet colloquial, genial yet incorruptibly just, a man who never gives up on truth and liberty. Alan demonstrated that this figure is not merely a product of the imagination. He was that figure. Everyone who knew him is saddened by his passing; everyone who learned from him remains inspired by his ideals.


Awesome, Dudes!


Recently, I drove down the Southern California coast to attend a philosophy conference in San Diego. All along the way, and when I later sat in my hotel room overlooking the harbor, I kept thinking: how can a state as beautiful and as blessed as this one be so screwed up?

But an article I read just the other day helps me see, once again, the problem with California. The problem is its deeply demented government.

The story reports that Newport Beach, a wealthy city south of LA, is paying all but one of its full-time lifeguards over $100,000 a year in salary and benefits. $100K a year! And over half of them earn over $100,000 a year in salary alone! In fact, the two highest-earning lifeguards earn well over $200,000!

Additionally, these lifeguards retire after 30 years (or as young as age 50) with a pension of 90% of their highest salary.

Now, granted, Orange County, where Newport Beach is located, is one of the wealthiest counties in the US. But even in Orange County, the median household income ($71,735) is far lower than what these dudes and dudettes earn.

This of course raises a fascinating question: if the freaking lifeguards are earning this kind of money, what is the city paying its other workers? I shudder to think what the city is paying its police and firefighters. And can you imagine what the city officials must be getting?

Here, it just gets crazier, day by day.


Strauss-Kahn, Exemplar of Socialism


The libertarian critique of socialism, or “social democracy,” has usually gone something like this:

The socialist program demands a planned economy. A planned economy can result only from plans. Plans must be made by a group of experts who are not subject to the vagaries of the electoral process. To form and implement their plans, the planner-kings must know everything crucial to the economy. They must know everything significant to their own plans, and be able to predict everything significant that may result from them.

But that is impossible.

This being true, the people who become planners will be those who are either stupid enough to believe that Plans can succeed or cynical enough to care only about the personal power that can be acquired by Planning.

The libertarian critique has a logic that no socialist program ever possessed.

Now we witness the reductio ad absurdum of the socialist idea: Dominique Strauss-Kahn, head of the International Monetary Fund, chief honcho of the French Socialist Party, and prospective president of France, who was arrested Saturday on charges of trying to force a maid in his $3,000-a-night hotel suite to have sex with him.

Suppose that the charges turn out not to be true. Suppose that Strauss-Kahn's nickname, "the great seducer," means nothing. Suppose that consensual sex is nobody's business but one's own. Suppose all these things — the last of which is certainly true. The $3,000-a-night hotel remains a problem.

As a self-chosen representative of socialism, and an anointed planner of the world's economy, Strauss-Kahn has supposedly devoted his life to the good of the people. How, then, $3,000 a night? On what premise must the people of the world pay for that?

I’ll tell you. The premise is that Strauss-Kahn, a product of those inner-circle French schools whose graduates automatically get high government jobs, deserves his perquisites of office, because he is somehow qualified to plan the world's economy.

Is he?

No. And anyone who thinks that he himself is so qualified, and uses that idea to justify his perquisites of office, is likely to present a strange moral profile.

World economic planning is allegedly justified on humanitarian and charitable grounds. Planners, allegedly, exist to help people, especially the deserving poor. Planners are supposed to be performing an altruistic work, the modern form of a religious mission. Yet among these managers of the world economy there is a strange absence of people who live in modest circumstances, practice some kind of religious or ethical discipline, or have anything to do with normal human beings, except when the maid arrives a few minutes early in their $3,000-a-night hotel suite.

There are plenty of smart people in this world. Many modest people, skeptical of their own conclusions because they are actually in touch with their fellow citizens and knowledgeable about their lives, are also smart people. Strangely, many of these smart people are socialists, but their ambition is not to become world socialist leaders.

Why not? Because the idea that a small group of people is smart enough and knowledgeable enough to plan the financial lives — in fact, the lives — of six billion people is an idea that no one with any ethical understanding would apply to himself. An ordinary moralist would ask, "Who am I to do that? I don't know enough. I could never know enough."

Strauss-Kahn presents little evidence of any such moral or practical reflection. But what he did with his life was predictable, under the modern socialist system. A beneficiary of unmerited advancement, he did his best to "stabilize" the world's economy by using political means to get the productive countries to support the spendthrift countries. He himself, meanwhile, produced nothing.

I don't presume that an alcoholic is incapable of becoming a good author. Faulkner did. Hemingway did. And I don't presume that a "great seducer" is incapable of becoming a great thinker. Plenty of examples argue otherwise. But I do not presume that a drunk will be good at running an airline. I do not presume that a person who lacks discretion even about consensual affairs will have enough discretion to plan the future of six billion humble families.

To put this in another way: how did someone as stupid as Dominique Strauss-Kahn become one of the small group of people appointed to oversee the fiscal life of planet earth?

The answer is: the logical necessities of the socialist idea. If you want socialism, you are voting for fools like Dominique Strauss-Kahn. You may not know it, but you are; I'm sorry, but you can't have socialism on any other terms. The fact that Strauss-Kahn rose to the top is only a sign that the rest of the candidates were actually less competent than he.

To conclude: if you want someone running your life, and the life of the world, you can be assured that it will be someone like Dominique Strauss-Kahn — and if not him, then worse.


The Significance of Ron Paul


Rep. Ron Paul, Republican of Texas, is once again running for president.

No member of the House of Representatives has run for president and won since James A. Garfield in 1880 (and Garfield had been elected to the Senate just before his election as president). No one as old as Paul has been elected president. He would be 77 when he took the oath of office. Ronald Reagan was 69.

Most of all, no one as radical as Paul has been elected president during the modern era.

There are hopes that this time around, Paul will break through to mainstream America because his argument against foreign war, for a sound currency, and for large cuts in spending will catch fire. It will with some voters, but political ideas acceptable to the American public don’t change that fast.

I said this two weeks ago in a talk to my state’s conservative activists — an audience that included Paul supporters. I said I agreed with Paul on some important things, but that he could not win. One came up to me afterward and said, “You know, every time you say that, you hurt his movement. He got as far as he did last time because thousands of people thought he could win.”

And they were mistaken. But he changed some minds. He made arguments that nobody else would have made — and some of those arguments look better four years later.

In 2007, no Republican candidates were arguing against the wars in Iraq and Afghanistan except Paul. Now the Politico website reports a rise of war weariness and even “isolationism” among the Republicans in Congress. They are far from a majority, but they are a faction. And there is another libertarian candidate in the race, former governor Gary Johnson of New Mexico, who also calls for getting out of the foreign wars immediately.

Four years ago, no Republican candidates other than Paul were talking about protecting the value of the dollar. I still haven’t heard them doing it — but gold is above $1,500 an ounce, and the US dollar is below the Canadian and Australian dollars. The topic ripens.

Four years ago, there was no quasi-libertarian Tea Party movement, and Ron Paul’s quasi-libertarian son Rand Paul was not in the US Senate.

The ground has changed.

Still, it has not changed enough to elect Ron Paul as president. There is no point collecting dandelion seeds, such as the CNN/Opinion Research poll last week, which showed Paul running stronger against President Obama than any other Republican candidate. I have heard that poll cited several times, never with a mention that the split was Obama 52%, Paul 45%. Anyway, it was a poll taken 15 months before the election, which means it was a poll of a public not paying attention. Paul, in particular, had not been seriously attacked.

A few days later, he was. Conservative columnist Michael Gerson of the Washington Post ripped into him for his answer to a reporter’s question. The question was whether Paul favored the legalization of heroin.

There is a purpose in questions like that. It is to see whether the reporter can catch the candidate saying something crazy — not crazy, maybe, to a social scientist or a philosopher, but crazy to a political operative, or Joe Sixpack.

In his answer, Paul compared freedom to use drugs to freedom of religion. Here is how Gerson paraphrased it: “If you tolerate Zoroastrianism, you must be able to buy heroin at the quickie mart.” This, Gerson sneered, is the essence of libertarianism.

But Paul had said more than that. Wrote Gerson: “Paul concluded his answer by doing a jeering rendition of an addict’s voice: ‘Oh yeah, I need the government to take care of me. I don’t want to use heroin, so I need these laws.’ Paul is not content to condemn a portion of his fellow citizens to self-destruction; he must mock them in their decline.”

Gerson concluded that any candidate who supports “the legalization of heroin while mocking addicts” is marginal and unserious. His column was a way of looking at the Republican list and scratching out the name of Ron Paul.

Libertarians can rail against Gerson as biased, which of course he is. He is an opinion columnist. Bias is part of his job description. But if your candidate is taken seriously, which Paul was not in 2008, this is the kind of attention he is going to get — and here it is attention from a conservative. If Paul became the Republican frontrunner, the pundits of the Left would go after him with machetes and crowbars.

They haven’t, because they delight in schism on the Right. But if he becomes the frontrunner, they will. And Paul has said plenty of things they can use to make a bogeyman out of him. Legalize heroin. Imagine what they could do with that.

Here is the reality. Certain political stands are safe, others are daring, and some are taboo. The role of the radical candidate is to take the taboo stands, fight valiantly, lose, and change the political ground. It is a valuable role to play: it is changing the field so that other good candidates, later on, can win.

What other candidate? Maybe Rand Paul in 2016 or 2020. Maybe Gary Johnson. One can imagine a Mitch Daniels-Gary Johnson ticket in 2012, with Johnson running in the top position later. Once a libertarian faction has been established in the Republican Party and built into something substantial, room is made for other candidates, ones aiming more directly at winning, to have a go.

On the day that Paul announced, I had lunch with his 2008 campaign manager, Lew Moore. The timing was accidental; I had met Moore among the conservative activists two weeks before, and I hadn’t seen him in years. I asked him: when Paul ran in 2008, did the congressman seriously think he could win, or was it mostly to change the debate?

Without denying that Paul had had some chance of winning, Moore said the campaign was mostly about changing the debate. He said, “That is what his whole life has been about.”

And, at 75, Paul is not done. You have to admire the man. A lone congressman from Texas, never enjoying the support of his party’s establishment, has changed the political ground within the Republican Party.

And maybe he will change it some more.


Wind Power Wannabe


Two recent stories about wind power went unremarked in the mainstream media, presumably because the stories don’t fit the dominant Green narrative, aka the Green Dream.

The first is the report out of the UK that wind farms produce far less energy and cause far more problems with the grid than proponents have predicted or acknowledged.

The John Muir Trust — a "conservation charity," please note — commissioned an engineering study of wind power in the UK. The report is out, and it is revealing. While wind farms are pitched to investors — really, to lawmakers, since wind power exists only because of lavish government subsidies — as generating, on average, 30% of their maximum output over time, in reality they average only 25%. So wind power delivers about one-sixth less electricity than promised. This is a very significant shortfall. Worse, wind power runs below 20% of capacity most of the time, and at a risible 10% about a third of the time.
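
To spell out the arithmetic behind that "one-sixth" (a quick check using only the two figures quoted above, the 30% promised and the 25% delivered):

\[
\frac{30\% - 25\%}{30\%} = \frac{5}{30} = \frac{1}{6} \approx 17\%
\]

In other words, roughly one unit in six of the promised output never arrives.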

But there is a more severe problem. Because wind power is so erratic, it needs backup from fossil fuel power plants, and that backup has to be able to shut down quickly when the wind blows hard, or come online quickly when wind farms won’t deliver even their measly 25% power. So wind power farms must be tied very tightly to fossil fuel plants, or the grid will face a shortfall.

Even worse: the times (such as the middle of the night) when power demands on the grid are slight are often the periods when the wind blows hardest. At such times, owners of wind generators — who have to sell power whenever it shows up, even at a low price — push power onto the grid, thereby forcing other providers off.

This is because the grid is just a distribution network of power lines and transformers, with little capacity for storing power when it isn't being consumed. Yes, there is "pumped storage," which uses excess electricity to pump water uphill, then, during periods of high demand, lets it flow back down, turning turbines as it goes, thus generating power. But pumped storage is inefficient and limited. Currently the United States, the world leader in pumped storage, can store only about 2.5% of the average electric power sent across the grid at any given time.

A second damaging piece of news for wind power is the report that it may have lost its enchantment even for the Dutch.

Perhaps because of its historic use of windmills, the Netherlands has invested heavily in modern wind power. It is now third in the world in offshore wind power generation — of course heavily subsidized by the government. But the new center-right government has decided that continuing the massive subsidies, which include the transfer of 4.5 billion euros of Dutch tax money to a German engineering company to build and run new wind farms, is not, shall we say, defensible.

The new Dutch Prime Minister, Mark Rutte, may have come up with the perfect epitaph for wind power. He reputedly said, “Windmills turn on subsidies.” Soon fewer will be turning.


Rising Star


Liberty is always delighted to acclaim the artistic success of libertarians. Our delight is increased when they are Liberty’s own authors.

Today it is my pleasure to introduce a book by one of my own favorites, Garin Hovannisian. The book is Family of Shadows, it’s published by HarperCollins, and it’s causing a stir on several continents.

Family of Shadows is the story of Garin’s family — who were by no means vague or ghostly people. They were vivid presences, taking their part in some of the most interesting events of the 20th century, from the massacres in Armenia during the time of “the breaking of nations” to the destruction of the Soviet Union. The book is a story of survival, and of the individual freedom that makes survival worth the effort.

It’s also a story told with great style and insight. All historians deal with “shadows,” but a good historian makes them more substantial than the ostensibly real people who surround us daily. And a good historian, like a good novelist, makes us wiser as we read. While reading Family of Shadows, I kept thinking, “This is a very good novel.” But it’s not fiction, nor is it fictionalized. It’s an exhaustively researched history, free of the shallow assumptions, inane theorizing, and formulaic prose of normal historical writing.

Read it for yourself. You’ll find that you won’t be able to put it down. In the meantime, I thought you’d be interested in knowing more about the author. So I asked Alec Mouhibian, himself a writer for Liberty, to interview his friend Garin.

Here’s a look into the writer’s workshop.

 — Stephen Cox


AM: Stories ask to be told. But some stories prefer to be left alone. Why, and how, did this story call to you?

GH: It's strange; I can remember exactly when and where it happened. It was in the fall of 2007. I was all alone in a computer lab on the eighth floor of the Columbia University Graduate School of Journalism in New York. I was a student there, writing my master's thesis about a group of magicians who had been meeting in secret for generations. . . . I'm not sure that's important. But what happened to me that afternoon is, I think, what writers waste their lives waiting for, one of those cosmic events — when stars seem to align into a constellation. . . .

What I mean to say is that I discovered, suddenly and for the first time, that all the details and metaphors and meanings of my family history somehow belonged to a great narrative.

My great-grandfather Kaspar had survived the Armenian Genocide of 1915 and escaped to the vineyards of California's San Joaquin Valley. My grandfather Richard had left his father's farm to pioneer the field of Armenian Studies in the United States. My father Raffi had left his law firm, the American Dream itself, to repatriate to Soviet Armenia, where he went on to serve as the new republic's first foreign minister.

So I realized that the family story was about three men who left — individuals who cheated their destinies — but it was also about men who, unbeknownst to them, had been serving a pattern greater than themselves. A homeland lost, remembered, regained — it was a perfect circle!

Then I knew I would have to write Family of Shadows.

AM: You mentioned how this story hit you as you were wandering the world of magic. You're quite the magician yourself. What connection is there between your history with magic and the discovery that led to this book?

GH: My father gave me my first magic set when I was five, and I knew immediately that I would become a magician. I loved to make the impossible happen, to play at the border of reality and fantasy. Of course I also loved to watch the reactions of people — the astonishment spread upon the faces of strangers. Anyway, I long ago gave up the wand for the pen, but not, I think, the passions that run through both of them: mystery and vanity.

AM: Explain where you had to go to write this book, what you had to explore, and how the settings in this vast array get along with each other in the story and in your mind.

GH: I didn't know, when I decided to write the book, just how far I would have to travel. I couldn't imagine that I would have to spend countless hours at the National Archives in Washington or the Armenian academy called the Jemaran in Beirut or the National Library in Yerevan or the Tulare Historical Museum in the San Joaquin Valley of California. But I think it was that other kind of travel — not through space, but into lost time — that was the most exhilarating. I realized that if I were to tell my story straight, I would have to conduct some difficult interviews — to go deep into the minds and memories of my living characters, where so many details of my story had been trapped for decades.

AM: Your book is evenly divided among the histories of three men. Before we go further, explain your process for choosing what to include and what to leave out of their story, and the stories of the many characters surrounding them.

GH: The book, as I first wrote it, was about 450 pages long. The one you'll find in bookstores today is 300. You know very well that you were in part responsible for this. I remember the first time you read the manuscript. I was sitting across from you, minding my coffee, pretending not to notice your reactions. You were quiet, mostly, but every so often, you would emerge from silence to sing the blues, and I knew this wasn't a good sign.

AM: I never thought my rendition of "My Baby Ain't No Baby No More" could be so pregnant.

GH: Oh, it was — and actually it made me realize just how big my own book had become. The truth was that those 150 pages were important — they told so much history and gossip — but they weren't important for this book. So I began to cut. It was slow and deliberate and painful at first. But then you remember what happened to me? Suddenly, I was slashing away at my pages — reversing months of labor. I bet I lost a lot of good lines, too, but it was necessary and, ultimately, deeply liberating.

AM: Homeland. Patterns greater than self. These fall under the greater concept of "Armenia," toward which all the dream-roads in your book lead. Define Armenia — in your own terms — for those (including Armenians) who have no idea what it might mean.

GH: To begin with, Armenia is an actual land — stretching between the Black and Caspian seas — where the Armenian people have lived for thousands of years. We used to have our own empire, but for most of history we were content merely to survive the rise and fall of neighboring empires — the Roman, Persian, Byzantine, Arab. Armenia was where kings came to do battle. And so the blood, the ethos, the mythology of countless civilizations is in our soil.

Armenian history forever changed in 1915. Western Armenia was cleansed of all Armenians by the nationalist Young Turk regime of the Ottoman Empire; those who survived the genocide scattered to new diasporas across the world. Eastern Armenia, meanwhile, was absorbed into the Soviet Union as the smallest of the 15 socialist republics. That tiny sliver of land is the Armenia you'll find on modern maps.

But for much of the 20th century, Armenia existed mostly as a dream. My father and my grandfather before him spent their childhoods yearning for a "free, independent, united Armenia." Forgive me, I do have to be poetic, because the truth is that for us, the millions of Armenians living in exile and dispersion, Armenia had become something like a poem: a spiritual landscape blossoming with metaphor and mystery and apricot. It is there that Family of Shadows is set.

AM: Mmm, metaphor. That strangest and most bitter Armenian crop.

Let's talk about Liberty, and its own role in the soil. I've known you for years, but I never really wanted to know you until your byline appeared in this magazine at the tender age of 17. How does individual liberty figure in the Armenian-American dream? How does it contend with the shadows that haunt every corner of the real and imagined Armenia?

GH: You're testing me. "Let's talk about Liberty" — wasn't that the slogan of the Cato Institute conference we attended in San Diego ages ago? That's where we first met Stephen Cox — followed him into literature and then into Liberty. It was our breakthrough!

Now you know as well as I do that Family of Shadows isn't a libertarian manifesto. But it is, I've long secretly believed, a kind of allegory of individualism and rebellion. At its deepest level, it is the story of three men who were born into times and places where they did not belong, who defied the great forces of history, who defied destiny. My great-grandfather defied his destiny of death during the genocide of 1915. My grandfather rejected his destiny on his father's farm. My father abandoned his destiny in the American Dream.

Maybe that's not fair, though. For my father, I think, the American Dream was never about achieving and enjoying liberty for oneself, but about spreading liberty across countries and continents.

AM: Is there no tension between the spread of liberty and the participation in an ethnic-national heritage, which might be at odds with individualism? How can this be reconciled in Armenia?

GH: Governments don't have ethnicities. People do. So I confess not to feel the tension. I don't see why an individual, living in a free society, shouldn't feel free to seek his private solace or meaning or peace wherever he pleases — in philosophy, religion, even national heritage. You build yourself a free country, but then what? You still have private problems. You still have to deal with death and salvation. As the great poet sings, "you're still gonna have to serve somebody."

AM: Classic Milton. Always comes through. Now of course your book is a powerful human drama, and should therefore matter to anyone who ranks himself among the humans. But perhaps you can explain why Armenia should matter to America.

GH: After the genocide of 1915, an unprecedented human rights movement swept through the United States. American citizens collected more than a hundred million dollars to help the surviving refugees; kids who didn't finish their suppers were told to remember "the starving Armenians." The most important witnesses and chroniclers of the genocide had been American ambassadors and consuls, and now it was the president himself — Woodrow Wilson — who was proposing an American mandate to safeguard Armenia.

In those years, the American people invested their spirit in the Armenian struggle — and I think the mysterious logic of that investment has revealed itself slowly through time. It's been forgotten, but in February 1988, half a million Armenians gathered in Yerevan, the capital of Soviet Armenia, to launch the first successful mass movement against Communist rule. Independence followed in 1991. That's when my father, an American citizen, returned to Armenia. That's also about the time when a million Armenians left a newborn Armenia to seek more certain destinies in the United States.

The stories, the histories, the Armenian and the American Dreams, were in conversation long before I tried to capture that conversation in Family of Shadows.

AM: The Russians have a saying: “Every grandmother was once a girl.” Perhaps it can also be said that every answer was once a question. So...any questions before you go?

GH: You know, I have been wondering: who is John Galt?


I Can’t Get a Job—I’d Lose My Benefits!


What are we learning from the recent census?

A headline in this Thursday's Westchester County Journal News proclaims "Census: Density in Pockets." Well, duh. Just look around New York and you'll see high-rises and low-rises full of low-income families whose housing is subsidized under "Section 8" of the welfare code, which ties rents to a percentage of income. The more you earn, the more you pay. Conversely, if you don't earn anything, you'll pay almost nothing. I have a friend who pays $117 for a one-bedroom apartment on a tree-lined street in North Yonkers.
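
A hypothetical illustration of the incentive at work (the 30%-of-income rule below is my assumption for the sake of arithmetic, not a figure from the article; the actual Section 8 formula has its own adjustments):

\[
\text{rent} = 0.30 \times \text{monthly income}, \qquad 0.30 \times \$390 = \$117
\]

Under such a rule, every extra dollar of reported income costs the tenant 30 cents in higher rent, before counting any other benefits that phase out.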

She's one of the lucky ones. I have other friends who live in a building in South Yonkers where a different drug king controls each floor. (That's how lucrative the drug trade is in these areas. After all, it's money off the books. It won't affect the rent.) Buildings are subdivided and subdivided again to provide housing for the burgeoning population of welfare recipients in these dense "pockets."

Another friend of mine teaches junior high in the Bronx. Recently she gave her students a typical assignment: what do you want to be when you grow up? One bright young seventh-grader wrote glowingly about his desire to go to college and become a lawyer. "I'll carry a briefcase to work and wear a charcoal gray suit," he wrote. "I'll drive a BMW and I'll help people with their problems." My friend cheered his enthusiasm as she read his dream. Then she reached his final paragraph: "But if I make too much money, I'll lose my benefits," he concluded. "Maybe I shouldn't go to college after all."

What a chilling message these children are learning from their parents. I hear it too, all the time. "I can't get a job. I'll lose my Medicaid." "I can't get a job. My rent will go up." So parents teach their children how to use the system — how to get on the Section 8 rolls, how to get more food stamps, how to get more welfare. Often for a girl, that means having babies outside of marriage. Children learn how to find jobs that are off the books, income that can go unreported. Their parents don't have the courage to say, "Get out of here! Go to college and fly far away!"

This is a Reflection full of storytelling, so I'm going to tell you one more story. My friend Kelly was a single welfare mom rearing two children, with another one on the way. She was living in a tiny, grungy apartment on one of the worst streets in Yonkers. When the father of the new baby left instead of marrying her, she knew she had to change her life. So she reached out for a safety net different from Section 8, WIC (aid to Women, Infants, and Children), or Medicaid: she called her parents. Then she moved across the country to Sacramento, where her two older boys are now enrolled in better schools with better classmates. Her mother joyfully volunteered to take care of the baby while Kelly attended school herself. This month Kelly will graduate and become a dental hygienist. By the end of the summer she will be moving into her own apartment. I am so proud of her!

Government welfare always begins with good intentions. No one wants to see young mothers abandoned on the streets. No one wants to see children go hungry or uneducated. But these "pockets" of dense population are not what anyone intended. They are sad places, full of broken dreams and lost courage.

The War on Poverty was supposed to end this mess. It has only gotten worse, as any free marketeer could have predicted. Government needs to get out of the way and stop competing with free market housing, so that more people like Kelly can find the courage to leave the grungy pockets of Section 8 and move into wider, roomier pockets somewhere else — anywhere else! —  with better schools, better opportunities, and a better way of life.


Good as Gold


James Grant is the best-spoken and most accomplished hard-money man on Wall Street. Plenty of people can inveigh against the Fed; the publisher of Grant's Interest Rate Observer can put the case in the context of today's financial markets and also in the context of history.

Grant is also a historian. He wrote a biography of John Adams, another of financier Bernard Baruch, and several other histories, including my favorite among his works, The Trouble with Prosperity: The Loss of Fear, the Rise of Speculation and the Risk to American Savings (1996). His new book, "Mr. Speaker! Mr. Speaker!" should be of particular interest to readers of Liberty.

That’s not because of any deep interest I have in his nominal subject, Thomas B. Reed (1839–1902). Reed was a moderately interesting figure. He was fat, he ate fish balls for breakfast, and he spoke French. He was a congressman from Maine, first elected in the centennial year, 1876. He was a man of his time, a Republican stalwart who supported the tariff and gold and opposed foreign wars. Though he was for woman suffrage, he was no progressive, at least in the sense that today’s progressives are. He had no desire to teach other people how to live. “Pure in his personal conduct,” Grant writes, “he had no interest in instructing the impure.”

During the 1890s, Reed was speaker of the House. His claim to fame is how he changed the House rules to give himself, and the majority, much more power to get things done. Whether that was good or bad depends on your point of view.

Parts of the book are about Reed’s parliamentary maneuverings and also about his wit. These parts are well-written, but I was not much interested in them. The greater part of the book, however, is about the financial events and national political battles from 1870 or so to 1899, which I found very interesting.

Grant covers several such things, from the collapse of the Freedmen’s Bank to the stolen election of 1876. He has a particular interest in the currency. One story he tells is of Lincoln’s printing of greenbacks during the Civil War, and the struggle afterward over restoring the link to gold. In the ideology of the day, restoration was necessary. It was part of honorable dealing. But people dragged their feet about doing it, because a national debt had been piled up during the war, and what was the need to repay investors in gold if they had bought bonds with greenbacks? And besides, there was a boom on, and shrinking the money supply would spoil it.

The boom went bust in 1873. For the rest of the decade the country was arguing about the commitment to return to gold. The way in which this argument was conducted was much different from the way it would be today. Now gold is the money of the distant past. In the 1870s, Grant observes, “Gold was the money of the future”:

“A 21st century student of monetary affairs may stare in wonder at the nature of the monetary debate in mid 1870s America. A business depression was in progress. From the south, west and parts of the east arose a demand for a cheaper and more abundant currency. Yet such appeals were met with, and finally subdued by, a countervailing demand for a constant, objective and honest standard of value.”

“Resumption day” was Jan. 1, 1879. There followed “a mighty boom.” In the 1880s, prices fell 4.2% and wages rose.

This was the era of laissez-faire. “The government delivered the mails, maintained the federal courts, battled the Indians, examined the nationally chartered banks, fielded the army, floated the navy and coined the currency.” The big social program was Civil War pensions — for the Union side, not the Confederates. By the standards of the day, the Democrats were the small-government party and the Republicans were the big-government party, but policy differences were at the margin only.

The federal government was funded by the tariff, which the Republicans, and Reed, thought was good for American labor. The Democrats quoted the free-trade arguments of Frédéric Bastiat. “The Republicans,” Grant writes, “having no ready answer for Bastiat’s arguments, were reduced to pointing out that he was, indeed, French.”

The ideological atmosphere started changing in the 1890s. The Panic of 1893 brought on a severe depression, and another political battle over the currency. This time the supporters of gold battled it out with the supporters of silver, with bimetallists arguing for both at once.

Grant provides the best explanation of the gold-versus-silver battle I have read, especially the futility of having two metallic standards at the same time. Here he is not so proud of Thomas Reed, who was by then speaker: “One senses, reading him, that he was not quite sure what the gold standard was all about.” Reed’s position “lacked clarity, or, one might even say, courage. [President Grover] Cleveland had one unshakable conviction, which was gold. Reed had many convictions, only one of which — no free silver — was strictly nonnegotiable.”

Reed’s final battle was over war with Spain. He was against it. The new president, William McKinley, seemed to be against it, but lacked the courage really to oppose it. Ex-President Cleveland was against it, as was William Jennings Bryan, the presidential candidate whom McKinley had beaten in 1896. But the Congress was for it, and even with his self-enhanced power as speaker, Reed couldn’t block the war resolution of 1898. It passed the House 310 to 6.

A year later, Reed resigned. He wrote to a friend, “You have little idea what a swarming back to savagery has taken place in this land of liberty.” Publicly he said nothing. Three years later he was dead.

And so Grant ends his book. It is a fine book. I recommend it. Don’t be put off by any lack of interest you may have in Thomas B. Reed. I wasn’t much interested in him, either. You don’t have to be.

Editor's Note: Review of "'Mr. Speaker! Mr. Speaker!' The Life and Times of Thomas B. Reed, the Man Who Broke the Filibuster," by James Grant. Simon & Schuster, 2011, 448 pages.


Creation vs. Diversion


When an investment opportunity seems too good to be true, prudence recommends asking: "Will I be sharing in wealth that I help to create, or will I be taking some wealth from its real producers?"

A chain letter is an example of just diverting wealth. So is a Ponzi scheme; so is the familiar Nigerian email scam, which gains superficial plausibility from a whiff of dishonesty or illegality.

Similarly, someone engaged in a possibly though not obviously productive business — perhaps some complicated Wall Street transaction — may well ask about creation or diversion of wealth. If not conscience, then concerns about legality and the dependability of success prompt the question.

Someone offering a suspiciously attractive opportunity might dodge the question about wealth creation by saying that the success of his method, though honorable — a method of choosing stocks and timing transactions, perhaps — depends on secrecy. Even so, an answer in general terms is in order; and the vaguer the answer, the more skepticism prudence recommends.

Beyond physical objects, wealth of course includes intangibles such as education, repair and other services, transportation, entertainment, travel and tourism, and comforts of various kinds. Similarly, the production of wealth includes not just growing or physically shaping things but also specialization, the division of labor, and exchange. (Each party to a voluntary and informed transaction gains what he considers more wealth than what he gives.)

Financial operations come into the story. Arbitrage of the most obvious kind moves resources or products from where they are less scarce and valuable to where they are scarcer and more valuable; by ironing out price discrepancies, it makes the price system more accurate in equilibrating supply and demand. A well-functioning market generates knowledge and uses it productively.

Speculation — most obviously, but not only, in standard commodities — is a kind of arbitrage in time, moving things from a time when they are less valuable to a time when they are more valuable. If something, say wheat, is expected to become scarcer in the future, a speculatively bid-up price tends to hasten economies in consuming the thing and possibly hasten its production, thereby lessening its future scarcity. The speculators themselves bear the costs of being wrong. Speculation contributes to the depth and resiliency of futures and forward markets, where businesses can hedge against the risk of adverse price changes (and where, as a result, opposite risks can more or less neutralize one another). Insurance is an obvious example of hedging against risk.

Saving and investment make possible the increased productivity of well-chosen roundabout, time-consuming, capital-using kinds and methods of production. (Here “capital” means resources freed by savers from serving current consumption so they can serve longer-term-oriented production instead.) The stock markets and related activities help allocate capital to places where it promises to be most productive. Financial intermediation tailors the terms on which capital is available to the varied wants and needs of savers, borrowers, and companies issuing stock; securitization of loans is just one example.

Like capital, risk-bearing is an essential element in production, especially innovative production. Various kinds of stocks and bonds and derivatives, including even the notorious credit-default swaps, help place risk with the persons and institutions most willing and able to bear it. Specialization in risk-bearing enhances confident and productive business planning.

The great variety of financial operations and innovations allows scope, notorious scope, for ignorance, error, and downright fraud. (Some participants, Enron's energy traders reputedly among them, may see their business as a struggle in which the other side is supposed to lose.) More broadly, wealth comes in so many forms, and cooperation in producing it occurs in so many ways, even very indirect ways, that an answer to the question with which I started can seldom be precise.

Nevertheless, insisting on whether wealth is being created or merely diverted can provide healthy discipline.


Appleby's Revolution


One thing that has always struck me about the Communist Manifesto is that Karl Marx and Friedrich Engels held capitalism in awe. “The bourgeoisie, during its rule of scarce one hundred years, has created more massive and more colossal productive forces than have all preceding generations together,” they wrote in 1848. They listed the wondrous accomplishments one by one, from steam navigation to canal building, and “whole populations conjured out of the ground.” Of course, their praise preceded a call for workers to overthrow the great capitalist conjuror.

Joyce Appleby’s book, The Relentless Revolution: A History of Capitalism, conveys the same sense of wonder. She praises capitalism, saying that its “distinctive characteristic . . . has been its amazing wealth-generating capacities. The power of that wealth transformed traditional societies and continues to enable human societies to do remarkable things."

Appleby certainly doesn’t recommend overthrowing the system. But her appreciation of capitalism diminishes as the book goes on, just as Marx’s and Engels’ did in the manifesto. At the start, The Relentless Revolution is like an exhilarating train ride, full of insights and a historian’s gold mine of information, but it loses steam and slows to a crawl once the Rubicon of the Industrial Revolution has been crossed. At the end, Appleby is praising the US government for trying to rein in the capitalist beast.

Appleby, who taught history at UCLA, describes herself as a "left-leaning liberal with strong . . . libertarian strains." Given today's academic history departments and her own attitudes, she deserves admiration for looking at capitalism as objectively as she can. "In this book," she writes, "I would like to shake free of the presentation of the history of capitalism as a morality play, peopled with those wearing either white or black hats." For half the book, she achieves that goal.

The Relentless Revolution fits into the recent parade of big books trying to explain the rise of the industrial West — books such as Jared Diamond’s Guns, Germs, and Steel, Kenneth Pomeranz’s The Great Divergence, and Ian Morris’ Why the West Rules — For Now. These and other authors are trying to answer a couple of giant questions: why did the West (not China, India, or another part of the world) make the transition from subsistence living to a world of constantly increasing productivity? And how exactly did it happen? As I shall explain, Appleby offers two major responses that I found valuable.

The pivotal moment in world history is normally called the Industrial Revolution (although perhaps that’s a misnomer, given its slow gestation). Many factors contributed to this revolution, and part of the “game” motivating the big books is to offer new factors. It’s generally agreed that the process flowered first in Great Britain, but the reasons may stretch back to such things as the European practice of primogeniture, the separation of powers between the Catholic Church and the state, and the system of rights spawned by feudalism under changing population pressures.

Appleby focuses on 17th- and 18th-century England (why she gives short shrift to Scotland is a puzzle). She points to two reasons why the Industrial Revolution occurred there: England had higher wages than the Continent, and also cheap sources of coal. Higher wages provided the incentive to increase productivity, and coal provided the means.

The idea that England’s wages were higher is actually new to me and undoubtedly has a complex history of its own; Appleby merely says that England’s population growth had leveled off in the 17th century. As for coal accessibility, she agrees with Pomeranz, who says that China fell behind Europe partly because its coal deposits were harder to reach.

Whatever the precursors, actual inventions — especially of the steam engine — required talent and knowledge, and herein lies the first of Appleby’s distinctive insights. In a way that I haven’t seen before, Appleby integrates two related forces: the British tendency toward scientific theorizing (or “natural philosophy”), and the British tendency to tinker. “Technology met science and formed a permanent union. At the level of biography, Galileo met Bacon,” she writes.

She elaborates: “Because of the open character of English public life, knowledge moved from the esoteric investigations of natural philosophers to a broader community of the scientifically curious. The fascination with air pressure, vacuums, and pumps became part of a broadly shared scientific culture that reached out to craftsmen and manufacturers in addition to those of leisure who cultivated knowledge.”

The Royal Society illustrates this integration. Created in 1662, it was both practical and scientific. Its members studied that New World import, the potato, but the society also “brought together in the same room the people who were most engaged in physical, mechanical, and mathematical problems.”

Appleby’s second valuable contribution is to take a cold-eyed look at the role of New World slavery in creating the markets that nurtured the Industrial Revolution. Indeed, like Pomeranz, she sees slavery as a major factor behind the engine of progress.

In The Great Divergence, Pomeranz says that the vast expansion of agricultural cultivation in the New World enabled Europe to avoid the “land constraint” that kept the Chinese from developing an industrial economy. And slave labor played an enormous role in the cultivation of sugar in the Caribbean and cotton in the United States. Thus, Pomeranz says, Europe overcame the limitations of land by colonizing the New World, which made cultivation possible on a scale unattainable in tiny Holland or England.

Appleby’s take is a little different. She emphasizes more the three-way trade that we learned about in middle school (slaves from Africa were taken to the New World, where sugar was purchased and taken to Great Britain, where finished clothing and other products were bought, to be sold in Africa). Pomeranz and Appleby are not arguing that slavery was profitable, or that it was “necessary,” but, rather, that it was a significant element in the system of trade that led to the Industrial Revolution. Enthusiasts for capitalism such as myself tend to focus on the “trade”; critics of capitalism — along with Appleby — stress the “slave” part of the trade.

Furthermore, Appleby considers American plantations and British inventions as two sides of the same coin. “These two phenomena — American slave-worked plantations and mechanical wizardry for pumping water, smelting metals, and powering textile factories — may seem unconnected. Certainly we have been loath to link slavery to the contributions of a free enterprise system, but they [the phenomena] must be recognized as twin responses to the capitalist genie that had escaped the lamp of tradition during the seventeenth century.”

Whether she is right about this, I don’t know, but she brings to public attention the periodic debate by economic historians over the role of slavery, a debate that Pomeranz reawakened with The Great Divergence.

Unfortunately, after these interesting observations, The Relentless Revolution begins to wind down. As the book moves on, Appleby tends to equate anything that takes industrial might — such as the imperialist armies that took possession of Africa in the late 19th century — with capitalism. “Commercial avarice, heightened by the rivalries within Europe, had changed the world,” she writes in explaining the European adventures in Africa. I dispute that. While Cecil Rhodes and Henry Morton Stanley may have been private “investors,” for the most part Africa was overcome by government power, not by individuals or corporations.


Even greater blindness sets in as Appleby’s 494-page volume moves into the recent past and she becomes influenced by modern prejudices. In Appleby’s view, the financial crash in 2008 was caused by financiers; the only contribution by public officials was to “dismantle the regulatory system that had monitored financial firms.” No word about the role of the Federal Reserve, the Community Reinvestment Act, or Fannie Mae and Freddie Mac.

Indeed, capitalism becomes the same as exploitation, in spite of her more balanced initial definition of capitalism as “a cultural system rooted in economic practices that rotate around the imperative of private investors to turn a profit.”

In her last chapter, Appleby writes, “The prospect of getting rich unleashed a rapacity rarely seen before in human society.” This extravagant statement does not jibe with archaeological (not to mention historical) evidence of the human past. In Why the West Rules — For Now, Morris reports on an archaeological excavation at Anyang village in China. In 1300 BC, this home of the Shang kings was the site of massive deaths. Morris estimates that, in a period of about 150 years, the kings may have killed 250,000 people — an average of 4 or 5 a day, but probably concentrated in large, horrific rituals — “great orgies of hacking, screaming, and dying.” This was the world before capitalism.

To be fair to Appleby, she is saying that the productivity unleashed by capitalism extended the breadth of human rapacity, and to some degree the example of slavery supports that notion. Eighteenth-century British gentlemen could think high-minded thoughts while enjoying their tea with sugar grown by often brutally treated slave labor — which they may have invested in. “This is the ugly face of capitalism,” Appleby writes, “made uglier by the facile justifications that Europeans offered for using men until they literally dropped dead.”

True. At the same time, it was the British who abolished the slave trade, at high cost in patrolling the seas. Before that, Africans were enslaved only because other Africans captured and sold them to the Europeans. (Malaria and other diseases kept Europeans out of the interior of Africa until late into the 19th century.) There is a lot of blame to go around.

So this book is a mixed bag. Even though Appleby increasingly departs from objectivity as she proceeds into modern times, I respect her project and appreciate her insights. I hope that her “left-leaning” historian colleagues read her book and expand their understanding of the past — just as I have.


Editor's Note: Review of "The Relentless Revolution: A History of Capitalism," by Joyce Appleby. Norton, 2010, 495 pages.


Dr. Jekyll and President Hide


In one of the scenes in Citizen Kane, the protagonist's former friend Jed Leland describes the character of the flamboyant politician and tycoon. "He had a generous mind," Leland says. "But he never gave himself away. He never gave anything away. He just . . . left you a tip."

He might have been describing President Obama.

Like Kane, Obama is a colossal self-advertiser. He first made his reputation, indeed, by writing a book of quasi-autobiography. Like Kane, he can hardly get through a sentence without using the word "I." He constantly refers to government entities as "my secretary of state," "my secretary of the treasury," "my department of defense," and so on. Yet when it comes to revealing himself . . . no. He'd rather be tortured than give up any pieces of the sacred substance, or anything even associated with it.

One assumes that Obama bogarted all specifics about his supposedly close and inspiring relationship with Reverend Wright because Wright had become a political embarrassment. And one assumes that Obama wants to keep his college records secret because he wasn't a very good student. These are assumptions, however, because Obama keeps his stuff to himself even when it would do him good to give it away.

The classic example of this compulsion is his logically pointless war against the people who wanted to see his birth certificate. He conceded the struggle only when he started to fear that it was costing him support for reelection, thus torturing him beyond the limits of even his endurance. For years he had made a public fool of himself by not releasing an innocuous scrap of paper.

Why, after that performance, I expected him to surrender the Osama death photos, I don't know. Maybe I thought he had reformed, and some nice, generous, "transparent" Dr. Jekyll had replaced the clutching, anal, emotionally threatened President Hide. But whatever I thought, I was wrong. The preposterous decision not to release the pictures, ostensibly to chasten radical Islamicists with the evidence of our moral superiority, will merely convince the world that Barry Obama, like Charlie Kane, has more than a small screw loose.

But what about the "tip" — "he just left you a tip"? In Citizen Kane, the protagonist paid other people for "services rendered." He demanded their love, but "he had no love to give." So he offered them money or power or other crass "tips." And that, in his way, is what Obama does. Of all the politicians I can think of, he is the greediest for love but the least interested in other people. His speech is without stories or anecdotes. He seldom alludes to any actual historical event, anything that people actually did in the past. He appears to retain no vivid memories of the people in his own past, or any real interest in the people he meets today. He speaks always as if he were reminding his audience of things they should already have been taught, never as if he wanted to learn from their responses what they themselves would like to know. In lieu of real human concern, he professes a vast interest in abstractions — progress, equality, fairness, proving to our enemies that we are better than they are in some vague, general way.

These are not the kind of tips you can take home and spend. The real stuff — he keeps that to himself. You're not getting any of that.



Enforcers of Health


Libertarians have uneasy thoughts about the enforcers of public health. In my state, the public health authorities banned tobacco billboards — an act that some thought a violation of the First Amendment. This year, they pushed in the legislature to ban the sale of flavored cigars and pipe tobacco. In both cases, they have said that their aim is to make tobacco less attractive to children.

A century ago the fight was much more immediately vital. Then people of libertarian mind were fighting with the public health enforcers over the control of smallpox.

That disease was eradicated in the 20th century by campaigns of vaccination, many of them not entirely voluntary. In the United States, a turning point came in the epidemic of 1899 to 1902, when outbreaks of the disease prompted a political battle over compulsory vaccination. The story is told in legal writer Michael Willrich’s new book, Pox.

The push for compulsory vaccination grew out of the facts of the disease itself. It was a killer. Of people infected by the main strain of the disease, between 20 and 30% died. Smallpox was highly contagious, so that an infected person was a threat to everyone around him who was unvaccinated. First symptoms appeared more than a week after exposure, so fighting the epidemic by treating sick people was a strategy of being perpetually behind. The disease could, however, be stamped out by vaccinating the healthy.

For the progressives, the model was the Kaiser’s Germany. As Willrich says, “German law required that every child be vaccinated in the first year of life, again during school, and yet again (for the men) upon entering military service.” Germany had “the world’s most vaccinated population and the one most free from smallpox.”

In the constitutionalist America of the 1890s, vaccination was a state, local, and individual responsibility. Americans, writes Willrich, were “the least vaccinated” people of any leading country. The federal government had a bureau, the Marine-Hospital Service, to tend to illnesses of sailors; it had been started in the 1790s under President John Adams. The Service had experts in smallpox control, but they were advisory only and expected local authorities to pay for local work.


In a smallpox outbreak, the public health enforcers would set up a “pesthouse,” usually at the edge of town. They would scour the community for the sick, paying particular attention to the shacks and tenements of blacks, immigrants, and the rest of the lower classes, where the disease tended to appear first. People in these groups were subjected, Willrich says, to “a level of intrusion and coercion that American governments did not dare ask of their better-off citizens.”

Public-health doctors, accompanied by police, would surround tenement blocks and search people’s rooms. The sick would be taken to the pesthouse under force of law, and everyone else in the building would be vaccinated.

Often a community would have a vaccination ordinance that on its face was not 100% compulsory. It would declare that no child could be admitted to public school without a vaccination mark. For adults, the choice might be vaccination or jail — and all prisoners were, of course, vaccinated.

The procedure involved jabbing the patient with a needle or an ivory point and inserting matter from an infected animal under the skin. Typically it was done on the upper arm; often that arm was sore for a week or two, so that people in manual trades couldn’t work. Sometimes, vaccination made people so sick it killed them — though not nearly as often as the disease did.

Such was the background for the outbreaks of 1899–1902. At that time, two new factors emerged. First, America had just had a war with Spain, and had conquered the Philippines, Cuba, and Puerto Rico. All these places were rife with disease — and their status as conquered possessions, Willrich writes, provided “unparalleled opportunities for the exercise of American health authority.”

There the authorities had no worries about people’s rights: “The army’s sanitary campaigns far exceeded the normal bounds of the police power, which by a long American constitutional tradition had always been assumed to originate in sovereign communities of free people.” And coercive methods worked. They worked dramatically.

The second new thing in 1899 was that most of the disease spreading in the United States was a less-virulent form that killed only 0.5 to 2% of those infected. That meant that many Americans denied the disease was smallpox, and didn’t cooperate. The public health people recognized the disease for what it was, and they went after it with the usual disregard of personal rights — maybe even more disregard, because of “the efficiency of compulsion” overseas. And they stirred up, Willrich writes, “one of the most important civil liberties struggles of the 20th century.”

Their fight was with the antivaccinationists, whom Willrich calls “personal liberty fundamentalists” who “challenged the expansion of the American state at the very point at which state power penetrated the skin.”

He continues:

“Many antivaccinationists had close intellectual and personal ties to a largely forgotten American tradition and subculture of libertarian radicalism. . . . The same men and women who joined antivaccination leagues tended to throw themselves into other maligned causes of their era, including anti-imperialism, women’s rights, antivivisection, vegetarianism, Henry George’s single tax, the fight against government censorship of ‘obscene’ materials and opposition to state eugenics.”

Often, antivaccinationists denied that vaccination worked. Many were followers of chiropractic or other non-standard medicine; this was also the time of “a battle over state medical licensing and the increasing dominance of ‘regular,’ allopathic medicine.” Of course, the antivaccinationists were wrong about vaccination not working, but they were not wrong when they said it was dangerous. In 1902, the city of Camden, NJ, required all children in the public schools to be vaccinated. Suddenly children started coming down with lockjaw. They were a small fraction of those who had been vaccinated, but all of them fell ill about 21 days after vaccination. Several died, and the press made a scandal of it.


Under the common law of the day, the vaccine maker — the H.K. Mulford Co. — was not liable to the parents, because it had no contract with them. Nor was the government liable, though the government had required the treatment. Thus, writes Willrich, “The arm of the state was protected; the arm of the citizen was not.”

The medical and public health establishment initially denied that the lockjaw had been caused by the vaccines. This was admitted only after a consultant to a rival vaccine maker, Parke-Davis, made an epidemiological case for it. Congress responded by quickly passing the Biologics Control Act of 1902, which ordered the inspection and licensing of vaccine producers, and required them to put an expiration date on their products. It was one of the first steps in the creation of the regulatory state.

The act calmed the public and drove some smaller companies out of business. The first two licensees were Parke-Davis (later absorbed by Pfizer) and Mulford (absorbed by Merck). And the purity of vaccines, which had been improving already, improved dramatically in the following decade.

The antivaccinationists turned to the courts in a fight about constitutional principles. The argument on the state’s side was that it had a duty to protect citizens from deadly invasion, which might be launched by either an alien army or an alien army of microbes. The argument on the other side was the individual’s right to control what pathogens were poked into his body, and, as the antivaccinationists’ lawsuit contended, his right to “take his chance” by going bare in a time of contagion.

They brought their case to the Supreme Judicial Court of Massachusetts, and lost. Citing the power of government to quarantine the sick and conscript soldiers in war, the court said, “The rights of individuals must yield, if necessary, when the welfare of the whole community is at stake.” Then they appealed to the US Supreme Court. In the same year in which the libertarian side won a landmark economic-liberty ruling in Lochner v. New York (1905), it lost, 7–2, in the vaccination case, Jacobson v. Massachusetts. Writing for the court, Justice John Harlan compared the case to defense against foreign invasion, and wrote, “There are manifold restraints to which every person is necessarily subject for the common good.”

It is an unlibertarian answer. The author of Pox thinks it was the right answer — and so do I, in this case. Smallpox was indeed a kind of invasion. In the 20th century it killed 300 million of Earth's people, and disfigured millions more. And though public health people can make similar arguments, and they do, about tobacco as a mass killer, I do not support their efforts to stamp out its use by ever more state intrusion. The difference is that with smallpox the individual’s refusal to be vaccinated put others in imminent danger of death. Collective action — compelled collective action — was the only way to defeat the invader. None of this is the case with tobacco.

Naturally, the progressives wanted to apply their wonderfully practical idea to all sorts of things. Referring to epidemics and the germ theory of disease, Willrich writes, “Progressives took up the germ theory as a powerful political metaphor. From the cities to the statehouses to Washington, the reformers decried prostitution, sweatshops, and poverty as ‘social ills.’ A stronger state, they said, held the ‘cure.’ ”

The antivaccinationists argued that compulsory vaccination would lead to other bad things. They made some fanciful statements; one wrote, “Why not chase people and circumcise them? Why not catch the people and give them a compulsory bath?” But they were right. Four years later, in 1906, Indiana enacted America’s first compulsory sterilization law. This was done in pursuit of another scientific, society-improving cause that excited progressives: eugenics. In 1927, in Buck v. Bell, the Supreme Court approved Virginia’s law of compulsory sterilization of the “feeble-minded.” That was the case in which Justice Oliver Wendell Holmes — supposedly the great champion of individual rights — pontificated: “The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes. Jacobson v. Massachusetts, 197 U.S. 11. Three generations of imbeciles are enough.”

Jacobson has been cited in many Supreme Court decisions, usually to approve state power. And the applications have not been confined to medicine. In 2004, Justice Clarence Thomas cited Jacobson in his dissent in Hamdi v. Rumsfeld, which was about a US citizen captured in Afghanistan. Thomas, often regarded as the most libertarian of the current justices, was arguing that Yaser Esam Hamdi had no right of habeas corpus and could be held on US soil indefinitely without charge.


The smallpox epidemic of 1899–1902 was a pivotal event institutionally as well as legally. The Biologics Act created more work for the Marine-Hospital Service, which a few years later became the US Public Health Service. The service’s Hygienic Laboratory, which was empowered to check the purity of commercial vaccines, later grew into the National Institutes of Health.

From the story told in Pox, you can construct rival political narratives. The progressives’ “we’re all in it together” formula about the need for science, federal authority, officials, regulation, and compulsion is there. And in the case of smallpox, progressivism put its strongest foot forward: it was warding off imminent mass death. Smallpox is one of the least promising subjects for the libertarian formula of individual rights and personal responsibility. Yet here you can also find the bad things that libertarian theory predicts: state power used arbitrarily and unevenly, collateral deaths, official denial of obvious truths, a precedent for worse things later on, and even the favoring of large companies over small ones.

I don’t like the progressive formula. But in the case of smallpox it fits, and I accept it, looking immediately for the limits on it.

Fortunately for liberty as well as health, few other diseases are as contagious and deadly as smallpox. Even Ebola and SARS — two contagious and deadly diseases of recent decades — were mostly fought by quickly isolating the sick. State power was necessary — some people were forcibly quarantined — but there were no mass vaccinations of the healthy.

Still, vaccination surfaces as an issue from time to time. It is, on its face, a violation of the rights of the person. So is quarantine. And yet it is prudent to allow both of them in some situations.

That is an admission that the libertarian formula doesn’t work all the time. Liberty is for rational adults in a minimally normal world. That is a limitation, but not such a big one. Surely it is better to design a society for people who can think, make decisions, and take responsibility, while in a few cases having to break those rules, than to live in a world designed for people defined as children of the state.

Editor's Note: Review of "Pox: An American History," by Michael Willrich. Penguin, 2011, 400 pages.


The End Is Nigh


In an article in the December 2010 issue of Liberty, I alerted readers to the fact that a leading network of Christian radio stations was predicting that the end of the world was absolutely, positively going to happen in 2011. According to Family Radio, which broadcasts in many countries, and which probably has a station near you, Judgment Day will begin on May 21 with the Rapture of the true believers and will conclude on October 21 with the total destruction of the physical universe. In the process, almost all the inhabitants of the earth will perish.

This is the message of Family Radio’s “Bible teacher,” a retired businessman named Harold Camping. His interpretations of Scripture are explained — as well as I, or probably anyone else, could explain them — in the article just mentioned, “An Experiment in Apocalypse.” You can download it here. For a less critical perspective, see Family Radio’s own website, which offers a list of stations where you can hear the apocalyptic message for yourself.

For students of human nature, especially American human nature, this particular religious prediction is a matter of great importance, interest, and (let’s face it) fun. It is, for them, what a transit of Venus is to astronomers, what the sighting of an ivory-billed woodpecker is to ornithologists, what an eruption on the scale of Mount St. Helens is to volcanologists. It’s the kind of thing that happens much less than once in a generation.

Of course, experts, religious and secular, make predictions all the time, and other people believe them. Generals predict that if they are given appropriate resources, they will be able to accomplish their mission. Scientists predict that if their policy advice goes unheeded, the environment will be subject to further degradation. Politicians predict that if you vote for them, they will initiate a new age of prosperity for the American people, and if you don’t, you will be visited by spiraling unemployment and a continuing decline in the American standard of living. Preachers say they are confident that the signs of the times point to an early return of our Lord Jesus Christ. Economists assure us that if trends continue, we can expect to see whatever they themselves have been trained to expect.


All these modes of prophecy are potent. They have effects. They get people’s attention. They lead some people to do things that otherwise they would not do — vote, go to war, buy a house, pledge more money to the church. But they are all escapable and forgettable. They never include a date on which something definite will happen. What you see, when you look at the words and figure out what they really mean, is just a suggestion that something that is capable of almost any definition (“prosperity,” “job creation,” “depression,” “desertification,” “a world in which our children will have more (or less) opportunity than we do,” “the fulfillment of God’s plan”) will manifest itself at some time that is really not a time: “very soon,” “in our generation,” “by the end of the century,” “earlier than predicted,” “much earlier than anyone would have thought,” “with a speed that is startling even the experts,” “at the end of the day,” “eventually.”

Of course, the less definite a prediction is, the less meaning it has; but the more definite it is, the less likely it is to come true. Real economists and real theologians can tell you why. A real economist will show you that human events result from individual human choices, their motives unresolvable into quantifiable data, their possible sequences multiplying from the original motives in fantastic variety. Real theologians will tell you, in the words of the old hymn, that the Deity is not a writer of op-ed forecasts: “God moves in a mysterious way, his wonders to perform; / He plants his footsteps in the sea, and rides upon the storm.”

Nevertheless, it is impossible that someone’s announcement that “California is more vulnerable than ever to a catastrophic earthquake,” or that “this administration will meet the problem of the deficit, and solve it” could ever be completely disconfirmed. If the big one doesn’t destroy San Francisco during your lifetime, as you thought had been predicted, don’t use your dying breath to complain. You’ll just be told that California is even more vulnerable “today” than it was “before,” because more years have passed since the last prediction. If the politician you helped elect disappoints you by not having “solved the problem,” whatever the problem is, you’ll be told that “our plan for the new America ensures that appropriate solutions will be implemented, as soon as Congress enacts them into law.”

How can you disconfirm the ineffably smarmy language of the exhibits in the California Academy of Sciences? Once an intellectually respectable museum, it now adorns its walls with oracles like this: “If we don’t change our actions, we could condemn half of all species of life on earth to extinction in a hundred years. That adds up to almost a million types of plants and animals that could disappear.” Should you decide to argue, you can’t say much more than, “If you don’t stop purveying nonsense like that, your museum has seen the last of me, and my $29.95 admission fees, too.”

Thirty years ago there was a considerable emphasis among mainstream evangelical Christians on the prospect of Christ’s imminent return. There was a popular religious song, urging people to keep their “eyes upon the eastern skies.” Less mainstream religionists said that all indications point to the probability that the present system of things will end in 1975. Meanwhile, a very popular book, Famine 1975! America’s Decision: Who Will Survive? (1967), predicted that the world would soon run out of food; and scientists worried the world with predictions that “global cooling” would soon be upon us.


Among these predictions were a few that, wonder of wonders, actually could be disconfirmed, and were. Though no one said that the Egyptians (often thought to be especially “vulnerable”) would begin starving to death precisely on May 21, 1975, some people came close enough to saying that; and events showed they were wrong. Yet there are escape routes available for all predictors, even those proven to be wrong. Two escape routes, really: memory and interest.

Jesus knew about this. He told his followers that no man knows the day or the hour of his Return, but that people would always be running around predicting it (Mark 13:32, Luke 21:8–9). The failed predictions, which seemed so snazzy before they failed, wouldn’t really be perceived as failures, because the failures wouldn’t be remembered, or considered interesting enough to be remembered. Watch out for predictions, he said.

There are two things going on here. One is that people die, and their enthusiasms die with them — often to be revived by the next generation, before being forgotten again. Quack medical treatments, as someone has pointed out, have a generational life, and so do quack economic and religious treatments. The Bates Eye Method, a way of using exercise to improve your eyesight, doesn’t work, and when people find that it doesn’t, they abandon it. They eventually die, and another group of people “discovers” the great idea, wants to believe it, and makes a big deal out of it, temporarily. The phony (and I mean “phony” not in the sense of “mistaken” but in the sense of “created to make money and impress people”) religious ideas of the I Am Movement have had a similar life cycle among New Age types. And when it comes to economics, where would we be without such recurrent notions as the idea that unions are responsible for “the prosperity of the middle class,” the idea that the minimum wage lifts people out of poverty, and the idea that general wellbeing can be created by forcing the price of commodities up by means of tariff regulations?

The gullible people who endorse such ideas often die with their convictions intact, although they may not succeed in passing them along to others, at least right away. In April we witnessed the death of the oldest man in the world, a gentleman named Walter Breuning. Before his death at the age of 114, Mr. Breuning gave the nation the benefit of his more than a century of experience and learning — his belief that America’s greatest achievement was . . . Social Security! Yes, if you retire at 66, as Mr. Breuning did, and collect benefits for the next 48 years, I suppose you might say that. But it’s an idea that’s likely to be ignored by people who are 30 years old and actually understand what Social Security is.

And that’s the additional factor: lack of interest. Failed ideas, and failed predictions, aren’t always forgotten — many of them have a second, third, or fourth advent. But they may be ignored. They may not be interesting, even as objects of ridicule. I suspect that most young people would say that Social Security is “good,” but it’s not as important to them as it was to Mr. Breuning. The same can be said of mainstream Christians, who agree that Christ will return, but pay little or no attention to any current predictions.

Right or wrong, as soon as an idea reveals even a vulnerability to disconfirmation, it often starts to dwindle in the public’s mind. Global cooling is a perfect example. Once, cooling was fairly big in the nation’s consciousness; then it didn’t seem to be happening, right now anyway; then it began to seem unimportant; then it disappeared, without anyone really noticing its absence.

This is what tends to happen with political and economic predictions. The smart people, and the political fanatics (such as me), go back to what Roosevelt said or Kennedy said or Obama said, and notice how wildly their promises and predictions varied from the accomplished facts; but the people in the street go their way, unshocked and unaffected. They may not have expected specific accuracy from their leaders’ forecasts, but if they did, they forgot about it. Initially, they were foolish enough to be inspired, or frightened, but they were canny enough to realize that other forecasts — equally inspiring, frightening, and vulnerable to failure — would succeed the present ones.


It’s like Jurassic Park, where the dinosaurs seem certain to devour the heroes, and almost manage to do so — about 1100 times. After the first few hundred near-death experiences, you realize that the only logical reason this won’t go on forever is that the theater has to be cleared for another showing of Jurassic Park. Expectation diminishes, long before the show is over — although you may be willing to see it again, in a few years, once the specific memory wears off. That’s the way the language of prediction often works, or fails to work.

As long as the idea of socialism has existed, its priests have predicted the downfall of the capitalist system. When each seemingly fatal contingency proved not to be fatal, another contingency was identified; when that failed to produce the climax, a third came into view . . . and so on. Some people were willing to return for Downfall of Capitalism 2, 3, and 4; but others sensed that the plot had no logical ending, after all, and sought something different.

So new performances began in the Theater of Prognostication. Followers of Malthus demonstrated that civilization would perish through “over-breeding.” Eugenicists showed that it would end by the over-breeding of the “unfit.” For many generations, journalists computed the size of “the world’s proven fuel resources” and demonstrated that unless alternative sources of energy were found, the engines of the world would stop. In 1918, the world was assured that it was about to be made safe for “democracy.” Then it was assured that it was on the brink of unimaginable prosperity, to be produced by “technocracy.” After that, it learned it was about to be completely destroyed by a new world war. When the war came, but did not succeed in destroying the world, optimists prophesied an imminent “era of the common man,” while pessimists prophesied annihilation by the atom bomb. For generations, the “doomsday clock” of The Bulletin of the Atomic Scientists stood at a few minutes till midnight. It still does — because now it registers not only the purported danger of atomic war, but also the purported likelihood of destruction by “climate change.” In other words, another movie has hit the theater.

The subject changes; the language does not. It’s always apocalypse on the installment plan. You buy it in little doses. First, “evidence seems to show”; then, “all the evidence shows”; after that, “experts are all agreed.” The only thing lacking is clear language about exactly how and exactly when the event will happen.

In the early 20th century, many millions of words were spilled about (A) the world’s inevitable political and social progress; (B) the world’s inevitable destruction in a great war. But when the first great war began, there was nothing of inevitability about it. If Russia, France, Germany, and Austria-Hungary had decided, as they might easily have decided, not to bluff one another, or to call one another’s bluffs, about the insignificant matter of Serbian nationalism, there would have been no World War I. In the 1930s, world war was regarded as inevitable by people terrorized by new types of weapons and by the traditional bogeys of “monopoly capitalism” and “western imperialism.” When war came, it wasn’t ignited by any of those things, but by the least predictable of world historical factors: the paranoid nationalism of the Shinto empire, and the weird appeal of Nazism, embodied in the unlikely figure of Adolf Hitler.


There’s nothing much to the predictive power of human intelligence. But if you refuse to be gulled by apocalyptic lingo, what will happen to you? Here’s what. You’ll be told that you are “in denial.” Even more repulsively, you will be told that “denial is not just a river in Egypt.” (Isn’t that clever?) You will be accused of not believing in Science, not respecting the Environment, not caring about History, and so forth. You will be accused of all the characteristics exemplified by the prophets of doom themselves: ignorance, arrogance, and not-listening-to-others.

But let’s see how Harold Camping, Family Radio’s leader and prophet, compares with the other leaders and prophets we’ve considered. In one way he is exactly similar — his use of what Freud called “projection.” In every broadcast Camping warns his audience against arrogance, unloving attitudes toward other people, impulsive and subjective interpretations of the Bible, and submission to the mere authority of so-called Bible teachers. And in every broadcast he denounces members of ordinary churches for failing to heed his prophecies; rejoices in the pain they will suffer on May 21, when they realize that he was right and they were wrong; and suggests that anyone who disagrees with him is denying the authority of the Bible itself. On April 28, his radio Bible talk concerned the dear people in the churches, whom he reluctantly compared with the 450 prophets of Baal who were slaughtered by Elijah because they trusted in their own ideas and wouldn’t listen to the true prophet — a prophet who, like Camping, was favored by God because he was absolutely certain that what he said was true.

But — and this is the most important thing — Camping has a dignity and intellectual integrity denied to most other predictors, including those most esteemed in our society. He doesn’t speak in generalities. He predicts that Judgment Day will come, without any doubt or question or problem of definition, on a particular day: May 21, 2011. In all likelihood it will begin with a great earthquake, which will devastate New Zealand and the Fiji Islands, at sundown, local time. After that, the wave of destruction will circle the globe, with the setting sun. By May 22, the Rapture will have been concluded; “it will all be over!”, and everyone will know that it is; the whole thing is “completely locked in.” In making his prophecies, Camping is actually risking something. He is actually saying something, not just uttering fortune cookie oracles.

I use that phrase advisedly. The other night, Mehmet Karayel and I dined at the Mandarin, and as always we were interested in seeing what our fortune cookies had to say. Mehmet’s fortune was a typically New Age, ostensibly precise manipulation of words. It said, “You may attend a party where strange customs prevail.” Yeah, right. Or he may not attend such a party, or the customs may be “strange” only to ignorant people, or the customs may be present but not prevail, etc.

Mehmet is a real intellectual, not a person who plays one on TV, so he was not taken in by this exciting forecast. Then came the unveiling of my fortune. It was, believe it or not, “Your future looks bright.” Can you imagine a feebler thing to bake into a cookie? But Mehmet is aware of Mr. Camping’s prophecies, so he knew how to strengthen the message: “It should say, ‘Your May 22 looks bright.’”

So Mehmet and Harold Camping, though disagreeing firmly on specifics, stand together in demanding that they be produced. Mehmet is certain that my May 22 can be bright; Camping is certain that it can’t. But they both know that only one of them can be right about a proposition like this, and that we’ll soon find out which one it is. How refreshing.

I must admit that not everybody in Camping’s outfit is up to his high intellectual standard. On April 26 I received a plea for contributions to Family Radio (which, by the way, has a good deal of wealth and doesn’t really need many contributions — but why not ask?). The plea came in two parts. One was a brief letter from Camping, requesting my “continued involvement” in FR’s ministry, because “Time is running out! The end is so very near, with May 21, 2011, rapidly approaching.” The second was a pledge card, where I could check the amount of money I planned to give “each month.” As I said in my December article, there is evidence that some people at FR are biding their time, trying to keep the organization together so it can continue — under their leadership — after the failure of May 21. My speculation is that the pledge card is their product, and they don’t mind contradicting Camping’s message in the letter. They may even be alerting “supporters” like me to keep the faith: my May 22 does indeed look bright.

But that’s a digression. Harold Camping is not a politician or a professor of environmentalism, whose prophecies can never be proven wrong because they’re ridiculously non-specific. No, he has said exactly what he means by the end of the world, and he has said exactly when the end of the world will happen. You can check it. I hope you do. Go to Family Radio’s website, find out where its nearest radio station is, and tune in during the evening of May 20 (US time), when, Camping believes, Judgment Day will begin in the Fiji Islands. Then listen through the next few days, as Family Radio responds to the disconfirmation of its prophecies. Or does not respond — until it figures out how to do so (and that should be interesting also).

As I’ve said before, this is a once in a lifetime opportunity. It will be much more interesting than listening to the constant din of the secular prophets — politicians, historians, economists, and environmentalists — whose intellectual stature, compared to that of Harold Camping, addlepated prophet of the End Time, is as nothing, a drop in a bucket, and the small dust that is wiped from the balance.



Another Worm Turns


I have reflected before on the pivotal battle fought by Republican governor Scott Walker in Wisconsin to modify the collective bargaining rights of most public employees. Despite an unprecedentedly brutal fight, with union-backed Democrat legislators fleeing to a different state to deny the Republicans a quorum, an extraordinary anti-Walker and anti-Republican propaganda blitz bankrolled by the unions, and massive demonstrations organized and also bankrolled by them, Walker and the Republican legislators succeeded in pushing through their reform bill.

The unions then doubled down, running a multi-million-dollar campaign to replace an independent state supreme court judge with a union stooge committed to nullifying the law. That failed, and the unions were stunned. They have seldom met with such abject failure before; the experience is beyond their ken and their ability to comprehend.

Now another blow to union domination has been struck in, of all places, Massachusetts!

Yes, lawmakers in the Massachusetts House of Representatives have voted, by an almost 3-to-1 margin, to cut back municipal employees’ ability to bargain collectively for healthcare benefits. (The Massachusetts bill, by the way, includes the police. It thus goes farther than the Wisconsin law, which exempted police and firefighters from the new rules.)

In particular, the law would give local authorities (such as mayors) of the more than 350 cities and towns in the state the power to modify municipal employee healthcare benefits, such as by setting deductibles and copayments. The unions are pushing an alternative: if municipal officials and municipal union negotiators cannot reach an agreement, the dispute would go to binding arbitration. This is a common union ploy. Unions know that arbitrators don’t have to face the electorate, and don’t have to balance budgets, either.

Even more remarkable is the fact that the move to rein in Massachusetts’ public employee unions was and is led in great part by Democrats. This is the most dramatic illustration of a growing split in the Democratic Party coalition: more and more Democrats are seeing that their cherished progressive programs are being defunded by the boundless greed of the public employee unions, bogarting all the tax revenues collected by the progressive states. In the case of Massachusetts, healthcare benefits for municipal workers have more than doubled in a decade. And the compensation packages (pay, pension, and healthcare benefits) for municipal workers are consuming about three-quarters of the average municipal budget.

As in Wisconsin, the unions fought viciously to stop the bill they did not want, but to no avail. And shocked they were at the turn of the worm. Robert Haynes, head of the Massachusetts AFL-CIO (which represents over 175,000 municipal employees), squawked, “It’s pretty stunning. These are the same Democrats that all these labor unions elected. The same Democrats who we contributed to in their campaigns. The same Democrats who tell us over and over again they’re with us, that they believe in collective bargaining, that they believe in unions. . . .” You can just hear his outrage: dammit, in the good old days, when you bought off politicians, they stayed bought!

Haynes vowed that the unions would keep fighting the reforms to “the bitter end,” and that the union myrmidons would target those renegade Democrats as surely as they target Republicans. His union is planning to expand its demonstrations (bringing in police and firefighters to participate), run more ads, and step up its lobbying.

Speaker of the House Robert DeLeo, a Democrat, offered a couple of large concessions to the unions and to members of his own party who are frightened by the unions. The first would give public employees 30 days to argue against proposed changes in their healthcare plans (though without the power to stop the changes), and would give union members 20% of any savings for the first year from contested healthcare changes that local officials impose. (Considering that the bill will likely save taxpayers $100 million a year, this is a pretty good deal.) The proposal brings the bill closer to what Governor Deval Patrick has himself offered. But the unions are still opposed.

It is unclear whether the bill will make it through the Massachusetts Senate, and if it does, whether Patrick will sign it. He is a very progressive liberal Democrat, but his state faces a nearly $2 billion deficit.

However, the win in the Massachusetts House is already telling. It says that we are coming to the place where voters are no longer ignorant of how much the public employee unions have been ripping them off. This awareness is only beginning to grow. It will accelerate quickly as the retirement of the Boomers brings to light just what massive fiscal frauds the employee unions have committed, and as people see their social services begin to crumble.

It also says that there is a limit to how long Democrats will do the unions’ bidding. Under public choice theory, we assume that all actors in the political process (voters, special interest groups, and politicians) are motivated primarily by self-interest. The politician wants to be reelected, and he knows that in most situations, the public isn’t paying attention to what laws are being enacted, while the special interests, such as unions, are. So it makes sense to screw the voters in exchange for special interest financial support. But when the public is aware and concerned, the politician knows that special interest money won’t compensate for the lost votes. In such cases, where voters are no longer rationally ignorant, the politician will be forced to defy the special interests.
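To make the reasoning concrete, here is a toy public choice calculation, sketched in Python. It is purely illustrative and entirely mine, not anything from the Massachusetts fight: every number, name, and parameter in it is invented. It simply restates the paragraph above as a vote-counting inequality — a politician sides with the special interest only while the votes its money can buy exceed the votes lost among voters who are paying attention.

    # Toy public-choice model; all figures below are hypothetical.
    def backs_special_interest(money: float,             # interest-group dollars
                               votes_per_dollar: float,  # votes ad spending can buy
                               electorate: int,          # total voters
                               aware_share: float,       # fraction paying attention
                               anger_rate: float) -> bool:
        """True if backing the special interest still maximizes votes."""
        votes_gained = money * votes_per_dollar
        votes_lost = electorate * aware_share * anger_rate
        return votes_gained > votes_lost

    # Rationally ignorant public: one voter in a thousand is paying attention,
    # so $1 million of support easily outweighs the votes it costs.
    print(backs_special_interest(1_000_000, 0.01, 2_000_000, 0.001, 0.5))  # True

    # When 30% of voters notice, the same money no longer compensates,
    # and the politician defies the special interest.
    print(backs_special_interest(1_000_000, 0.01, 2_000_000, 0.30, 0.5))   # False

On this toy arithmetic, the tipping point is exactly what the paragraph above describes: nothing about the politician changes except the share of voters who are watching.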

We may be reaching that point.

