

Paraders Step in the Right Direction


Every year the Yonkers African American Heritage Community hosts a two-day festival and parade in downtown Yonkers, 15 miles up the river from Manhattan. Every year the Yonkers City Council agrees to provide police, parks, and emergency personnel to serve the event, paying exorbitant overtime fees to do so.

But this year the city told festival organizers that they would have to pay the city's costs to secure the event. The result? The committee opted to host a one-day festival at the community center, instead of the parade. They simply could not afford the tens of thousands of dollars they would have had to pay city workers in order to host the two-day, citywide festival.

This is exactly as it should be. If an event isn't worth tens of thousands of dollars to the people participating in it, why should it be considered worth tens of thousands of dollars to the taxpayers who may not even be attending the event? Or worse, who may be inconvenienced by the parade and the noise?

Earlier this summer the Yonkers Puerto Rican/Hispanic Parade & Festival was canceled for the same reason. When nearby White Plains began billing parade organizers for police and cleanup last year, many of their community organizations also turned to hosting single-location festivals instead of the rowdier and messier parades.

Municipalities across the country should follow this example. Traditions are important. They bring communities together and create bonds across generations. But the details of a tradition can be changed to fit the times. No longer should taxpayers be expected to foot the bill for parties and festivals enjoyed by small groups within the larger groups. Festival organizers should raise money the private way: sell advertising, seek private sponsorships, offer vendor booths, and charge fees. The lessons our mothers taught us apply to municipalities and community organizations: if you can't afford it, don't do it.






Marshall v. Jefferson


In the September 2009 issue of Liberty (in a book review entitled "Liberty and Literacy"), Stephen Cox — ever the analytical wordsmith — extols the content and form of the brilliant first sentence of the second paragraph of Thomas Jefferson’s Declaration of Independence:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty, and the pursuit of happiness.”

Focusing on the form, he paraphrases the passage into run-of-the-mill prose and berates the reader who can’t tell the difference. He says that “language is not just a method of communicating . . . [it] is a way of creating pleasure,” and that if one doesn’t see that, then one is illiterate, knows nothing about writing, and should — go away.

Braving Cox’s acid pen and his usually faultless reasoning, I took issue with his (and nearly everyone else’s) assessment of Jefferson’s passage. I responded in Liberty in November 2009:

“Lofty words. Pure poetry, perhaps — but devoid of any connection to reality. It is not self-evident that 'all men are created equal,' or 'that they are endowed by their Creator with certain unalienable rights,' or 'that among those are Life, Liberty, and the pursuit of Happiness.' One need only look at the history of the Bill of Rights and its ignored 9th Amendment to realize that the only rights citizens retain — much less 'are endowed with' — are those that they explicitly claw from their government; Life, Liberty and the pursuit of Happiness not included. Perhaps they were too 'self-evident'?

“It took nearly 100 years for those three self-evident rights to be included in the Constitution under the 14th Amendment as 'life, liberty, or property.' And even now they’re not secure. 'Pursuit of Happiness' was an elegant albeit vague and meaningless euphemism for property, which Jefferson was loath to include, fearing it might justify slavery. Unfortunately, the omission later caused such an erosion of property rights that there is now popular clamor for a property rights amendment to the Constitution (in spite of the 14th Amendment).


"The slippery nature of even enumerated rights — much less 'self-evidently endowed' rights — comes to mind in Justice Oliver Wendell Holmes’ dissent in the Lochner v. New York case. His particularly perverse interpretation of the 14th Amendment, using original intent, mind you, found that since the amendment was originally written to protect the rights of freed slaves, it could not apply to workers and management deciding the length of their workday. But then, he was famous for declaring that he could decide any case, any way, using any principle. (He’d later go on to find that eugenics, as government policy, was justified under the Constitution.)

“As populist rabble-rousing, Jefferson’s clause is second to none, and in that sense, it is great writing. However, as a description of reality or a recipe for government, it is a complete failure. Therefore I must counterintuitively conclude, being a firm believer in the dictum that form follows function, that the clause in question is neither effective nor elegant writing.”

Only later, after reading R. Kent Newmyer’s legal biography of John Marshall, the fourth Chief Justice of the United States, John Marshall and the Heroic Age of the Supreme Court, did I realize that I was not alone in being skeptical of "natural rights" and advocating enumerated rights instead.

The controversy over enumerated versus self-evident rights began immediately after the Constitutional Convention disbanded and each state was asked to ratify the new document. Contrary to what today’s tea partiers, radical states’ righters, and some libertarians and conservatives believe, the Constitution was created to increase the power of the federal government, both absolutely and over the states. Under the previous arrangement — the Articles of Confederation — the federal government was dependent on the whims of the states — individually, mind you — for voluntary revenues and a host of other items. These constraints made defense of the new country a much tougher proposition — in raising an army and funding a war. In today’s terms, the government established under the Articles of Confederation was about as powerful and effective as today’s United Nations.

To John Marshall, Alexander Hamilton, and a coterie of other patriots who would later coalesce into the Federalist Party, this arrangement spelled ruin for the new country — not only because the United States might not be able to defend itself adequately, but also because it wouldn’t be able to pay its bills dependably, obtain credit, or participate in the foreign exchange mechanisms necessary for international commerce.

Under the old dispensation, individual states were responsible for debts incurred during the Revolutionary War, and some were thinking of defaulting, either from irresponsibility or from spite toward some of the Brits who’d bankrolled them. As Hamilton observed about the debt and its consolidation under federal responsibility, it was "the price of liberty." Additionally, each state (as well as private entities) could issue its own currency. Without the full faith and credit of a central government, the new country would be unable to participate effectively in international trade — a serious impediment under the new, capitalist world order.


Most delegates to the Constitutional Convention appreciated this. Yet, because the new Constitution increased the federal government’s power, some delegates (anti-Federalists, who would later coalesce around Thomas Jefferson and his Democratic-Republican Party), fearing tyranny, fought for a bill of enumerated rights limiting the federal government. The idea that such a bill would be forthcoming may have been a make-or-break point for ratification.

Counterintuitively, people opposed to including a Bill of Rights (many of them Federalists) replied that it was impossible to enumerate all the self-evident rights that the people retained; that enumerating a few rights would guarantee only those; and that the unenumerated rights would forever be lost (think of the right "to privacy," later discovered in penumbras and emanations, together with the right "to earn an honest living," "to marry a person of the same sex," "to marry more than one person," "to cohabitate," "to fight for your country in spite of being gay," "to suicide," "to ingest any substance," "to enter into contracts," "to make obscene profits," etc. — you get the point).

As we all know, the Jeffersonian anti-Federalists won the battle for the Bill of Rights, mostly because the Federalists’ arguments — vague and hypothetical — did not have the immediacy of the fear of tyranny. The odd hitch — to a modern audience — was that the Bill of Rights did not apply to state governments, only to the federal government. Massachusetts retained a state religion until 1833, and the Bill of Rights wasn’t interpreted to apply to the states until the passage of the Fourteenth Amendment after the Civil War.

In order to allay Federalists’ concerns, the Ninth Amendment to the Constitution stated: “The enumeration in the Constitution of certain rights shall not be construed to deny or disparage others retained by the people.”

Unfortunately, that language has become meaningless — as well it should, if examined critically.

Robert Bork, who failed to be confirmed to the Supreme Court because some senators thought him too conservative, both politically and judicially, has likened the Ninth Amendment to an inkblot. In The Tempting of America he argued that “while the amendment clearly had some meaning, its meaning is indeterminate; because the language is opaque, its meaning is as irretrievable as it would be had the words been covered by an inkblot.” From the Left, Laurence Tribe has said that “the ninth amendment is not a source of rights as such.” The best defense of the Ninth comes from Randy Barnett, a libertarian constitutional scholar (and contributor to Liberty), who says that it “calls for a presumption of liberty.”

Marshall, in spite of his Federalism, understood the problem with the Ninth Amendment and advocated the enumeration of rights. He did not believe in natural rights endowed by a creator; he believed that we the people endow ourselves with rights based on expediency and tabulated as positive law. He was a nuts-and-bolts lawyer who believed that the best laws were those that required the least interpretation. Ironically (as we shall see), though he eschewed grandiose philosophical visions and paradigms, he is best known for defining the role of the Supreme Court in the new United States and establishing a rock-paper-scissors hierarchy among the three branches of government that remains the modern modus operandi.


Marshall’s concern with rights was particularly personal. An admirer of John Locke, Marshall was obsessed with the sanctity of contracts to a degree that today might be considered excessively libertarian. He believed that the terms of a contract superseded statutory limitations. For example, in a minority opinion, he opined that bankruptcy laws could not relieve a debtor from a previous obligation. Likewise, he would have heartily supported the 1905 Lochner v. New York decision, which struck down a statute limiting the working hours of bakers as an infringement of the right of employees and employers to negotiate their own contracts. He also believed that legislation was a contractual obligation of government to the citizenry. In Fletcher v. Peck (1810), the Court took up an argument Alexander Hamilton had earlier advanced on behalf of a group of investors — including himself — that a Georgia state law authorizing a land sale was a contract, and that Georgia's Rescinding Act of 1796 invalidating the sale was unconstitutional under the contract clause of the Constitution. Marshall agreed.

These perspectives produced another curious rock-paper-scissors round robin: though Marshall was a strong advocate of the supremacy clause, the phrase in the Constitution stipulating that federal law trumps state law (a big damper on states’ rights), his view of contracts tended to elevate individuals above both state and federal law. But I digress — back to Marshall’s personal concern with rights.

Marshall and his family were speculators in lands west of the Appalachians. Like most libertarians and economists today, Marshall saw nothing wrong with speculation; in fact, he believed that speculators provided services essential to the opening of new tracts for settlement — subdivision, assessment of resources, initial price arbitrage, surveying into lots, market making, etc. The buying and selling of deeds required contracts, the details of which, he believed, were between the parties involved, and should be arrived at with minimal government interference.

But speculators, then as now, had a bad reputation among the substantial portion of the population that didn’t understand their function and thought they were making profits without effort. These folks, Thomas Jefferson prominently among them, believed in the ideal of a republic of small yeoman farmers. Speculators were just an unnecessary — even an evil — obstacle to that ideal.

Jefferson and Marshall despised each other. Though the Democratic-Republican Jefferson managed to restore his friendship with Federalist John Adams, he could not stand fellow Virginian Marshall.

Marbury v. Madison

Though not the beginning of their feud, Marbury v. Madison — one of Marshall’s two most famous decisions — best summarizes their intellectual conflict. It is the decision that established the power of the Supreme Court to overturn congressional legislation through the principle of judicial review, thereby elevating the Supreme Court to coequality with Congress and the Executive.

Judicial review, already a long-standing legal principle in other contexts, was a power not specifically granted to the newly established Supreme Court. Marshall understood that without it, the Court could never properly function as one third of the triad it was designed to be, since an effective separation of powers required three equally potent branches of government. Marbury is a complex and convoluted decision, tight in reasoning, and difficult to explain. I’ll give it a shot.

The case began amid the bitter political conflicts of the waning days of Adams’ administration. The (barely) peaceful transfer of power to the opposition was a landmark in the new nation’s development. Still, anti-Jefferson riots were expected in the capital.

In a bold attempt to curtail the new administration’s power, Adams nominated Marshall, his Secretary of State, as the new Chief Justice of the United States. The Federalist-controlled lame-duck Congress not only quickly confirmed him, it also passed a law authorizing the appointment of a number of justices of the peace to govern the District of Columbia in case the riots materialized. Adams immediately appointed 42 Federalist judges.

Jefferson was livid. As Newmyer says:

“Unfortunately for historians, there were no cameras to record the deliciously ironic moment on March 4, 1801, when the new chief justice administered the oath of office to the new president. With his hand on the Bible held by Marshall, Jefferson swore to uphold the Constitution Marshall was sure he was about to destroy. . . . It was not coincidental that Marshall turned his back to the president during the ceremony. . . . Jefferson had already concluded that the federal judiciary had to be humbled and ‘the spirit of Marshallism’ eradicated.”


The new appointments were duly signed and sealed but, ominously, not all of them were delivered by the Secretary of State (still John Marshall), whose job it was to finalize the procedure but who had had only the last week of the Adams administration in which to comply. When James Madison, Jefferson’s newly appointed Secretary of State (and an author of the Constitution), assumed his duties on March 5, he discovered the remaining undelivered appointments.

Jefferson ordered Madison not to deliver the commissions. Enter William Marbury, one of the prospective justices appointed by Adams. He and three other denied appointees petitioned the Supreme Court for a writ of mandamus directed at Madison, in essence ordering him to comply.

Meanwhile, the new Democratic-Republican Congress repealed the legislation that authorized the appointments in the first place and, adding fuel to the fire, cancelled the 1802 Supreme Court term, Marshall’s first. The intervening period permitted Marshall and his colleagues to ponder the constitutionality of events, the dangers of challenging executive authority head-on by issuing the mandamus, and the formulation of strategy.

For two years, the Court’s powers, or lack thereof, had been debated in Congress and in the court of public opinion. The Court had even been a focus of Jefferson’s political agenda. Specifically, was the Supreme Court subservient to Congress or the Executive or both, or was it equal in stature and power? Marshall was looking for an opportunity to settle the debate, and Jefferson gave it to him when he blocked Adams’ judicial appointments.


In February 1803, the Court came out fighting, opening its term with Marbury v. Madison. Immediately, Jefferson — claiming executive privilege — insulted the Court by refusing to permit US counsel to appear or executive witnesses to be heard. And he continued to stonewall, micromanaging executive witnesses even when the Court established, after much technical to-ing and fro-ing, that it did indeed have jurisdiction in the case, and that it could go forward.

In what was to become his typical fashion, Marshall (with a unanimous Court) decided the case on narrow grounds: the rule of law. He stated that Marbury’s office was vested when President Adams signed his commission; that at that point — irrespective of mundane details — the operation of law began. Marshall, to Jefferson’s great irritation, virtually lectured the new president that he was not above the law.

So, where is the judicial review, the Court’s power to overturn congressional legislation, for which Marbury v. Madison is so well known? It appears in the last six pages of the 26-page opinion, in which the Court struck down section 13 of the Judiciary Act of 1789. Marshall's reasoning almost became the proverbial camel passing through the needle's eye.

In the Act, Congress had magnanimously granted the Supreme Court the right to issue mandamus writs, reasoning that, since the power wasn’t specifically granted by the Constitution — and the Court couldn’t very well function without it — it was necessary for the Court to have it.

Marshall disagreed on two major points. First and foremost, he declared that Congress — through simple legislation — could not change the Constitution, and that only the Supreme Court had the power to interpret it. Second, for a variety of reasons, Marshall decided that the Constitution already gave the Court the power to issue mandamus writs.

Confused? There is no doubt that Marshall was out to prove a point and — with some fancy footwork — had to weave a sinuous path to make it. Luckily (and, some say, with Marshall’s prodding and collusion), circumstances, timing, allies, and even adversaries all fell into place for him. Almost all aspects of the decision are still debated, even whether it was necessary at all, mainly because many commentators believe the Constitution already implicitly grants the power of judicial review to the Court. In Federalist No. 78, Hamilton opines that judicial review is not only a power of the Court but a duty. Not one delegate to the Constitutional Convention argued against the principle.


But critics claimed that the Marshall court had vastly overreached. Jefferson himself believed that the president (and the states) had the power to interpret the Constitution, and he forever fulminated against the decision. Congress, however, was not troubled and took it in stride — which leads Newmyer to conclude that, “put simply, it was presidential power, not congressional authority, Marshall targeted.” The Supreme Court’s power of judicial review was extended over the states in 1816 in Martin v. Hunter’s Lessee, another decision of the Marshall Court.

Jefferson versus Marshall

John Marshall was the longest serving — and arguably the most important — chief justice. Serving from 1801 to 1835, he presided over the most formative decisions the new country faced. He helped to establish a balanced, effective, and more manageable government, and helped set the tone for the future sparring among the three branches of federal power. During his term, the Constitution became much more than a founding document — it became something closer to accepted law.

Today most of us perceive political parties as somewhere within the Left-Right continuum. It is difficult to see things in any other way: that’s how today’s politics play. But the Federalist versus Democratic-Republican divide was an entirely different one.

Although most people today associate Jefferson with individual rights and a fundamentalist view of the Constitution, and the Federalists with advocacy of a strong central government, the distinctions are not so neat and clear-cut. For one thing, the Democratic-Republicans supported slavery, while the Federalists generally opposed it. These positions led the Jeffersonian tradition directly to the policies of Jackson, Calhoun, and, finally, Jefferson Davis, while the Federalist tradition led to Lincoln.

The irony here is that those who were most skeptical of the Constitution are the ones referred to as “strict constructionists,” while the Federalists are regarded as free-wheeling interpreters of its provisions.

The Federalists were the first to see and understand the failure of the Articles of Confederation, so they pushed for change. The anti-Federalists thought that the Articles could be tweaked for improvement and were skeptical about the whole constitutional enterprise. In the end, they accepted it reluctantly — and showed it. To them, “strict constructionism” meant that if the Constitution granted one the right to eat, the right to obtain food didn’t automatically follow. Or, if it granted the right to free political speech, the right of media accessibility for broadcasting that speech — since it wasn’t actually spelled out — didn’t exist. Such thinking, often imbued with deep resentment, led to muddled action, ambivalence, and, sometimes, a reversal of roles — with the president himself leading the way.


Jefferson’s cavalier attitude toward the Constitution was shown early in his presidency, with his 1801 attack on the Muslim state of Tripolitania on the Barbary Coast (Tunisia, Algeria, Morocco, and present-day Libya) without a congressional declaration of war, which contemporary opinion believed he was constitutionally obligated to obtain. Several of the Barbary states had demanded tribute from American merchant ships in the Mediterranean. When the Americans declined, the Pasha of Tripoli captured several seamen and held them for ransom. Jefferson, expecting an immediate victory, ordered a squadron of ships to destroy the Muslim navies. The war dragged on for almost 15 years.

But it was the Louisiana Purchase — about whose constitutionality even Jefferson was skeptical — that was really troublesome. For starters, the Constitution did not empower the federal government to acquire new territory without the universal consent of every state (as per Andy P. Antippas' view in his History of the Louisiana Purchase). Some of the articles of the Purchase Agreement also violated the Constitution: they gave preferential tax treatment to some US ports over others; they violated citizenship protocol; and they violated the doctrine of the separation of powers between the president, Congress, and the judiciary.

As Antippas recounts:

“Jefferson and his fellow Republicans were 'strict constructionists.' i.e., they allegedly adhered to the letter of the Constitution and were strong proponents of 'state’s rights' and 'limited government;' however, Jefferson and most of his party members chose simply to ignore all the Constitutional issues as merely philosophical for the sake of expediency — Jefferson’s response to his critics was 'what is practicable must often control what is pure theory' — in other words, 'the end justifies the means.'"

Jefferson’s specious argument to his critics was that the federal government's power to purchase territory was inherent in its power to make treaties. The Senate bought that argument and ratified the Louisiana treaty.

In their individual approaches to personal liberty, Jefferson’s and Marshall’s actions speak volumes. As I’ve already mentioned, Marshall was fanatically laissez-faire, while Jefferson favored greater economic regulation for what he thought was the good of society. Specifically, Jefferson favored a society of agrarian smallholders and did not approve of speculators buying up western lands as soon as they were available — he wanted smallholders to get in on the action right away. He did not understand the redeeming socioeconomic value of speculators, abhorred their — in his view — unearned profits, and advocated restricting or eliminating these — again, to him — unnecessary middlemen, prominent among whom were Marshall and his family.

Both were Virginians and slaveholders, but their treatment of slaves differed markedly. Jefferson is known to have beaten his slaves; there is no evidence that Marshall ever did. In his will, Marshall wisely granted more liberty to his slaves than we might intuitively suppose today. He gave them two options upon his death: liberty, with severance pay, so they could set themselves up independently (or emigrate to Liberia); or continued servitude, in case the radical transition to liberty was more than they could handle. In 1781, near the end of the Revolutionary War, 23 of Jefferson’s slaves escaped to the British.

Marshall's and Jefferson's approaches to Native Americans were even more illuminating. Though Jefferson’s words spoke respectfully, even admiringly, of the noble savage, his policies began the trail of tears that would destroy cultures and result in the reservation system.

As soon as Louisiana was purchased, Jefferson embarked on a cold-blooded policy toward Native Americans. In a lengthy letter to William Henry Harrison, military governor of the Northwest Territory, he explained that the nation's policy "is to live in perpetual peace with the Indians, to cultivate an affectionate attachment from them by everything just and liberal which we can do for them within the bounds of reason." But he went on to explain "our" policy (presumably his own, and that of the United States) on how to get rid of every independent tribe between the Atlantic states and the Mississippi, through assimilation, removal, or — if push came to shove — "shut(ting) our hand to crush them." Finally, in secret messages to his cabinet and Congress, Jefferson outlined a plan for the removal of all Native Americans east of the Mississippi to make sure that the land would never fall to the French or the British, who chronically supported the Indians in their disputes against the US.

Marshall, in contrast, did everything he could to prevent the confiscation of Indian land and the eviction of the Indians from Georgia, in a series of cases collectively known as the Cherokee Indian cases.

As a young man, while serving in the Virginia House of Delegates in 1784, Marshall supported a bill that encouraged intermarriage with Native Americans. Three years later, in the Indian slave case of Hannah v. Davis, he argued successfully that Virginia statute law prohibited the enslavement of Native Americans.

The Cherokee Indian cases, too long and complicated to detail here, came before the Court in the early 1830s, when Andrew Jackson was president. The final one, Worcester v. Georgia, turned on the supremacy clause. In a bald-faced land grab, Georgia had declared sovereignty over Cherokee lands. The Cherokees sued. The Marshall court decided that Indian affairs were the province of the federal government alone; therefore the Georgia statutes that claimed control over Indian lands were null and void. But neither Georgia nor Jackson — both strong states’ rights advocates — nor Congress supported Marshall’s decision. Jackson is reputed to have said, “John Marshall has made his decision, now let him enforce it.” Congress, meanwhile, had passed the Indian Removal Act of 1830, which Jackson heartily endorsed. It was a Pyrrhic victory that few Cherokees savored on their forced march along the "trail of tears and death" to Oklahoma.

McCulloch v. Maryland

Besides the fundamental issue of judicial review, another issue remained to be addressed in order to make the Constitution something closer to ultimate law, as opposed to simply a guiding, founding document: objective guidelines for practical interpretation. Other than the Federalist Papers, which were not law, firm guidance about constitutional interpretation was lacking.

At one end of the spectrum was the strict fundamentalist approach (see examples already mentioned above), akin to religious fundamentalist interpretations of the Bible: virtually no interpretation. At the other extreme were freewheeling, almost poetic readings — complete with what today are called “penumbras and emanations.” With Holmesian effort, one could interpret anything, anywhere, in any way. This was not only impracticable law, but a recipe for tyranny. In 1819 Marshall got the opportunity — again, with some prodding and collusion on his part — to set the standards.

The Constitution had given Congress the power “to coin money and regulate the value thereof.” Under that clause, the first Congress — with the approval of President Washington — chartered the first Bank of the United States, legitimized by that power to coin money. Though Jefferson opposed it, he left the bank in place. He and his political friends referred to it as "Hamilton’s bank." Being strict constructionists, they argued that it was Congress that had been empowered by the Constitution to coin money, not the Bank. However, four years without the bank during the War of 1812 spoke eloquently about the value of its services. When a charter for a second Bank of the United States (the first charter ran for only 20 years) was introduced in 1816, it had the support of President Madison, who signed the bill into law. Only Virginia's congressional delegation voted (11 to 10) against the bank.


Though the Constitution had empowered the federal government to coin money, it had not explicitly barred states from doing so. Virginia, a staunch states’ rights advocate, kept its Bank of Virginia, headquartered in Richmond. And therein lay one of the rubs: the branch of the Bank of the United States stationed in Richmond was too much competition for the alternative, state banking system.

Exacerbating the dispute was the mismanagement of the Second Bank of the United States, which — shades of today’s crisis — had provided easy credit for a land boom in the south and west. When the bank called in its improvident loans to state banks — to cover its own debts, which it had improvidently incurred — the default of banking institutions swept like wildfire across the southern and western states.

Battle lines were drawn. The states moved against the national bank. Ohio, in the most radical reaction, outlawed the Bank of the United States, invoking the theory of “nullification,” according to which states could cherry-pick federal laws, rejecting whichever they chose. Nullification had been around since 1798; it gained support from none other than Jefferson and Madison. (Though it must be admitted that when pressed about the constitutionality of nullification, Madison hedged and declared that it was an extraconstitutional option.) Taken to extremes, nullification implied the right to secede, with each state being judge of the constitutionality of its own cause; and, as Ohio later tried to do with McCulloch v. Maryland, the right to reject Supreme Court decisions. Though the Civil War and numerous Supreme Court decisions have hacked both the legs and arms from nullification, like the Black Knight in Monty Python and the Holy Grail it keeps coming back for one more round. Russell Pearce, an Arizona state senator, is sponsoring the latest nullification bill. Libertarians should reflect on the fact that nullification cuts both ways: it is at least as likely to be used to nullify as to uphold individual rights.

To cover its debts, Maryland passed a law taxing — in the most punitive and unconventional manner — the Bank of the United States. James McCulloch, head of the bank's Baltimore branch, refused to pay the tax. Maryland sued.

When the case finally reached the Supreme Court in 1819, Marshall found for McCulloch — less than a week after the conclusion of oral arguments, leading some to wonder whether he’d written the decision before hearing counsels’ arguments. As well he might. Not only was the case “arranged” (both parties sought an expeditious decision), but Marshall apprehended the issues immediately, commenting to a fellow justice that “if the principles which have been advanced on this occasion were to prevail, the constitution would be converted into the old confederation.”

Many principles were in play in Marshall's decision: federal supremacy over the states within a constitutional sphere (Maryland could not punitively tax a federal institution); judicial review (reaffirmed); nullification (denied — state action may not impede valid constitutional exercises of power by the federal government); and, finally, the most important issue, implied powers.

Invoking the necessary and proper clause of the Constitution, Marshall declared:

“Let the end be legitimate, let it be within the scope of the constitution, and all means which are appropriate, which are plainly adapted to that end, which are not prohibited, but consist with the letter and spirit of the constitution, are constitutional.”

Here, finally, was an objective guideline for the interpretation of the Constitution. Accordingly, the constitutionality of the Bank of the United States was established without a doubt. By extension, its heir today, the Federal Reserve, is a valid, constitutional entity empowered by Congress “to coin money and regulate the value thereof” — however much we may disagree with its methods and their effects.

* * *

Much controversy over the Constitution and its meaning continues — witness the Russell Pearce nullification bill and the calls for the abolition of the Federal Reserve. Even the Bill of Rights is not fully settled law. During recent arguments before the Court, Justice Elena Kagan sought to minimize the importance of an attorney’s statement, with which she disagreed, by referring to “buzz words”: “heightened scrutiny” and “rational basis.” These words refer to the standards a court should employ in assessing the impact of governmental action that may affect individual rights. As Chip Mellor of the Institute for Justice has stated in Forbes, contemporary judicial activism has “creat[ed] a hierarchy of rights with those at the top (like the First Amendment) receiving relatively strong protection — the heightened scrutiny — and those at the bottom (property rights and economic liberty) receiving very little," since these latter are thought to require a "rational basis" for review.


The Constitution is only "settled" law in the sense that nearly all Americans accept it not only as our primary founding document but also as the lawful basis for our government. In many other respects, it is far from "settled" — witness the widely varying interpretations ascribed to it and the continuing legal battles over exactly what it means and how to apply it. John Marshall, in Marbury v. Madison and McCulloch v. Maryland, set parameters within which that debate should productively take place. Understanding those two cases — and Marshall's perspective — is essential to an informed understanding of our government's structure and powers.

As libertarians, we can be most effective if we work within the framework of accepted law to protect and extend liberty, rather than making ineffective flanking attacks from the swampy fringes, armed with quixotic arguments. The Constitution must be scrupulously and objectively interpreted, with due respect for Marshall’s great tradition: first, as Randy Barnett has suggested, according to the “original meaning” of the words used at the time; then according to “original intent” — a less stringent bar, requiring interpretation of documents such as the Federalist Papers. This approach to interpreting the Constitution is a firmer bulwark for liberty than the well-intentioned but murky intellectual musings of Jefferson, which — though noble and intelligent — are no substitute for tight legal reasoning.






Irene: The Man-Made Disaster


I am a victim of Hurricane Irene.

My friend and I were visiting New York when Irene “struck” early today — Sunday, August 28. We had plane reservations to leave the city on Saturday, August 27. Delta Airlines canceled our reservations on Friday afternoon. It, like all the other airlines, abandoned traffic to New York more than 24 hours before any hurricane could possibly have caused trouble at the airports. Because of these cancellations, travel throughout the nation was convulsed.

None of this was necessary, or wise, or profitable to anyone. It was the result of a panic induced by government and media, and willingly indulged by the kind of corporations that have acquired the worst characteristics of both — arbitrary power and a zest for misinformation. When our reservations were zeroed out, we were emailed, almost a day after the fact, “Your flight has been cancelled” (no apology, no explanation); then we were told that “we have rebooked you on another flight” — two days later. Notice the transition between the passive voice, which people in power reserve for the bad things they do, and the active voice, which they choose for the good things they don’t do. Our flight wasn’t rebooked by the airline; it was rebooked by us, after we pestered the airline and they eventually returned our call, and after we were unable to rebook it on the airline’s website, which wasn’t working. The woman who finally assisted us acted as if it was an amazing idea that we should be reimbursed for the downgrade of our tickets from first class to coach.

But let me report a few highlights of this ridiculous exercise in misinformation and authoritarianism, by which all America was damaged by a minor storm.

On Wednesday, ABC reported that the hurricane, then reputedly a category 3, or maybe 2, “could be category 4 by Thursday.” Other media, including the Weather Channel, suggested that it would be. When the hurricane came ashore in North Carolina on Saturday, it was barely a category 1, something that the media geniuses never believed could happen to their darling, “the hurricane of a lifetime,” although normal people easily guessed it. By Saturday evening, Irene was visibly disintegrating, had lost its eye, and was about to become a mere tropical storm, and not an especially strong one. Yet at that time, the mayor of New York was strongly advising all people to stay at home between 9 p.m. Saturday and 9 p.m. Sunday, had closed all mass transit at noon on Saturday, had sent his goons out to advise people living in 30-story buildings that they ought to evacuate, because the park next door might flood, and was telling workers to plan on mass transit still being shut down during their Monday morning commute. He seemed to enjoy himself, decreeing fates like that.

Businesses were closing everywhere in Manhattan, because of the mass transit shutdown, but my friend and I found a restaurant, “Da Marino,” that promised to be open on Saturday evening, and on Sunday evening if possible. To deal with the transit problem, the management had rented rooms for their employees in a hotel next door. So on Saturday night we enjoyed a good meal and listened while people accurately identified Bloomberg as the man who was causing the mess. But most merchants had shut down on Saturday afternoon, or failed to open that day at all. All Starbucks stores shut down. Pastry shops that cater to the local hotel business shut down, even though they had a captive mob of customers. Madame Tussaud’s shut down. Even churches canceled their Sunday services. Leaving Da Marino after an excellent dinner, served to customers reported to be more numerous than at any time in the restaurant’s history, my friend and I looked down Broadway from 49th Street to Times Square. The lights were on, but there was no crowd, no life, no business. A few people drifted across the street, in posses of two or three. Official vehicles could be seen in the distance, idling and flashing their lights. A faint drizzle of rain came down. That was the Great White Way on Saturday evening, August 27.

And why? Because the official class decreed that there should be a disaster.

Back in our hotel, we turned on the disaster reports on TV. Local news was enthusiastic about a picture of Grand Central Station standing empty except for cops who were there to fend normal people off. “No reason why you should go there anyway,” the news anchor said. A young newsperson, standing on location amid a few drips of water, predicted that soon, very soon, the neighborhood in which he stood would be hopelessly flooded. Anchorpeople advertised the fact that 4,000 people were now without electricity in New Jersey, New York, and Pennsylvania, not stating how many of the millions who live in those areas are without energy at any normal time. The electric company, prompted by the mayor, threatened to cut off energy “preemptively” to large areas of New York City, allegedly to protect its equipment against flooding. And to make matters worse, yet another of Bloomberg’s constant news conferences was threatened.

My friend and I fell asleep. When we awoke at 10 on Sunday morning, the rain had gone; the sun was shining; and people were walking the streets, sans umbrellas, hunting for places to eat. Places to enjoy. Places to honor with their business. Places that had survived the onslaught of paternalism.

Soon we will hear how many billions of dollars Hurricane Irene cost the nation. But remember: the hurricane itself was responsible for virtually none of those losses. This was a man-made disaster.






More on Government Motors


Earlier this year, President Obama went on one of his gloating tours, touting the wisdom of his nationalization of General Motors and Chrysler. Theirs was a corrupt bankruptcy that strongly rewarded the UAW, one of Obama’s major financial contributors. The new GM then posted a few months of improved sales, leading to much crowing by all the corrupt cocks.

But lately, the road for what is derisively termed “Government Motors” has become rather bumpy, as a recent story illustrates. The report describes how the New GM is trying desperately to get a class action lawsuit, filed on behalf of 400,000 Chevy Impala owners, dismissed.

The suit, filed by one Donna Truska, argues that the Impalas — made between 2007 and 2008 — had defective rear spindle rods, leading to rapid tire wear. The plaintiff claims that GM has breached its warranty, and demands that GM fix the cars.

But the New GM argues that since the cars were made by the Old GM, it is not liable for the repairs, and the 400,000 Impala owners should therefore go to hell. Of course, the New GM was only too happy to take over the Old GM’s losses — using them to reduce its own future taxes and thereby stiff other taxpayers — but it doesn’t want to assume any of the Old GM’s liabilities.

And of course, back in March of 2009, as GM headed toward bankruptcy, Obama promised that in any action he took to “save” GM, consumers would have their warranties honored. As he trumpeted at the time, “Let me say this as plainly as I can. If you buy a car from Chrysler or General Motors, you will be able to get your car serviced and repaired just like [sic] always. In fact, it will be safer than it has ever been. Because starting today, the United States will stand behind your warranty!”

Another Obama lie, of course. He stood behind GM warranties about as much as he has stood behind the American dollar . . .

Meanwhile, the New GM’s stock hit a new low of $22 a share.






Seen and Unseen


Recently, President Obama stumbled through a poorly conceived bus tour of several states in the Midwest. The object of the junket seems to have been to counter media coverage of the GOP presidential candidates who’d gathered in Ames, Iowa, for the first major straw poll of the 2012 election cycle. Instead, Obama made comments about car- and truck-manufacturing that reminded listeners of his central-planning mindset. He treated a group of Tea Party leaders in a haughty and condescending manner. And, most damning, he stammered through the following self-justification:

“We had reversed the recession, avoided a depression, gotten the economy moving again. But over the last six months, we’ve had a run of bad luck.”

The man is not good at improv. And he’s not well-read. Numerous pundits (not all of them right-leaning) noted that the president’s excuses reflected this famous quote from the great Robert Heinlein:

“Throughout history, poverty is the normal condition of man. Advances which permit this norm to be exceeded — here and there, now and then — are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from creating, or (as sometimes happens) is driven out of a society, the people then slip back into abject poverty. This is known as ‘bad luck.’”

How could Obama, a man who trades on being seen as smart and articulate, make such a boneheaded gaffe?

I think this has to do with the ignorance and insular nature of American statists. They operate under a simplistic notion of politics — call it “Manichean,” if you feel like being generous about their philosophical grounding, “infantile” if you don’t.

Their adversaries are “enemies”; their enemies are “terrorists,” “extreme,” and “crazy.” (The quoted terms in that last sentence are from a few recent articles posted on dailykos.com — but Obama and his underlings have used them, too.)

These mutterings reflect a shallow worldview. American collectivists haven’t read the books that define and expand on free-market philosophy; most justify their ignorance by dismissing Hayek, Mises, Rand, et al. as “evil.” Instead, they seem to skim some magazines and websites. Mostly, though, they watch TV. And they focus on the personal manners (and lives) of limited-government advocates, rather than the substance of their positions.

Obama’s supporters focus on (and equate themselves with) the weakest and least rational of the president’s critics — a motley crew of bigots and conspiracy mongers. As a result, Obama’s supporters weaken themselves. They can’t understand that there are rational criticisms of a president who has done so much damage to the philosophical and political foundations of the United States.

They just don’t see.

This blindness has rendered Rep. Ron Paul — the most effective advocate of real limited government among the recognized presidential candidates — something of an invisible man.

Because you read Liberty, you know more about Dr. Paul and his latest campaign for the White House than do most Americans. But, for a moment, put yourself in the shoes of an ordinary salt-of-the-earth citizen or even an impassioned Obama supporter. You’d probably have only the vaguest sense of who Ron Paul is. And you wouldn’t understand how many people share Paul’s perspective and beliefs. When Paul finished a razor-close second in the aforementioned Iowa straw poll, you’d have fallen back on your epithets. Or just denied the whole thing.

Rep. Debbie Wasserman Schultz (whose retiree-heavy Florida congressional district gobbles up more than its share of federal benefit dollars) took the first option. Here’s some of the invective she hurled about the Iowa straw poll results, via CNN:

“In previous presidential campaigns, we might have chalked extreme fringe-type candidates like Michele Bachmann and Ron Paul as an anomaly. . . . But we’re looking at the core of the Republican Party now. The heart of the Republican Party is the extreme right wing.”

No surprise. A woman who counts on scaring pensioners to maintain her livelihood is bound to vilify people who talked about benefit cuts. I’d just like, once, to read a quote from the wretched Ms. Wasserman Schultz that didn’t include the word “extreme.”

Most statists, though, have simply chosen to pretend Paul doesn’t exist. He and Rep. Michele Bachmann finished in a near-tie for first place in the Iowa straw poll, separated by less than 1% of all votes cast. Mrs. Bachmann won but, if the vote had been an actual election, many jurisdictions would have called for an automatic recount. The next Sunday, Bachmann appeared on all five of the so-called “major” weekend TV news programs; Paul appeared on none.


Bachmann — whose public persona strikes some as addled — serves as a stand-in during this election cycle for the absent Sarah Palin. Perhaps that’s why the establishment media revels in making Bachmann look ridiculous, as a recent and unflattering cover picture on Newsweek magazine proved. Media outlets that still favor Obama seem to be following a strategy of portraying Bachmann as “crazy” and, therefore, any Republican challenger to the president as crazy by association.

Paul is included in this scheme. But, mostly, he’s simply ignored. And this isn’t just a left-wing phenomenon. Fox News Channel’s top-rated host Bill O’Reilly, a statist of a nominally “conservative” stripe, goes out of his way to ignore Paul. And, when pressed, O’Reilly dismisses Paul’s chances of winning even the GOP nomination as “zero.”

In the days after Bachmann’s media blitz, a slight shaft of light — from an unexpected source — cut through the willful darkness. TV talk show host and topical comedian Jon Stewart ran a humorous segment pointing out the media’s obvious denial of Paul’s presence and popularity. Stewart referred to Paul as “the 13th floor” of the presidential news coverage and took cable TV reporters to task for blatantly ignoring the congressman’s close second-place finish in Iowa.

The New York Times, the Associated Press and U.S. News (yes, it still exists as an online news site . . . but has dropped “and World Report” from its name) followed Stewart’s satire with semi-serious articles that discussed the media’s dismissal of Paul, in Iowa and in general.

The Associated Press piece acknowledged that Paul has raised enough money to stay in the presidential race for a long time. And that his supporters are more dedicated than most. But it concluded:

“Still, Paul finds himself outside the bounds of traditional Republicans. His opposition to the wars in Iraq and Afghanistan defines him as a dove. His skepticism toward the Federal Reserve has spooked Wall Street. And his libertarian views on gay rights draw the ire of social conservatives. He also tweaks Republicans on foreign policy, arguing it isn’t the United States' role to police Iran's nuclear program or to enforce an embargo with Cuba. ‘Iran is not Iceland, Ron,’ former Sen. Rick Santorum told Paul during Thursday's debate.”

An article on presidential politics that quotes Rick Santorum as an authority on anything is suspect, in my opinion.

The U.S. News piece concluded lazily by quoting an establishment media hack to characterize Paul’s candidacy:

“ 'He’s got a very dedicated cadre of people,’ says Larry Sabato, director of the University of Virginia’s Center for Politics. ‘And they're very intense, but they’re relatively few in number . . . It’s ridiculous talking about him getting the nomination.’ ”

Prof. Sabato’s record on political predictions is no more reliable than former Sen. Santorum’s.

I have no idea how well Ron Paul will do in the coming presidential primaries. But I know that he has the money and the organization in place to campaign until the GOP convention. And I know that he’s announced he won’t seek reelection to his seat in Congress, so that he can dedicate himself to this presidential run.

I hope he lasts long enough to force more of the establishment media and the GOP powers-that-be to acknowledge he exists. And that his arguments for limited government are as mainstream as anything the rent-seeking Ms. Wasserman Schultz has to say.

Now, if we can only get them to acknowledge Gary Johnson . . .






The Best of the Alien Films


Our summer of the aliens ends with the best alien encounter movie of the decade. Attack the Block has it all: mysterious creatures crashing out of the sky; kids on bicycles pedaling to save the planet; a mass of hairy apes climbing up buildings; and avowed enemies uniting against the invaders. Add to this a truly libertarian hero who learns that "actions have consequences," and enough blood to paint an elevator. What more could you want from a summer movie?

You might not have heard of Attack the Block, but you probably know its pedigree. It's a British film produced by Edgar Wright, who made Shaun of the Dead (2004) and last year's Scott Pilgrim vs. the World. It's directed by Joe Cornish, who was also involved in Hot Fuzz (2007) and the upcoming Tintin. I have to admit, these films are an acquired taste, but I think they are a taste worth acquiring.

The story takes place in a neighborhood of high-rise apartment buildings in the poor part of south London. As Sam (Jodie Whittaker) walks home from her job as a nurse, she is mugged by a gang of threatening young men in ski masks. Their crime is interrupted by an alien falling out of the sky and into a car right next to them, and Sam is able to run away. The rest of the film follows the young thugs as they first try to make money from the beast and then run for their lives as the creature's larger pals come looking for it.

One of the unexpected delights of this film is the way we get to know the boys themselves. These are not hardened criminals but novice thugs on bicycles who strut down the street to impress each other while surreptitiously calling home to reassure their parents that they will be back by ten. Interestingly, Joe Cornish says he was inspired to write this film by being mugged by a gang of boys who seemed as scared as he was. They are led by a young tough with the unlikely name of Moses (John Boyega), who turns out to be quite the leader — almost like the preacher in The Poseidon Adventure.

Moses recognizes that they can't rely on the police to help them, or even to believe them, so they must rely on themselves to escape the aliens and save the block. They don't seem to feel it is their responsibility to save the world, just their own little corner of it. As a libertarian, I like that. And then there are the unexpected side characters: the crazy drug dealers who get involved, the little wannabes who call themselves Probs (Sammy Williams) and Mayhem (Michael Ajao) . . . and the rich kid wannabe . . . and the crazy weapons . . . and clever lines . . . Just trust me. It's a great movie. And the less you know in advance, the better.

This is the best kind of sci-fi horror movie. Early encounters with the aliens take place off screen or behind walls, with sudden quick bursts of teeth or fur that don't let us focus enough to see what they look like. We just know they are terrifying. We see them creeping through the shadows, with occasional glimpses of their neon-bright teeth, but we don't have a full view of the creatures until at least halfway through the film. To be sure, there's enough blood and gore to warrant the R rating, but the violence is brief and somehow fun.

Give Attack the Block a try. You'll be laughing with horror and screaming with delight.


Editor's Note: Review of "Attack the Block," directed by Joe Cornish. Studio Canal, 2011, 87 minutes.





No, Really — Why Does He Do That?


Have you ever noticed that President Obama runs whenever he sees a set of stairs?

He always (check this out, and you’ll find I am right) runs up and down the steps to his plane. If there are steps to the platform where he’s going to speak, he runs up and down those steps. I mean, even when he’s speaking in the White House, where he lives, he thinks he needs to run the two steps to his podium.

Now he’s touring the Midwest on a gargantuan bus that is supposed to make him look like a normal Midwesterner. Good luck. But the vehicle has the usual three steps between the ground and the body of the thing. So after every speaking engagement, Obama hauls off and runs up the steps of the bus. He runs up three steps.

I don’t mean that he walks fast. I mean that he runs like a junior high school kid doing his first competitive sprints. And he doesn’t just run the steps, he takes off from ten feet away, as if he needed to win third place at the Kalamazoo County JV meet and knew that you’ve gotta show your hustle if you wanta win. His arms are pumping, his legs are striving to please the coach, and his demeanor is like, “I can do it! I can run up these steps.”

For God’s sake, what kind of person is this?






"The Help" Deserves the Buzz


The Help is the film everyone has been talking about this week. Based on the bestselling novel of the same name by Kathryn Stockett, it has been eagerly awaited by book club members and sensitive readers nationwide since it was published two years ago. The film provides an intimate look at the often-demeaning relationship between white women in Mississippi and the black maids who served them during the turbulent 1960s.

During this time, women up north were beginning to recognize the vast career options available to them. But in the Deep South, women were still staying at home with their children, joining the Junior League, hosting bridge clubs, and criticizing "the help" — and each other. In this story, Hilly Holbrook (Bryce Dallas Howard) is the "queen bee" whose opinion matters to everyone, black or white. She controls the social life of the town by voicing her opinions firmly and then leads the shunning of anyone who dares to disagree with her. Her kind of female has always existed, of course, and not just in the South. She has been immortalized in such films as The Women and Mean Girls, and can still be found controlling social groups, PTA meetings, cheerleading squads, and even board rooms, with a raised eyebrow and a withering look. No one likes her, but no one dares to cross her.

In the story, Hilly has been leading her group of friends since grade school. All of them are now married with children, except Skeeter (Emma Stone), who has chosen to finish college and wants to become a writer. She lands a job at the local newspaper as an advice columnist answering questions about house cleaning. Ironically, of course, Skeeter has never polished a spoon or scrubbed a bathtub ring in her life. So she turns to "the help" for help, in the person of Aibileen (Viola Davis), her friend Elizabeth's maid. Eventually she convinces Aibileen and a dozen other maids to share their stories, and a book is born.

As a nation we are proud of how far we have come in terms of civil rights. But we still notice racial differences and often act accordingly.

Aibileen is what Skeeter ought to be. Like many white college graduates, Skeeter simply "wants to be a writer." She doesn't have a burning topic just itching to come out. She wants the title of "writer" as much as she wants the occupation. When she applies for a job at Harper & Row, the editor (Mary Steenburgen) tells her, "Write about something that disturbs you, particularly if it bothers no one else." Skeeter looks for a topic that will allow her to become a writer, rather than using her writing to expose a problem she cares deeply about. Aibileen, by contrast, is simply a writer. She writes every night for an hour or two. She writes what is in her soul. She writes her prayers.

In many ways, Viola Davis as Aibileen carries the show and at the same time embodies the central conflict of the story. I say this because, although Davis is one of the finest actors in Hollywood, with an Oscar nomination to her credit, you will seldom see that accolade in print without the modifier "black actress." As a nation we are proud of how far we have come in terms of civil rights: our schools and neighborhoods are fully integrated. We have a black president in the White House. But we still notice racial differences and often act accordingly. I would love to ask Davis how she feels about the roles she has been offered.

Equally impressive is Octavia Spencer as Aibileen's best friend, Minny Jackson, an outspoken maid who has lost so many jobs because of her sassy back talk that she now works for the last woman in town who will hire her — Celia Foote (Jessica Chastain), who is shunned by the ladies because of her "white trash" background. Celia doesn't know the rules of maid-employer relationships. Ironically, Minny teaches Celia the boundaries she and the other maids are trying to expose with Skeeter’s book. Spencer's large liquid eyes alternately shine with sharp-witted laughter and melt into pain-filled tears. If Aibileen is the soul of this black community, Minny is its heart.

Having read the book, I wasn't pleased to learn that the beautiful Emma Stone had been cast as the tall, skinny, unattractive Skeeter, since her gangly appearance is such an important part of her character. But somehow Stone manages to look like a plain Jane in this film — her eyes are too big, her lips are too thin, her hair is too curly, and her face is too pale. In short, she is perfect.

Despite having grown up in Jackson, Skeeter really doesn't fit in with her snooty friends. She is disturbed by Hilly's insistence that Elizabeth install a separate bathroom for Aibileen. In fact, Hilly wants a law mandating separate facilities in private homes, "for the prevention of disease." This prompts Skeeter to examine the way maids are treated by the women who employ them. "Colored women raise white children, and twenty years later these white children become the boss," she muses. "When do we change from loving them to hating them?" Aibileen observes the same dilemma: "I want to stop that moment coming — and it come in ever white child's life — when they start to think that colored folks ain't as good as whites."

Toilets, and the material that goes into them, become the strongest recurrent image in this film. From diapers and potty training to vomiting and pranks, toilets are a symbol for what was wrong with the "separate but equal" policy in the South. The facilities were separate, but they most assuredly were not equal. Aibileen's bathroom is a plywood closet located in a corner of the garage with a bare bulb hanging from a wire, and toilet paper resting on a bare 2x4. The symbol, which emphasizes how badly blacks could be treated by whites in those days, provides moments of both shame and laughter.

However, the film misses the richer, darker, and more sinister tone that underlies the book. For black women to write about their employers was no joke, and the book makes it clear that its women are risking real dangers when they decide to tell the truth. Permanent job loss, physical violence, and even jail are real threats in a society where the mere accusation of a crime can lead to vigilante justice with lifetime consequences. By showing this clearly, the book gains a tension and suspense that is missing from the film.

The most important question asked by The Help is this: how did these southern women go from loving the black maids who reared them as children to degrading them in adulthood?

Strangely, I found it more difficult to enter the minds and lives of the maids while watching the film than I did while reading the book. The story is told through the three voices of Aibileen, Minny, and Skeeter, who narrate alternating sections of the book. These voices are strong and rich, and I could enter their worlds, empathizing with their experiences vicariously. In the film, however, I was merely an observer. I often felt defensive, rather than empathetic, about what I was seeing, as though I were somehow responsible for the actions of those women long ago, simply because I am white. If we learn anything from our battle for civil rights, however, it is that each person should be judged individually, and not collectively as part of a race.

The most important question asked by The Help is this: how did these southern women go from loving the black maids who reared them as children to degrading them in adulthood? Stockett, who was reared in Mississippi by a black maid whom she says she loved, suggests that they learned it from their mothers, by example as well as by instruction. To quote Oscar Hammerstein in South Pacific, racism "has to be carefully taught." But books like this also suggest that children can be carefully taught not to be judgmental. Every day Aibileen tells Elizabeth's little girl, "You is smart. You is kind. You is important." She says nothing about little Mae Mobley's appearance, good or bad. Knowing that she will likely be fired or retired before Mae Mobley reaches her teen years, Aibileen hopes desperately that these words will be enough.

As is often the case, the film is good, but the book is so much better. Don't take a short cut this time. Read The Help first, and then see the movie. You will enjoy both so much more if you do it that way.


Editor's Note: Review of "The Help," directed by Tate Taylor. Dreamworks, 2011, 137 minutes.





Global Warming Updates


Two recent stories concerning the theory of Anthropogenic Global Warming (AGW) caught my eye and are worth noting.

The first is the news from Forbes that a recent study of NASA satellite data from the last decade (2000–2010) shows that far more heat is escaping the earth’s atmosphere than has been predicted by AGW computer models. This in turn means that there will be far less global warming than predicted by those models — the same models the UN climate science panel relied on for its dire view of the planet’s “warming.”

As the study’s co-author, Dr. Roy Spencer — a climate scientist at the University of Alabama at Huntsville and Science Team Leader for the Advanced Microwave Scanning Radiometer on board NASA’s Aqua satellite — put it, there is “a huge discrepancy” between the empirical, observational data coming from NASA’s Terra satellite and what has been predicted by the climate warming crowd.

If you still have the quaint and antiquated notion that scientific theories ought to comport with observed data, this gap is, to say the least, disconcerting.

But the Terra satellite data are consistent with earlier data from another NASA satellite (the ERBS satellite) from an earlier period (1985 to 1999), which showed that vastly more long-wave radiation (therefore heat) escaped the atmosphere than was predicted by the global warming models. We now have a quarter of a century of data from two different satellites, pretty much saying the same thing.

The problem for the computer models seems to be that they predict that the increase in CO2 will cause an increase in atmospheric humidity and cirrus cloud cover, which in turn will trap heat, but the data seem at variance with the prediction. Curious, no?

The second story is about the scientist — one Charles Monnett, to be precise — who published an influential article in the journal Polar Biology in 2006 urging the claim that polar bears were drowning in the Arctic Ocean, presumably because the ice had melted from global warming. The article was based on Monnett’s observations, and this “peer-reviewed” article became an instant hit in the world of environmental activists. The article helped bring the polar bear to the forefront of the worldwide enviro movement. For example, the allegedly beleaguered animal figured into Al Gore’s movie, An Inconvenient Truth, which showed sad polar bears — oh, so cute and cuddly! — swimming desperately in search of ice.

That research was then cited in 2008 when the Department of the Interior decided to put the polar bear on the endangered species list. And it is frequently used as part of the evidence that global warming is an imminent threat to animal life, so we need massive policy changes, with potential costs in the trillions.

Now, this particular bit of “science” should have aroused some scrutiny before, because it reeks of tendentious incompetence at work. The observational base of the study allegedly consisted of four (count ’em, four) polar bear carcasses floating in the ocean, observed from a plane flying at an altitude of 1,500 feet, on a research expedition studying — whales! No autopsy was done on the bears to see if they had drowned; their drowning was just “inferred.”

Note: the internal “peer review” panel included Monnett’s wife! “Yeah, Honey, your paper looks super! Please pass the pasta . . .”

Monnett is now under investigation for scientific misconduct by the Department of the Interior’s Inspector General’s Office, and has been placed on administrative leave from “the federal agency where he works.” Researchers are talking to him and his research partner about their work. Monnett’s career may wind up looking like . . . well . . . a dead polar bear from 1,500 feet up.






The Passing Paradigm


The latest much-ado-about-nothing crisis passed, with a result that should seem familiar. In 2008, Americans were told that if the TARP bill (a $700 billion taxpayer-funded welfare handout to large banking institutions) wasn’t passed, the stock market would crash and massive unemployment would follow. After an unsuccessful first attempt to pass the bill amid angry opposition from constituents, the bill passed on a second vote. Subsequently, there was a stock market crash followed by massive unemployment.

This time, our political-media cabal told us that if Congress was unable to pass a bill to raise the debt ceiling, the government would not be able to meet its short-term obligations, including rolling over short-term bonds with new debt. US debt would be downgraded from its AAA status, and a default would be imminent. After the melodrama, Congress passed the bill raising the debt ceiling. Standard & Poor’s subsequently downgraded US Treasury debt anyway, and deep down everyone knows that a default is coming as well, one way or another.

We are seeing the end of a paradigm. Thomas Kuhn argued in The Structure of Scientific Revolutions (1962) that anomalies eventually lead to revolutions in scientific paradigms. His argument holds equally true for political paradigms.

A paradigm is a framework within which a society bases its beliefs. For example, people at one time believed that the forces of nature were the work of a pantheon of gods. Sunlight came from one god, rain from another. The earth was a god, as was the moon. With nothing to disprove the premises of the paradigm, it persisted. People went on believing that sunlight and rain were the work of sunlight and rain gods because there was no compelling reason for them to believe otherwise.

However, within any paradigm there are anomalies. Anomalies are contradictions — phenomena that cannot be explained within the framework of the paradigm. People have a startling capacity to ignore or rationalize away these anomalies. While it may defy logic to continue to believe that rain comes from a rain god even after evaporation and condensation have been discovered and proven, people would rather ignore the anomalies and cling to the paradigm than face the fact that the paradigm is false.

There is at least one thing that will be quite obvious: centralized government is insane.

But once there are too many anomalies, the paradigm fails, and a new one must take its place. This new paradigm renders the old one absurd, even crazy. At some point in the future, people will look back on the political paradigm of the 20th and early 21st centuries. There is at least one thing that will be quite obvious to them: centralized government is insane.

Consider the premises upon which this present paradigm relies: all facets of society must be planned and managed by experts. The judgment of the experts trumps the rights or choices of any individual. The choices made by the experts will result in a more orderly society and greater happiness for the individuals who compose it. There will be better results from one small group of experts controlling everyone than multiple groups of experts controlling smaller subgroups of society.

Of course, libertarians reject every one of these assumptions on its face. A free society does not tolerate “planning” or “management” by anyone. All choices are left to the individual, as any attempt to plan or manage his affairs amounts to either violation of his liberty, looting of his property, or both. However, let’s assume that the first three assumptions of the present paradigm are valid and merely examine the last. Even that does not hold up to scrutiny.

Suppose an entrepreneur starts a business. At first, his market is local. He opens retail outlets that are overseen by store managers. The entrepreneur is the CEO of the company and manages the store managers. Even at this point, the CEO must trust day-to-day decisions to his managers. He has no time to make everyday decisions as he tries to expand his business. The managers do this for him and he concentrates on strategic goals.

His business is successful and soon he begins opening outlets outside of the original market. He now has a need for regional managers to manage the store managers. He manages the regional managers and leaves the details of how they operate within their regions to them.

The business continues to expand. With retail outlets in every state, there are now too many regions for the CEO to manage directly. The CEO appoints executive directors to manage larger regions, each composed of several smaller ones. There is an executive director for the West Coast, another for the Midwest, and another for the East Coast. Of course, the CEO has the assistance of his corporate vice presidents who manage sales, operations, human resources, and other company-wide functions from the corporate office.

Now, suppose that one day the CEO decides to fire the executive directors, the regional managers, and the store managers. He will now have the salespeople, stock clerks, and cashiers for thousands of retail outlets report directly to him and his corporate vice presidents. Would anyone view this decision as anything but insane?

As silly as this proposition sounds, this is a perfect analogy for how we have chosen to organize society for the past century. The paradigm rests on the assumption that every social problem can be better solved if the CEO and his corporate staff manage the cashiers and the salespeople directly. As in all failed paradigms, anomalies are piling up that refute its basic assumptions.

This paradigm assumes that centralized government can provide a comfortable retirement with medical benefits for average Americans, yet Social Security and Medicare are bankrupt. It assumes that a central bank can ensure full employment and a stable currency, yet the value of the dollar is plummeting and unemployment approaches record highs (especially when the same measuring stick is used as when the old records were set). It assumes that the national government’s military establishment can police the world, yet the most powerful military in history cannot even defeat guerrilla fighters in third-world nations. It assumes that the central government can win a war on drugs, yet drug use is higher than at any time in history. It assumes that experts in Washington can regulate commerce, medicine, and industry, yet we get Bernie Madoff, drug recalls, and massive oil spills.

Hundreds of years ago, the prevailing medical science paradigm assumed that illnesses were caused by “bad humors” in the blood. Operating with that assumption, doctors practiced the now-discredited procedure known as “bleeding.” They would cut open a patient’s vein in an attempt to bleed out the bad humors. As we now know, this treatment often killed the patient. Most rational people today view the practice of bleeding as nothing short of lunacy.

Ironically, this is a perfect analogy for the paradigm of centralized government. The very act of a small group of experts attempting to manage all of society drains its lifeblood. It is the uncoerced decisions of millions of individuals that create all the blessings of civilized society. It is the attempt by a small group of people to override those decisions that is killing society before our very eyes. Someday, people will look back on our foolishness and laugh as we do now at the misguided physicians who bled their patients to death. The present paradigm is dying. The revolution has begun.






The Huddled Masses Leaving En Masse


As it happens, business brings me to my favorite American travel destination, New York City. As an L.A. dude, I like Los Angeles’ great weather and more laid-back attitude. But Manhattan is something L.A. can never be, namely, a walker’s paradise. Happily ensconced in a very modest hotel in Midtown, I can take off in any direction and just walk, seeing the sights and working up my appetite, which can be sated at any number of superb (if somewhat spendy) restaurants.

So I couldn’t help noticing a Wall Street Journal piece about the exodus of New Yorkers from the state in general and the Big Apple in particular.

The US Census data show that over the last decade, about 1.6 million New Yorkers moved out of the state. The biggest chunk of these émigrés was from the city itself: 70% of New Yorkers moving out of state were from NYC, and another 10% were from Westchester and Nassau Counties, which are essentially suburbs of NYC.

These losses were offset in part by an influx of 900,000 foreign immigrants. But there was still a net loss of nearly 700,000 residents, and the number of foreign immigrants was the lowest in about four decades.

The three most popular destinations for fleeing New Yorkers are Arizona, Florida, and Nevada. This suggests that the desire for warmer weather may be a factor in people’s decisions to move. But two of those states have no state income taxes, which suggests that New York’s notoriously high taxes may be a powerful reason as well.






The Missing Link


Alien creatures threaten civilization as we know it, and humans must band together to defend themselves. Is this another review of Cowboys & Aliens? No — it's a review of Rise of the Planet of the Apes, a prequel to the iconic 1968 film Planet of the Apes that is earning praise from critics, moviegoers, and even PETA, People for the Ethical Treatment of Animals, which sent picketers out to show support for the film when it opened. Now there's a switch!

The original Planet of the Apes was sort of a space age Gulliver's Travels: an American space crew, headed by Charlton Heston as the Gulliver character, discovered a planet populated by intelligent apes instead of Jonathan Swift's horsey Houyhnhnms. In both cases, humans in the strange new land have no language skills by which to prove their intelligence, and are used as breeders and beasts of burden. Interestingly, Jonathan Swift coined the word "yahoos" to describe the morally bestial humans in his fantasy world.

No one who has seen Planet of the Apes can forget the gasp of horrified realization that happens when Heston, trying to escape the topsy-turvy planet and return to Earth (he's riding a horse, in a deliberate nod to Swift's story), discovers the top of the Statue of Liberty submerged in sand.  This scene has been immortalized through allusion and satire for nearly half a century. The message is clear: we cannot escape the future we create for ourselves on this earth.

The new film has its own gaspworthy instant, although it occurs midway through, not at the end. I won't tell you what causes the audible gasp in the audience, but I will tell you that I've asked everyone I know who has seen the movie if that gasp happened during their screening too, and all have said yes. It is a powerful moment, made more powerful by the astounding acting of Andy Serkis, an unsung hero of CGI technology. Serkis is the body behind Gollum in The Lord of the Rings (2001, 2002); the ape in King Kong (2005); and now the chimp, Caesar, in Rise. His movements, especially the expression in his face and his eyes, bring sensitivity, pathos, and life to what could have been flat computer generated characters.

Don't you just get so tired of the predictability of Hollywood movies blaming greedy pharmaceutical manufacturers for all our problems?

Rise of the Planet of the Apes creates a possible backstory for how the apes became the cultured, speaking, master race, while humans devolved into brutish creatures. I say "possible," because I'm not convinced that the film's premise works. The idea is that scientists, experimenting with chimps to discover a cure for Alzheimer's disease, inadvertently create the master race of apes and destroy the humans at the same time. The story is smart and engaging and ties up all the loose ends satisfactorily. But it blames the mutation on a single manmade event, completely changing the premise of the first film, which suggested that evolution and devolution will lead to the rise of apes and the fall of humankind.  The sand-covered Statue of Liberty at the end of the 1968 film suggests that the transformation happened over the course of many centuries, not in one generation.

Not surprisingly, capitalism (rather than science itself) is portrayed as the ultimate enemy to mankind. While research scientist Will Rodman (James Franco) is motivated by a desire to cure Alzheimer's, the company he works for is owned and directed by the obligatory greedy capitalist who uses and abuses the chimps in his quest for profits. (Don't you just get so tired of the predictability of Hollywood movies blaming greedy pharmaceutical manufacturers for all our problems?) This film goes a step further, however. For some reason I shudder to contemplate, the casting agent chose the British-Nigerian actor David Oyelowo to play the brutish bad guy with a British accent. Not sure what the message of this decision might be, but it's hard to believe that the casting was accidental. Enough said about that.

Ironically, despite the filmmakers' obvious distaste for profits, they inadvertently acknowledge the power of money as a motivator when Caesar, the chimp who has been transformed by the chemical trials, wants the other primates to follow him: he buys their loyalty with Chips Ahoy cookies instead of fighting each one of them into submission. And it works! Now there's a message worth sharing.

A message that does not work, however, is the one that PETA especially liked — the portrayal of chimps as misunderstood neighbors who should not be feared. When Caesar makes his way outside to play with a neighbor child, the little girl's father picks up a baseball bat to protect her. He is portrayed throughout the film as a man with a bad temper (although he's an airline pilot; have you ever known an airplane pilot to be anything but calm and comforting?), and we are supposed to take the side of the chimp. However, the memory of the Connecticut woman whose face and hands were torn off by one of these animals two years ago makes it hard to sympathize with the man-sized creature and its lion-sized canines. Even if he does wear pants and a sweatshirt.

Several subtle moments add to the classy styling of this film. At one point, for example, Caesar sadly observes Will kissing his girlfriend (Freida Pinto), creating a poignant allusion to Mary Shelley's Frankenstein and the creature's longing for a woman like himself. Caesar is like Frankenstein's "monster" — too smart to be an ape, but too much an animal to be a human. Where does he belong? Another example: the primate house where Caesar and dozens of other apes are caged overlooks San Francisco Bay and Alcatraz Island, where the notorious prison was located. And a third: a brilliant moment of self-parody occurs with the musical motif that begins when the apes start escaping from the primate house. We hear an undercurrent of the "Dr. Zaius, Dr. Zaius" melody from The Simpsons' musical parody of the original Planet of the Apes. How's that for aping one's apers?

All the Planet of the Apes films can be seen as cautionary tales, warning viewers that power and authority are ephemeral. Although the specific catalysts and destructive philosophies are subject to change, the impending doom — transference of power — does not. On a weekend when the credit rating of the United States was downgraded for the first time in its history, this film is a timely reminder that there may, indeed, be real threats to our comfortable styles of living.



Editor's Note: Review of "Rise of the Planet of the Apes," directed by Rupert Wyatt. Twentieth Century Fox Entertainment, 2011, 105 minutes.





A Call to Repentance


Are there libertarians who still regard President Obama with affection?

I understand that some people voted for him because they wanted to punish Bush and his fellow Republicans. The Republicans were warlike, and they were spendthrifts.

Well, if punishment is on the agenda, I want to be first in line to give some. Plenty, in fact. I’ll never get over George Bush’s ability to lie, lie, and keep on lying. But did you expect something better from Obama, you who supported him?

You did. I know you did. I heard you — at length.

As you said, Bush went to war, twice. But Obama continues running both wars, and he started a third one, the marvelously useless war in Libya. If he doesn’t get us involved in Somalia or Haiti, it will be a wonder.

As you said, Bush spent too much money. But Obama started off by spending a trillion dollars on a feckless economic program. He instituted a healthcare scheme that, basically, nobody wanted, which will cost at least half a trillion more and will give us notably less effective healthcare.

On August 8, Obama addressed the nation’s economic problems by demanding higher taxes and accusing those who don’t want them (such as you) of having caused the present economic distress. While he was talking, the stock market dropped like a rock. It lost 634 points that day.

But perhaps those who expected something libertarian out of Obama were right in one respect. His presidency has been wonderful for the gold market.






The Perfect Ending


After an agonizingly protracted battle, congressional leaders and the president reached an agreement to raise the debt limit, with some minor cuts in spending now and supposedly more cuts in the future, cuts that will be determined by a bipartisan panel.

There has been considerable rending of clothes and gnashing of teeth on both the left and the right sides of the political spectrum. But really, the agreement probably captures the mood of the majority of Americans.

As I have noted before, people are only just beginning to see the entitlement spending iceberg towards which the nation’s economy has been sailing for decades. But polls show that the public — including self-described Tea Party members — still strongly supports the major culprits in the fiscal follies with which the country is beset: the entitlement programs, especially Social Security and Medicare.

In sum, the public is beginning to see the problem, but remains clueless — or, to wax Nietzschean for a moment, deliberately blind — to the real cause of the problem.

The agreement had immediate effects; though not ones, I daresay, that were comprehended by the supercilious solons who spawned it. And I’m not talking about the Standard & Poor’s downgrade.

First, as the US Treasury reported, the national debt immediately shot up $238 billion to a grand total of $14.58 trillion, officially hitting the mark of 100% of GDP. We as a nation have hit that mark only once before, right after World War II, the biggest foreign war we ever fought. We are now there again, in a time of comparative peace. As the report drily notes, this debt level puts us in the league of countries such as Italy and Belgium.

The second effect was not a stock market rally created by the exuberant joy of investors, relieved that disaster had been averted, but instead a massive sell-off, caused at least in part by the recognition that disaster looms.

All this brings to mind the old adage: a country gets the government it deserves.






Investment Opportunities


Walking in the downtown area of the small city where I live, I came upon a raggedy man engaged in a heated conversation with a man and woman sitting in a slightly less raggedy sedan. The man alternated between leaning into the car to hear what the man or woman was saying and then standing up to yell what he had to say — so that everyone in the vicinity (which meant just me, at that moment) could hear his end of it.

“This is fuckin’ ridiculous. It’s such an easy thing. Such a fuckin’ little thing.”

He was thin, in a sickly way. His face was flushed and deeply lined; his teeth, few and gray. He moved and sounded like a junkie. The downtown area is full of them: men, mostly, who in previous generations would have worked in the timber business, displaced by the Endangered Species Act and warped by years of unemployment and welfare into Gollums of entitlement. Crystal meth is usually their drug of choice . . . but marijuana or cheap booze will do.

“I can’t believe you’re doin’ this to me. Settin’ me up like this. Settin’ me up to fail. Fuck.”

The sedan and the junkie were idling in front of a bank. It was pretty clear that the “this” the people in the car were doing — or not doing — involved money.

I tried to get a clear look at the people in the car. They were older than the junkie but it was hard to tell how much. Junkies age badly; and, even when they aren’t junkies, working-class people in the Pacific Northwest don’t age well. The people in the car might have been his parents. Or a sibling and spouse. Something about the junkie’s sense of indignation suggested a family connection.

“I mean, look. It’s a fuckin’ investment. Investment. That’s what it is.”

And so language is ground into oblivion.

In our state, people on the dole usually have to sit through various educational meetings or sessions as a part of getting benefits. My guess: on his stumble down the socioeconomic ladder, the junkie had waited impatiently while many, many government employees repeated threadbare lines about their agencies’ “investment” in job training or cheap housing or troubled people. He’d retained it as a powerful word, a money word.

But he had no sense of what “investment” actually means. No sense of the return that investors expect on their money. No sense of the responsibility that comes with accepting investment. To him, “investment” was just a fancy word for handout — and he used it in the same way that a deadbeat asks for “loans” that he doesn’t intend to repay.

Many observers, from George Orwell to Liberty’s own Stephen Cox, have noted that collectivists use euphemisms in an effort to strip actions of meaning. And, particularly, to strip bad actions of their badness. It’s a pernicious process that robs people of moral agency.

Many of the same goodie-giving government agencies that talk about welfare as “investment” describe welfare recipients as “clients.” The misuse of each word has a similar effect. The word “client” usually refers to the paying customer of some kind of professional service. Someone receiving a good or service for free is not a client; but, if he hears himself called a “client” often enough, he may lose the ability to make that distinction. And expect to be treated like a client wherever he goes.

I went into the bank to do some business and, when I came out a few minutes later, the junkie was leaning near the window of the raggedy sedan. He wasn’t saying anything. Neither of the people in the car was saying anything, either. They were all just staring at each other. Unmoored from meaning, frozen in their indignation.






A Big Fish May Slip the Net


Last year, Illinois bucked the national trend and voted for a very leftist governor — Pat Quinn, who had assumed the governorship in 2009 after Rod Blagojevich was impeached and removed from office.

Despite winning by only a narrow margin, Quinn has governed as any devout leftist would. He has pushed wind and solar initiatives, signed a law eliminating the death penalty, and increased taxes like crazy. In the face of a $15 billion budget deficit, he raised the personal income tax from 3% to 5%, and increased corporate taxes from 4.8% to 7%. He also instituted a sales tax on internet sales (the “Amazon tax”). Businesses let it be known that they would consider leaving the state if his tax increases passed, but he went ahead anyway, laughing in their faces.

Now businesses may get the last laugh. They are indeed beginning to leave. Especially illustrative is the recent announcement by the CME Group that it is “evaluating” whether to move some operations to other states. It is currently in talks with Florida, Tennessee, and Texas. What do these states have in common? Hmm . . . let me think. Wait . . . Oh, I know. They don’t have state income taxes, and they are notoriously pro-business.

CME is a pretty big fish. It is the parent company of the Chicago Mercantile Exchange, the Chicago Board of Trade, and the New York Mercantile Exchange (which includes COMEX, the New York Commodity Exchange). With 2,300 employees and revenues of between $2 and $3 billion, CME would be a real loss to Illinois’ economy if it departed. But Illinois would, quite frankly, deserve it.

As for Quinn, he probably doesn't care. He probably expects that if Obama passes another “stimulus” bill, money will be shoveled Chicago’s way — in the Chicago way.






End of the Beginning or Beginning of the End?


After much huffing and puffing, the barons on Capitol Hill have reached a deal that prevents the US government from defaulting on its debt. At the same time they staved off, for the time being at least, a downgrade in America’s credit rating. The president, our Othello, was offstage as the deal was struck, and now finds himself a diminished actor, even as he prepares for his most challenging role as a candidate for reelection.

What in fact has occurred? The United States government has been pulled, kicking and screaming, into taking its first baby step toward fiscal responsibility. Elections, we find, do matter. For, love them or hate them, it is the Tea Party Republicans elected to Congress in 2010 who compelled Uncle Sam to stand up and walk. They and they alone managed to force the issue over the debt ceiling. Of course, they got nothing like the deficit reduction they were looking for. But they have both changed the debate in Washington and achieved a modest first step toward fiscal sanity.

The squeals of distress emitted by Democrats and their supporters in the media (most notably the New York Times) make plain just how much the tide has turned in Washington and, perhaps, the country at large. The consequences of spending beyond one’s means were brought home for the average American in the Panic of 2008. As a result of that financial meltdown, it became common wisdom that out-of-control government spending must, at some point, lead to disaster. It was this realization, as much as opposition to “Obamacare,” that led to the Tea Party sweep in 2010.

Nevertheless, great dangers remain for the Republicans. The second round of spending cuts mandated under the just-passed legislation amounts to a drop in the ocean of American debt. We are still looking at trillions of new debt being added over the coming years — an unsustainable level of deficit spending and borrowing. To solve this problem, revenues must indeed be on the table. The Republicans should come out strongly for real tax reform, with a lowering of both personal and corporate rates tied to the elimination of loopholes and other steps to broaden the tax base. If the Republican Party’s plan is to allow General Electric, for example, to continue to reap billions of dollars in profits without paying any federal tax, they will be signing their own political death warrant.

Additionally, some Republicans are clearly opposed to major cuts in the defense budget. That the people will accept austerity except in defense is an illusion. The United States is not seriously threatened militarily by any power on earth. We currently spend about the same amount of money on defense as the rest of the world combined. The global commitments of the United States must shrink. When Republican Senator Lindsey Graham said, as he emerged from the Senate vote on the debt ceiling, that America must finance Egypt’s transition to democracy, he revealed himself as doubly out of touch, for it is equally absurd to believe either that American dollars can create democracy in the Arab world, or that the average citizen is willing to throw away his hard-earned money on such a will-o’-the-wisp.

Congressional Democrats, on the other hand, are acting as if they have solved the deficit-debt problem, and are talking about moving on “to what Washington does best — creating jobs and opportunity for Americans.” If this is what they truly believe (and it certainly appears that many of them do think this way), then they too are barreling down the road to self-destruction.

The president is talking about a balanced approach, but does he mean it? He ignored the opportunity to move toward a balanced budget in the wake of the Bowles-Simpson commission’s report and the mandate for fiscal responsibility given to Congress by the voters in 2010. And even if he is serious, does it matter? He overreached himself in pushing for revenues in his one-on-one negotiations with Speaker John Boehner, and was then reduced almost to a cypher as Congressional leaders forged a deal largely on their own. His poll numbers are down and his political relevance is in question. The mediocrity of the Republican presidential field is his one comfort.

There is, of course, a dirty secret out there, unspoken but quietly acknowledged by many thoughtful people. It is that nothing the politicians can do will prevent another serious economic crisis, one perhaps much worse than 2008. The debt and deficit issues are not resolvable without draconian cuts and revenue increases, which taken together must derail any prospects for sustainable economic growth and job creation. Resorting to the printing press, as in 2008, is impossible given the level of government indebtedness. In any case it would only postpone the day of reckoning. A Keynesian jobs program was tried in 2009 and largely failed, at a cost of nearly a trillion dollars. Vast numbers of people lacking the education or skills needed in an economy that has been transformed by globalization will be left with nowhere to turn. The private sector cannot use them; the public sector will no longer be able to support them. Therefore we face . . . what? We certainly seem fated to live in “interesting times.”






The Cliché Crisis


I don’t know about you, but for me the worst thing about this year’s budget “crisis” was the gross overspending of clichés.

No, I’m not crying wolf. I am not holding America hostage. Neither am I rearranging deck chairs on the Titanic. Nor am I gleefully informing my close friends and colleagues that their favorite proposals will be dead on arrival when they hit my desk. Hopefully, I am acting more responsibly than anyone in the nation’s capital. I hold no brief for revenue enhancements (i.e., taxes), or for throwing grandma under the bus. I consider myself the adult in the room.

Nevertheless, I can’t claim that I cared very much about the budget emergency. I knew that I wouldn’t get what I wanted — even a small attempt to reform the federal government’s fiscal racket — so I couldn’t be disappointed by the spectacle that took place during the last week of July.

You can’t feel very bad because some Nigerian spam artist didn’t send you the $15 million he promised “in the name of God.” In the same way, you can’t feel very bad about the two political parties for failing to fulfill their promise and impart economic health to the country. But you can feel bad about how everyone with a microphone kept insisting, night and day, that we cannot keep kicking the can down the road.

An older cliché informs us that actions speak louder than words. I deny it. Often words speak much louder than actions. We all do a lot of impulsive things that don’t say much about who we usually are. But the words we carefully marshal to impress people in argument: those words are us. If not, what are they?

Here’s a way to measure a mind. Does it invent interesting means of saying things, or does it just repeat what others have been saying, thousands of times over? Does it use words, or do words use it? Is it working with words, or is it just . . . kicking the can down the road?

By this standard, nobody in Washington turned out to be very smart during the great budget embarrassment. Nobody said anything original or interesting. It was too much trouble. Take the cliché I just mentioned. The political geniuses thought about it for a while, then decided to picture themselves standing like idle boys on a country road, gazing balefully at a can that was begging to be kicked — and refusing, in an access of self-righteousness, to kick it. Dennis the Democrat was itching to give it a boot. So was Randy the Republican. But they controlled themselves. They did nothing — a very complicated nothing, but nothing nonetheless. Unfortunately, the can had a life of its own. It vaulted down the road and lodged in weeds from which it will be very hard to extract it.

Well, so much for that cliché. It didn’t work. But the horrible thing was that all these people thought they were being extraordinarily clever when they talked about the can.

This shows you what is so awful about clichés. They stay with us because people keep thinking that these are the words that make them clever. President Obama smiled at his cleverness when he urged Americans to sacrifice some never-specified largesse of the federal government. “Eat your peas,” he said, and smiled. He was being clever, he thought.

An older cliché informs us that actions speak louder than words. I deny it. Often words speak much louder than actions.

Today, it is considered very clever, when responding to some request for a serious opinion, to say, “It is what it is.” That’s what one of the Casey Anthony jurors said, when asked about the possibility that, although he voted “not guilty,” in the legal sense, Anthony might not have been “innocent,” in the moral sense. He wasn’t interested in reflecting on the question. “It is what it is,” he replied. Is that what John Galt meant to say when he suggested that A equals A? Or was the juror paraphrasing some dictum of Jean-Paul Sartre? In any case, I’ve heard that expression four times today, and it’s barely past noon. Last night, Sen. Mary Landrieu (D-LA) said in support of the budget compromise, the provisions of which she had not yet read, “it is what it is.”

Daring souls who venture beyond it is what it is have many other choices of clichés. One of them is to emphasize the idea that, no matter what idiotic decisions they make, they have done their due diligence, just by showing up. A whiff of legalese makes any choice legitimate. Sheer laziness, as we know, can always be justified as an abundance of caution, or a pious respect for what will emerge at the end of the day. At the end of the day the jury may reject the obvious and irrefutable evidence. At the end of the day the Republicans may (and probably will) sell out their voters. At the end of the day we’re all dead. Such things are, apparently, good, because they happen at the end of the day.

That’s a lax, supine, virtually inert locution. Somewhere toward the opposite end of the spectrum is a cliché as old, and as batty, as the House of Ussher. The expression is raves, as in “The New York Times raves.” Have you ever seen a movie trailer that didn’t use that cliché? It’s possible that the thing has become a self-reflexive joke among the producers of these silly ads — a reflection of their knowing superiority over the audience they are hired to manipulate. That’s us, the boobs in the theater — the mindless herd that is supposed to be taken in by the image of the newspaper of record screaming and frothing at the mouth. Of course, that’s what the Times actually does, every day, on almost every page; but why imply that there’s something special about its movie notices?

Speaking of clichéd images, how about the face of? This is another advertising cliché, closely related to poster boy for. Every time I turn to a cable news channel, I see the same old codger in the same ad for the same ambulance-chasing law firm, proclaiming, as if in answer to outraged objections, “I am not an actor. I am the face of mesothelioma.” Who could doubt it? And who could doubt that Casey Anthony is the face of jury imbecility? So what? I am the face of Word Watch. So, again, what? Advertising is intended to convince you to feel something extra about the obvious (or the nonexistent). That doesn’t mean that it’s clever.

Well, let’s escape. Let’s refuse to cast ourselves as the faces of anything other than ourselves. Let’s be individuals. But even then, clichés will pursue us. If we’re successful, we will probably be regarded as a breath of fresh air. And that’s not a good thing. The prevalence of this expression shows how easy it is to turn individualism into something quite the opposite.

Let me put it this way: have you ever met a breath of fresh air who wasn’t either a lunatic or a bore of Jurassic proportions? Or both? And no wonder, because the people who look for breaths of air are usually the stuffiest people around — the biggest conformists, dominators, and fools, in whatever group or institution you encounter them. In my experience, they tend to be people who think that Marxism is the newest idea in town. They are always people who welcome change because they’ve got theirs and know they will keep it, whatever damage their radical protégés may inflict on others. (Recall the late Senator Edward Moore Kennedy.) With the aid of progressive clichés, establishments maintain their existence.

Let’s escape. Let’s refuse to cast ourselves as the faces of anything other than ourselves. Let’s be individuals.

Here’s another one: “she [or he] is a very private person.” We hear that constantly. I heard it the other day on CNN, in reference to Huma Abedin, Hillary Clinton’s chief aide, and wife of the disgraced Congressman Weiner. The expression evokes the whole range of faux-individualist dogmas about privacy and the right to privacy (a cliché invented by a Supreme Court impatient with the stately and accurate language of individual rights provided by the Constitution). The implication is that there’s something good about being “private,” meaning “sheltered,” as opposed to being a real person and not giving a damn what the rest of us think of you, or whether we think enough about you to want to take your picture. A sheltered person is someone who cares very much what you think about him, and what the picture looks like; therefore he becomes very private, until he thinks the camera may offer a flattering angle. The people acclaimed as private are almost always celebrities and politicians — creatures of the media, who then resent (or pretend to resent) the media’s incursions into their affairs. Private person is a particularly dangerous cliché, a cliché that distorts reality, a cliché that turns American values upside down.

Someone out there is counseling straightforward thieves and murderers to portray themselves as the compassionate Buddha. But why would you want to be the Buddha’s penpal?

The same kind of expression, though one that generally appeals to a different social group, is compassionate. A couple of years ago, when I was writing The Big House, my book about prisons, I looked at a lot of convict penpal sites. Almost without exception, the prisoners seeking correspondents described themselves as compassionate. Now, I’m not one to shy away from convicts. All the convicts and ex-convicts I interviewed treated me very well. I’m grateful to them. And I don’t think it’s the worst thing in the world to have a prison record. But compassionate shows all too clearly that the televised clichés of the middle class are seeping even into the prisons, polluting and corrupting everything they touch. Someone out there is counseling straightforward thieves and murderers to portray themselves as the compassionate Buddha. But why would you want to be the Buddha’s penpal?

One of the worst things about clichés is that they establish themselves as immortal statements of values. No matter how skewed the values are, the antiquity of the clichés attached to them implies that they are worthy of grave respect. This is a major problem with the insufferable clichés of the 1960s, which now, half a century later, are reverently prescribed to hapless youth, as if they were the cadences of the Latin Vulgate. Hence the young denounce apathy, long to speak truth to power, idolize movements, embrace social justice, declare themselves for peace and global cooperation, commit themselves to the environment, the balance of nature, and (something quite different) change, and haven’t a clue that they are using the cunning vocabulary of the Old Left, c. 1935, and the birdbrain lingo of spirituality, c. 1900. Like a breath of fresh air, long-discredited phrases were transmitted by the Old Left to the New Left of the 1960s, to people (of the whom I am one of which) who had no idea that the words in their mouths had been put there by generations of silly old fuds. They (we) had no idea that even empty clichés can be repulsive and dangerous.

The other night I finally got a chance to see The Spanish Earth (1937), a famous movie that almost nobody who talks about it has ever seen. It was cowritten by the crypto-communist Lillian Hellman; Ernest Hemingway and John Dos Passos also participated (before they learned better). The film is a “documentary” about the Spanish Civil War, presented from the communist point of view, and it has about as much to do with the truth about that war as Triumph of the Will, from which it freely borrows, has to do with the truth about Hitler. I got a special kick out of the movie and its communist heroes constantly denouncing their enemies as rebels. Take that, you ring-in-the-nose college Marxist! You never realized it before, but the mission of the working class is to quell the rebels.

The most wonderful thing was the survival of so many wretchedly misleading political clichés, the kind of phrases that have soldiered on from Marx to Hellman to Rigoberta Menchú to the presidential aspirations of John Edwards and Barack Obama.

“Why do they fight?” the narrator asks about the Spanish people. Most of them didn’t fight, of course; and those who did took many sides, from Stalinist to anarcho-syndicalist. But never mind; a clichéd question deserves a clichéd answer: “They fight to be allowed to live as human beings.”

Human beings. How many times have we heard that, since? It’s an “argument” for every political program you can imagine.

“How ya doin’ today, Mr. Voter?”

“Uh, I dunno. Not so good, I guess. I think I’d feel more, like, more human if I owned a house. I’d feel more like I was living the American dream. Too bad I come from a working family.”

“But that’s good for you — very good indeed! Working families are the meaning of America. So how much do you make?”

“Well . . . nothin’, right now. I been on disability these past few years. Ya know, this acne’s really actin’ up . . .”

“No problem! That’s why there’s a government! No reason why you can’t get a loan. As a working man, it’s your right.”

“Damn! Really? Thanks, Congressman!”

“So, anything else I can do for you?”

“Well, uh, I guess I’d feel more human if I could retire at 60 . . .”

Most clichés aren’t deployed to answer questions; they’re meant to anesthetize them. So, if you say, in regard to The Spanish Earth, “Wait — I’m confused. Exactly who are these people who fight to be allowed to live as human beings?”, the film will tell you that they are “the men who were not trained in arms, who only wanted work and food.” These are the people who, we are told, “fight on.”

So at the end of the day, it’s the pacifists who inherit the earth — the pacifists who take up arms. Are you confused? I am. I’d like to know more about these people who are fighters because they don’t want to fight. But what I’m given is another cliché. I’m told that they are people who only want work and food.

It sounds good. Modesty is becoming. But one thinks of succeeding clichés, logically deduced from wanting only work and food: “It’s the economy, stupid.” “This election is about one thing: putting America back to work.” “They work all day long, many of them scraping by, just to put food on the table. And . . . they see leaders who can't seem to come together and do what it takes to make life just a little bit better for ordinary Americans. They are offended by that. And they should be.”

That last remark is President Obama’s. The first remark is James Carville’s, back in the election of 1992. The one in the middle is sadly common at all times and everywhere, left, right, and center. Each remark suggests that ordinary Americans want only one thing — work and food. And that is why they vote the way they do.

Consider this the received wisdom, the grand cliché.

I’m offended by that. And ordinary Americans should be too.






George Soros: Transparent Hypocrite


George Soros is the notorious leftist billionaire who has spent lavishly to push this country (and others) in a statist direction. His preferred mechanism is quiet subversion: he funds front groups such as Media Matters (which aims at squelching speech by conservatives and libertarians on radio and television) and MoveOn.org (which aims at electing leftists to office). Ironically, he made a big chunk of his money collapsing the currency of a statist government (the UK) in 1992.

Well, he’s back in the news. He has now closed his hedge fund to outside investors. Why? Because of the new “transparency” financial regulations laid down by the SEC under the monstrosity that is Dodd-Frank. The new rules require that, by early next year, large hedge funds (i.e., those with asset bases over $150 million), such as his own, register with the SEC. Any large hedge fund will now have to disclose who invests in it, who works for it, whether it faces any conflicts of interest, and what it owns or invests in.

As the deputy chairmen of his fund (and, coincidentally, his sons), Jonathan and Robert Soros, so sadly put it, “An unfortunate consequence of these new circumstances is that we will no longer be able to manage assets for anyone other than a family client as defined under the regulations.” Accordingly, the fund will return all outside (non-family) capital to the investors — to the tune of $750 million.

Other hedge fund managers, such as Carl Icahn and Stanley Druckenmiller, have done the same with their funds. So why single Soros out for attention?

He deserves all the attention we can give him, because his megamillions helped elect the Red Congress that enacted Dodd-Frank, as well as the statist American president who pushed the bill and signed it into law. Soros wanted this country run by neo-socialists. He spent lavishly to ensure that it would be. But now he doesn’t want to have to live up to the spirit of the regulations this regime is inflicting upon the nation.

In short, Soros is a hypocrite. Not to mention a schmuck.





© Copyright 2013 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.