Paraders Step in the Right Direction


Every year the Yonkers African American Heritage Community hosts a two-day festival and parade in downtown Yonkers, 15 miles up the river from Manhattan. Every year the Yonkers City Council agrees to provide police, parks, and emergency personnel to serve the event, paying exorbitant overtime fees to do so.

But this year the city told festival organizers that they would have to pay the city's costs to secure the event. The result? The committee opted to host a one-day festival at the community center, instead of the parade. They simply could not afford the tens of thousands of dollars they would have had to pay city workers in order to host the two-day, citywide festival.

This is exactly as it should be. If an event isn't worth tens of thousands of dollars to the people participating in it, why should it be considered worth tens of thousands of dollars to the taxpayers who may not even be attending the event? Or worse, who may be inconvenienced by the parade and the noise?

Earlier this summer the Yonkers Puerto Rican/Hispanic Parade & Festival was canceled for the same reason. When nearby White Plains began billing parade organizers for police and cleanup last year, many of their community organizations also turned to hosting single-location festivals instead of the rowdier and messier parades.

Municipalities across the country should follow this example. Traditions are important. They bring communities together and create bonds across generations. But the details of a tradition can be changed to fit the times. No longer should taxpayers be expected to foot the bill for parties and festivals enjoyed by small groups within the larger community. Festival organizers should raise money the private way: sell advertising, seek private sponsorships, offer vendor booths, and charge fees. The lessons our mothers taught us apply to municipalities and community organizations: if you can't afford it, don't do it.






Marshall v. Jefferson


In the September 2009 issue of Liberty (in a book review entitled "Liberty and Literacy"), Stephen Cox — ever the analytical wordsmith — extols the content and form of the brilliant sentence that opens the second paragraph of Thomas Jefferson’s Declaration of Independence:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty, and the pursuit of happiness.”

Focusing on the form, he paraphrases the passage into run-of-the-mill prose and berates the reader who can’t tell the difference. He says that “language is not just a method of communicating… (It) is a way of creating pleasure,” and that if one doesn’t see that, then one is illiterate, knows nothing about writing, and should — go away.

Braving Cox’s acid pen and his usually faultless reasoning, I took issue with his (and nearly everyone else’s) assessment of Jefferson’s passage. I responded in Liberty in November 2009:

“Lofty words. Pure poetry, perhaps — but devoid of any connection to reality. It is not self-evident that 'all men are created equal,' or 'that they are endowed by their Creator with certain unalienable rights,' or 'that among those are Life, Liberty, and the pursuit of Happiness.' One need only look at the history of the Bill of Rights and its ignored 9th Amendment to realize that the only rights citizens retain — much less 'are endowed with' — are those that they explicitly claw from their government; Life, Liberty and the pursuit of Happiness not included. Perhaps they were too 'self-evident'?

“It took nearly 100 years for those three self-evident rights to be included in the Constitution under the 14th Amendment as 'life, liberty, or property.' And even now they’re not secure. 'Pursuit of Happiness' was an elegant albeit vague and meaningless euphemism for property, which Jefferson was loath to include, fearing it might justify slavery. Unfortunately, the omission later caused such an erosion of property rights that there is now popular clamor for a property rights amendment to the Constitution (in spite of the 14th Amendment).


“The slippery nature of even enumerated rights — much less 'self-evidently endowed' rights — comes to mind in Justice Oliver Wendell Holmes’ dissent in the Lochner v. New York case. His particularly perverse interpretation of the 14th Amendment, using original intent, mind you, found that since the amendment was originally written to protect the rights of freed slaves, it could not apply to workers and management deciding the length of their workday. But then, he was famous for declaring that he could decide any case, any way, using any principle. (He’d later go on to find that eugenics, as government policy, was justified under the Constitution.)

“As populist rabble-rousing, Jefferson’s clause is second to none, and in that sense, it is great writing. However, as a description of reality or a recipe for government, it is a complete failure. Therefore I must counterintuitively conclude, being a firm believer in the dictum that form follows function, that the clause in question is neither effective nor elegant writing.”

Only later, after reading R. Kent Newmyer’s legal biography of John Marshall, the fourth Chief Justice of the United States, John Marshall and the Heroic Age of the Supreme Court, did I realize that I was not alone in being skeptical of "natural rights" and advocating enumerated rights instead.

The controversy over enumerated versus self-evident rights began immediately after the Constitutional Convention disbanded and each state was asked to ratify the new document. Contrary to what today’s tea partiers, radical states’ righters, and some libertarians and conservatives believe, the Constitution was created to increase the power of the federal government, both absolutely and over the states. Under the previous arrangement — the Articles of Confederation — the federal government was dependent on the whims of the states — individually, mind you — for voluntary revenues and a host of other items. These constraints made defense of the new country a much tougher proposition — in raising an army and funding a war. In today’s terms, the government established under the Articles of Confederation was about as powerful and effective as today’s United Nations.

To John Marshall, Alexander Hamilton, and a coterie of other patriots who would later coalesce into the Federalist Party, this arrangement spelled ruin for the new country — not only because the United States might not be able to defend itself adequately, but also because it wouldn’t be able to pay its bills dependably, obtain credit, or participate in the foreign exchange mechanisms necessary for international commerce.

Under the old dispensation, individual states were responsible for debts incurred during the Revolutionary War, and some were thinking of defaulting, either from irresponsibility or from spite toward some of the Brits who’d bankrolled them. As Hamilton observed about the debt and its consolidation under federal responsibility, it was "the price of liberty." Additionally, each state (as well as private entities) could issue its own currency. Without the full faith and credit of a central government, the new country would be unable to participate effectively in international trade — a serious impediment under the new, capitalist world order.


Most delegates to the Constitutional Convention appreciated this. Yet, because the new Constitution increased the federal government’s power, some delegates (anti-Federalists, later to coalesce around Thomas Jefferson and his Democratic-Republican Party), fearing tyranny, fought for a bill of enumerated rights limiting the federal government. The idea that such a bill would be forthcoming may have been a make-or-break point for ratification.

Counterintuitively, people opposed to including a Bill of Rights (many of them Federalists) replied that it was impossible to enumerate all the self-evident rights that the people retained; that enumerating a few rights would guarantee only those; and that the unenumerated rights would forever be lost (think of the right "to privacy," later discovered in penumbras and emanations, together with the right "to earn an honest living," "to marry a person of the same sex," "to marry more than one person," "to cohabitate," "to fight for your country in spite of being gay," "to suicide," "to ingest any substance," "to enter into contracts," "to make obscene profits," etc. — you get the point).

As we all know, the Jeffersonian anti-Federalists won the battle for the Bill of Rights, mostly because the Federalists’ arguments — vague and hypothetical — did not have the immediacy of the fear of tyranny. The odd hitch — to a modern audience — was that the Bill of Rights did not apply to state governments, only to the federal government. Massachusetts retained a state religion until 1830, and the Bill of Rights wasn’t interpreted to apply to the states until the passage of the Fourteenth Amendment after the Civil War.

In order to allay Federalists’ concerns, the Ninth Amendment to the Constitution stated: “The enumeration in the Constitution of certain rights shall not be construed to deny or disparage others retained by the people.”

Unfortunately, that language has become meaningless — as well it should, if examined critically.

Robert Bork, who failed to be confirmed to the Supreme Court because some senators thought him too conservative, both politically and judicially, has likened the Ninth Amendment to an inkblot. In The Tempting of America he argued that “while the amendment clearly had some meaning, its meaning is indeterminate; because the language is opaque, its meaning is as irretrievable as it would be had the words been covered by an inkblot.” From the Left, Laurence Tribe has said that “the ninth amendment is not a source of rights as such.” The best defense of the Ninth comes from Randy Barnett, a libertarian constitutional scholar (and contributor to Liberty), who says that it “calls for a presumption of liberty.”

Marshall, in spite of his Federalism, understood the problem with the Ninth Amendment and advocated the enumeration of rights. He did not believe in natural rights endowed by a creator; he believed that we the people endow ourselves with rights based on expediency and tabulated as positive law. He was a nuts-and-bolts lawyer who believed that the best laws were those that required the least interpretation. Ironically (as we shall see), though he eschewed grandiose philosophical visions and paradigms, he is best known for defining the role of the Supreme Court in the new United States and establishing a rock-paper-scissors hierarchy among the three branches of government that remains the modern modus operandi.


Marshall’s concern with rights was particularly personal. An admirer of John Locke, Marshall was obsessed with the sanctity of contracts to a degree that today might be considered excessively libertarian. He believed that the terms of a contract superseded statutory limitations. For example, in a minority opinion, he opined that bankruptcy laws could not relieve a debtor from a previous obligation. Likewise, he would have heartily supported the 1905 Lochner v. New York decision that overthrew a statute limiting the working hours of bakers as an infringement of the rights of employees and employers to negotiate their own contracts. He also believed that legislation was a contractual obligation of government to the citizenry. In Fletcher v. Peck (1810), the Court took up an argument Alexander Hamilton had earlier made on behalf of a group of investors — himself included — that a Georgia state law authorizing a land sale was a contract, and that Georgia's Rescinding Act of 1796 invalidating the sale was unconstitutional under the contract clause of the Constitution. Marshall agreed.

These perspectives had the curious effect of another rock-paper-scissors round robin: though Marshall was a strong advocate of the supremacy clause, the phrase in the Constitution stipulating that federal law trumps state law (a big damper on states’ rights), his view of contracts tends to elevate individuals above both state and federal law. But I digress — back to Marshall’s personal concern with rights.

Marshall and his family were speculators in lands west of the Appalachians. Like most libertarians and economists today, Marshall saw nothing wrong with speculation; in fact, he believed that speculators provided services essential to the opening of new tracts for settlement — subdivision, assessment of resources, initial price arbitrage, surveying into lots, market making, etc. The buying and selling of deeds required contracts, the details of which, he believed, were between the parties involved, and should be arrived at with minimal government interference.

But speculators, then as now, had a bad reputation among the substantial portion of the population that didn’t understand their function and thought they were making profits without effort. These folks, Thomas Jefferson prominently among them, believed in the ideal of a republic of small, yeomen farmers. Speculators were just an unnecessary — even an evil — obstacle to that ideal.

Jefferson and Marshall despised each other. Though the Democratic-Republican Jefferson managed to restore his friendship with Federalist John Adams, he could not stand fellow Virginian Marshall.

Marbury v. Madison

Though not the beginning of their feud, Marbury v. Madison — one of Marshall’s two most famous decisions — best summarizes their intellectual conflict. It is the decision that established the power of the Supreme Court to overturn congressional legislation through the principle of judicial review, thereby elevating the Supreme Court to coequality with Congress and the Executive.

Judicial review, already a long-standing legal principle in other contexts, was a power not specifically granted to the newly established Supreme Court. Marshall understood that without it, the Supreme Court could never properly function as one third of the triad it was designed to be, since an effective separation of powers required three equally potent branches of government. It is a complex and convoluted decision, tight in reasoning, and difficult to explain. I’ll give it a shot.

The case began amid the bitter political conflicts of the waning days of Adams’ administration. The (barely) peaceful transfer of power to the opposition was a landmark in the new nation’s development. Still, anti-Jefferson riots were expected in the capital.

In a bold attempt to curtail the new administration’s power, Adams nominated Marshall, his Secretary of State, as the new Chief Justice of the United States. The Federalist-controlled lame-duck Congress not only quickly confirmed him, it also passed a law authorizing the appointment of a number of justices of the peace to govern the District of Columbia in case the riots materialized. Adams immediately appointed 42 Federalist judges.

Jefferson was livid. As Newmyer says:

“Unfortunately for historians, there were no cameras to record the deliciously ironic moment on March 4, 1801, when the new chief justice administered the oath of office to the new president. With his hand on the Bible held by Marshall, Jefferson swore to uphold the Constitution Marshall was sure he was about to destroy. … It was not coincidental that Marshall turned his back to the president during the ceremony. … Jefferson had already concluded that the federal judiciary had to be humbled and ‘the spirit of Marshallism’ eradicated.”


The new appointments were duly signed and sealed but, ominously, not all of them were delivered by the Secretary of State (still John Marshall), whose job it was to finalize the procedure but who had only the last week of the Adams administration in which to comply. When James Madison, Jefferson’s newly appointed Secretary of State (and an author of the Constitution), assumed his duties on March 5 he discovered the remaining undelivered appointments.

Jefferson ordered Madison not to deliver the commissions. Enter William Marbury, one of the prospective justices appointed by Adams. He and three other denied appointees petitioned the Supreme Court for a writ of mandamus directed at Madison, in essence ordering him to comply.

Meanwhile, the new Democratic-Republican Congress repealed the legislation that authorized the appointments in the first place and, adding fuel to the fire, cancelled the 1802 Supreme Court term, Marshall’s first. The intervening period permitted Marshall and his colleagues to ponder the constitutionality of events, the dangers of challenging executive authority head-on by issuing the mandamus, and the formulation of strategy.

For two years, the Court’s powers, or lack thereof, had been debated in Congress and in the court of public opinion. The Court had even been a focus of Jefferson’s political agenda. Specifically, was the Supreme Court subservient to Congress or the Executive or both, or was it equal in stature and power? Marshall was looking for an opportunity to settle the debate, and Jefferson gave it to him when he blocked Adams’ judicial appointments.


In February 1803, the Court came out fighting, opening its term with Marbury v. Madison. Immediately, Jefferson — claiming executive privilege — insulted the Court by refusing to permit US counsel to appear or executive witnesses to be heard. And he continued to stonewall, micromanaging executive witnesses even when the Court established, after much technical to-ing and fro-ing, that it did indeed have jurisdiction in the case, and that it could go forward.

In what was to become his typical fashion, Marshall (with a unanimous Court), decided the case on narrow grounds: the rule of law. He stated that Marbury’s office was vested when President Adams signed his commission; that at that point — irrespective of mundane details — the operation of law began. Marshall, to Jefferson’s great irritation, virtually lectured the new president that he was not above the law.

So, where is the judicial review, the Court’s power to overturn congressional legislation, for which Marbury v. Madison is so well known? In the last six pages of the 26-page opinion, in which the court struck down section 13 of the Judiciary Act of 1789. Marshall's reasoning almost became the proverbial camel passing through the needle's eye.

In the Act, Congress had magnanimously granted the Supreme Court the right to issue mandamus writs, reasoning that, since the power wasn’t specifically granted by the Constitution — and the Court couldn’t very well function without it — it was necessary for the Court to have it.

Marshall disagreed on two major points. First and foremost, he declared that Congress — through simple legislation — could not change the Constitution, and that only the Supreme Court had the power to interpret it. Second, for a variety of reasons, Marshall decided that the Constitution already gave the Court the power to issue mandamus writs.

Confused? There is no doubt that Marshall was out to prove a point and — with some fancy footwork — had to weave a sinuous path to make it. Luckily (and some say, with Marshall’s prodding and collusion) circumstances, timing, allies, and even adversaries, all fell into place for him. Almost all aspects of the decision are still debated, even as to whether it was at all necessary; mainly because many commentators believe the Constitution already implicitly grants the power of judicial review to the Court. In Federalist No. 78, Hamilton opines that not only is judicial review a power of the Court, it is a duty. Not one delegate to the Constitutional Convention argued against the principle.


But critics claimed that the Marshall court had vastly overreached. Jefferson himself believed that the president (and the states) had the power to interpret the Constitution, and he forever fulminated against the decision. Congress, however, was not troubled and took it in stride — which leads Newmyer to conclude that, “put simply, it was presidential power, not congressional authority, Marshall targeted.” The Supreme Court’s power of judicial review was extended over the states in 1816 in Martin v. Hunter’s Lessee, a decision of the Marshall Court.

Jefferson versus Marshall

John Marshall was the longest-serving — and arguably the most important — chief justice. Serving from 1801 to 1835, he presided over the most formative decisions the new country faced. He helped to establish a balanced, effective, and more manageable government, and helped set the tone for the future sparring among the three branches of federal power. During his term, the Constitution became much more than a founding document — it became something closer to accepted law.

Today most of us perceive political parties as somewhere within the Left-Right continuum. It is difficult to see things in any other way: that’s how today’s politics play. But the Federalist versus Democratic-Republican divide was an entirely different one.

Although most people today associate Jefferson with individual rights and a fundamentalist view of the Constitution, and the Federalists with the advocacy of a strong central government, the distinctions are not so facile and clear-cut. For one, the Democratic-Republicans supported slavery, while the Federalists generally opposed it. These positions led the Jeffersonian tradition directly to the policies of Jackson, Calhoun, and, finally, Jefferson Davis; while the Federalist tradition led to Lincoln.

The irony here is that those who were most skeptical of the Constitution are the ones referred to as “strict constructionists,” while the Federalists are regarded as free-wheeling interpreters of its provisions.

The Federalists were the first to see and understand the failure of the Articles of Confederation, so they pushed for change. The anti-Federalists thought that the Articles could be tweaked for improvement and were skeptical about the whole constitutional enterprise. In the end, they accepted it reluctantly — and showed it. To them, “strict constructionism” meant that if the Constitution granted one the right to eat, the right to obtain food didn’t automatically follow. Or, if it granted the right to free political speech, the right of media accessibility for broadcasting that speech — since it wasn’t actually spelled out — didn’t exist. Such thinking, often imbued with deep resentment, led to muddled action, ambivalence, and, sometimes a reversal of roles — with the president himself leading the way.


Jefferson’s cavalier attitude toward the Constitution was shown early in his presidency, with his 1801 attack on the Muslim state of Tripolitania on the Barbary Coast (the coast of present-day Morocco, Algeria, Tunisia, and Libya) without a congressional declaration of war, which contemporary opinion believed he was constitutionally obligated to obtain. Several of the Barbary states had demanded tribute from American merchant ships in the Mediterranean. When the Americans declined, the Pasha of Tripoli captured several seamen and held them for ransom. Jefferson, expecting an immediate victory, ordered a squadron of ships to destroy the Muslim navies. The war dragged on for almost 15 years.

But it was the Louisiana Purchase — the constitutionality of which even Jefferson was skeptical about — that was really troublesome. For starters, the Constitution did not empower the federal government to acquire new territory without the consent of every state (as per Andy P. Antippas' view in his History of the Louisiana Purchase). Some of the articles of the Purchase Agreement were also in violation of the Constitution because they gave preferential tax treatment to some US ports over others; they violated citizenship protocol; and they violated the doctrine of the separation of powers between the president, Congress, and the judiciary.

As Antippas recounts:

“Jefferson and his fellow Republicans were 'strict constructionists.' i.e., they allegedly adhered to the letter of the Constitution and were strong proponents of 'state’s rights' and 'limited government;' however, Jefferson and most of his party members chose simply to ignore all the Constitutional issues as merely philosophical for the sake of expediency — Jefferson’s response to his critics was 'what is practicable must often control what is pure theory' — in other words, 'the end justifies the means.'"

Jefferson’s specious argument to his critics was that the federal government's power to purchase territory was inherent in its power to make treaties. The Senate bought that argument and ratified the Louisiana treaty.

In their individual approaches to personal liberty, Jefferson’s and Marshall’s actions speak volumes. As I’ve already mentioned, Marshall was fanatically laissez-faire, while Jefferson favored greater economic regulation for what he thought was the good of society. Specifically, Jefferson favored a society of agrarian smallholders and did not approve of speculators buying up western lands as soon as they were available — he wanted smallholders to get in on the action right away. He did not understand the redeeming socioeconomic value of speculators, abhorred their — in his view — unearned profits, and advocated restricting or eliminating these — again, to him — unnecessary middlemen, prominent among whom were Marshall and his family.

Both were Virginians and slaveholders, but their treatment of slaves differed markedly. Jefferson is known to have beaten his slaves; there is no evidence that Marshall ever did. In his will, Marshall granted his slaves more liberty than we might intuitively suppose today. He gave them two options upon his death: liberty, with severance pay, so they could set themselves up independently (or emigrate to Liberia); or continued servitude, in case the radical transition to liberty was more than they could handle. Jefferson's slaves, by contrast, voted with their feet: in 1781, near the end of the Revolutionary War, 23 of them escaped to the British.

Marshall's and Jefferson's approaches to Native Americans were even more illuminating. Though Jefferson’s words spoke respectfully, even admiringly, of the noble savage, his policies began the trail of tears that would destroy cultures and result in the reservation system.

As soon as Louisiana was purchased, Jefferson embarked on a cold-blooded policy toward Native Americans. In a lengthy letter to William Henry Harrison, military governor of the Northwest Territory, he explained that the nation's policy "is to live in perpetual peace with the Indians, to cultivate an affectionate attachment from them by everything just and liberal which we can do for them within the bounds of reason." But he goes on to explain "our" policy (presumably his own, and that of the United States) on how to get rid of every independent tribe between the Atlantic states and the Mississippi, through assimilation, removal, or — if push came to shove — "shut(ting) our hand to crush them." Finally, in secret messages to his cabinet and Congress, Jefferson outlined a plan for the removal of all Native Americans east of the Mississippi to make sure that the land would never fall to the French or the British, who chronically supported the Indians in their disputes against the US.

Marshall, in contrast, did everything he could to prevent the confiscation of Indian land and the eviction of the Indians from Georgia, in a series of cases collectively known as the Cherokee Indian cases.

As a young man, while serving in the Virginia House of Delegates in 1784, Marshall supported a bill that encouraged intermarriage with Native Americans. Three years later, in the Indian slave case of Hannah v. Davis, he argued successfully that Virginia statute law prohibited the enslavement of Native Americans.

The Cherokee Indian cases, too long and complicated to detail here, came before the court in the early 1830s, when Andrew Jackson was president. The final one, Worcester v. Georgia, turned on the supremacy clause. In a bald-faced land grab, Georgia had declared sovereignty over Cherokee lands. The Cherokees sued. The Marshall court decided that Indian affairs were the province of the federal government alone; therefore the Georgia statutes that claimed control over Indian lands were null and void. But neither Georgia nor Jackson — both strong states’ rights advocates — nor Congress supported Marshall’s decision. Jackson is reputed to have said, “John Marshall has made his decision, now let him enforce it.” Congress, meanwhile, had passed the Indian Removal Act of 1830, which Jackson heartily endorsed. It was a Pyrrhic victory that few Cherokees savored on their forced march along the "trail of tears and death" to Oklahoma.

McCulloch v. Maryland

Besides judicial review, another fundamental issue remained to be addressed in order to make the Constitution something closer to ultimate law, as opposed to simply a guiding, founding document: objective guidelines for practical interpretation. Other than the Federalist papers, which were not law, firm guidance about constitutional interpretation was lacking.

At one end of the spectrum was the strict fundamentalist approach (see examples already mentioned above), akin to religious fundamentalist interpretations of the Bible: virtually no interpretation. At the other extreme were freewheeling, almost poetic readings — complete with what today are called “penumbras and emanations.” With Holmesian effort, one could interpret anything, anywhere, in any way. This was not only impracticable law, but a recipe for tyranny. In 1819 Marshall got the opportunity — again, with some prodding and collusion on his part — to set the standards.

The Constitution had given Congress the power “to coin money and regulate the value thereof.” Under that clause, the first Congress — with the approval of President Washington — chartered the first Bank of the United States, legitimized by its power to coin money. Though Jefferson opposed it, he left the bank in place. He and his political friends referred to it as "Hamilton’s bank." Being strict constructionists, they argued that it was Congress that had been empowered by the Constitution to coin money, not the Bank. However, four years without the bank during the War of 1812 spoke eloquently about the value of its services. When a charter for a second Bank of the United States (the first charter ran for only 20 years) was introduced in 1816, it had the support of President Madison, who signed the bill into law. Only Virginia's congressional delegation voted (11 to 10) against the bank.


Though the Constitution had empowered the federal government to coin money, it had not explicitly barred states from doing so. Virginia, a staunch states’ rights advocate, kept its Bank of Virginia, headquartered in Richmond. And there was one of the rubs: the branch of the Bank of the US stationed in Richmond was too much competition for the alternative, state banking system.

Exacerbating the dispute was the mismanagement of the Second Bank of the United States, which — shades of today’s crisis — had provided easy credit for a land boom in the south and west. When the bank called in its improvident loans to state banks — to cover its own debts, which it had improvidently incurred — the default of banking institutions swept like wildfire across the southern and western states.

Battle lines were drawn. The states moved against the national bank. Ohio, in the most radical reaction, outlawed the Bank of the United States, invoking the theory of “nullification,” according to which states could cherry-pick federal laws, rejecting whichever they chose. Nullification had been around since 1798; it gained support from none other than Jefferson and Madison. (Though it must be admitted that when pressed about the constitutionality of nullification, Madison hedged and declared that it was an extraconstitutional option.) Taken to extremes, nullification implied the right to secede, with each state being judge of the constitutionality of its own cause — and, as Ohio later tried to do with McCulloch v. Maryland, the right to reject Supreme Court decisions. Though the Civil War and numerous Supreme Court decisions have hacked both the legs and arms from nullification, like the Black Knight in Monty Python and the Holy Grail it keeps coming back for one more round. Russell Pearce, an Arizona state senator, is sponsoring the latest nullification bill. Libertarians should reflect on the fact that nullification cuts both ways: it is at least as likely to be used to nullify as to uphold individual rights.

To cover its debts, Maryland passed a law taxing — in the most punitive and unconventional manner — the Bank of the United States. James McCulloch, head of the bank's Baltimore branch, refused to pay the tax. Maryland sued.

When the case finally reached the Supreme Court in 1819, Marshall found for McCulloch — less than a week after the conclusion of oral arguments, leading some to wonder whether he’d written the decision before hearing counsels’ arguments. As well he might. Not only was the case “arranged” (both parties sought an expeditious decision), but Marshall apprehended the issues immediately, commenting to a fellow justice that “if the principles which have been advanced on this occasion were to prevail, the constitution would be converted into the old confederation.”

Many principles were in play in Marshall's decision: federal supremacy over the states within a constitutional sphere (Maryland could not punitively tax a federal institution), judicial review (reaffirmed), nullification (denied — state action may not impede valid constitutional exercises of power by the federal government), and, finally, the most important issue, implied powers.

Invoking the necessary and proper clause of the Constitution, Marshall declared:

“Let the end be legitimate, let it be within the scope of the constitution, and all means which are appropriate, which are plainly adapted to that end, which are not prohibited, but consist with the letter and spirit of the constitution, are constitutional.”

Here, finally, was an objective guideline for the interpretation of the Constitution. Accordingly, the constitutionality of the Bank of the United States was established beyond doubt. By extension, its heir today, the Federal Reserve Bank, is a valid, constitutional entity empowered by Congress “to coin money and regulate the value thereof” — however much we may disagree with its methods and their effects.

* * *

Much controversy over the Constitution and its meaning continues — witness the Russell Pearce case and the calls for the abolition of the Federal Reserve. Even the Bill of Rights is not fully settled law. During recent arguments before the Court, Justice Elena Kagan sought to minimize the importance of an attorney’s statement, with which she disagreed, by referring to “buzz words”: “heightened scrutiny” and “rational basis.” These words refer to the standards a court should employ in assessing the impact of governmental action that may affect individual rights. As Chip Mellor of the Institute for Justice has stated in Forbes, contemporary judicial activism has “creat[ed] a hierarchy of rights with those at the top (like the First Amendment) receiving relatively strong protection — the heightened scrutiny — and those at the bottom (property rights and economic liberty) receiving very little,” since these latter are thought to require only “rational basis” review.

The Constitution is only "settled" law in the sense that nearly all Americans accept it as not only our primary founding document, but also as the lawful basis for our government. In many other respects, it is far from "settled" — witness the extremely varying interpretations ascribed to it and the continuing legal battles over exactly what it means and how to apply it. John Marshall, in Marbury v. Madison and McCulloch v. Maryland, set parameters within which that debate should productively take place. Understanding those two cases — and Marshall's perspective — is essential to a knowledgeable understanding of our government's structure and powers.

As libertarians, we can be most effective if we work within the framework of accepted law to protect and extend liberty, rather than making ineffective flanking attacks from the swampy fringes, armed with quixotic arguments. The Constitution must be scrupulously and objectively interpreted, and with due respect for Marshall’s great tradition: first, as Randy Barnett has suggested, according to “original meaning” of the words used at the time; then, according to “original intent” — a less stringent bar, requiring interpretation of documents such as the Federalist Papers. This approach to interpreting the Constitution is a firmer bulwark for liberty than the well-intentioned but murky intellectual musings of Jefferson, which — though noble and intelligent — are no substitute for tight legal reasoning.






Irene: The Man-Made Disaster


I am a victim of Hurricane Irene.

My friend and I were visiting New York when Irene “struck” early today — Sunday, August 28. We had plane reservations to leave the city on Saturday, August 27. Delta Airlines canceled our reservations on Friday afternoon. It, like all the other airlines, abandoned traffic to New York more than 24 hours before any hurricane could possibly have caused trouble at the airports. Because of these cancellations, travel throughout the nation was convulsed.

None of this was necessary, or wise, or profitable to anyone. It was the result of a panic induced by government and media, and willingly indulged by the kind of corporations that have acquired the worst characteristics of both — arbitrary power and a zest for misinformation. When our reservations were zeroed out, we were emailed, almost a day after the fact, “Your flight has been cancelled” (no apology, no explanation); then we were told that “we have rebooked you on another flight” — two days later. Notice the transition between the passive voice, which people in power reserve for the bad things they do, and the active voice, which they choose for the good things they don’t do. Our flight wasn’t rebooked by the airline; it was rebooked by us, after we pestered the airline and they eventually returned our call, and after we were unable to rebook it on the airline’s website, which wasn’t working. The woman who finally assisted us acted as if it were an amazing idea that we should be reimbursed for the downgrade of our tickets from first class to coach.

But let me report a few highlights of this ridiculous exercise in misinformation and authoritarianism, by which all America was damaged by a minor storm.

On Wednesday, ABC reported that the hurricane, then reputedly a category 3, or maybe 2, “could be category 4 by Thursday.” Other media, including the Weather Channel, suggested that it would be. When the hurricane came ashore in North Carolina on Saturday, it was barely a category 1, something that the media geniuses never believed could happen to their darling, “the hurricane of a lifetime,” although normal people easily guessed it. By Saturday evening, Irene was visibly disintegrating, had lost its eye, and was about to become a mere tropical storm, and not an especially strong one. Yet at that time, the mayor of New York was strongly advising all people to stay at home between 9 p.m. Saturday and 9 p.m. Sunday, had closed all mass transit at noon on Saturday, had sent his goons out to advise people living in 30-story buildings that they ought to evacuate, because the park next door might flood, and was telling workers to plan on mass transit still being shut down during their Monday morning commute. He seemed to enjoy himself, decreeing fates like that.

Businesses were closing everywhere in Manhattan, because of the mass transit shutdown, but my friend and I found a restaurant, “Da Marino,” that promised to be open on Saturday evening, and on Sunday evening if possible. To deal with the transit problem, the management had rented rooms for their employees in a hotel next door. So on Saturday night we enjoyed a good meal and listened while people accurately identified Bloomberg as the man who was causing the mess. But most merchants had shut down on Saturday afternoon, or failed to open that day at all. All Starbucks stores shut down. Pastry shops that cater to the local hotel business shut down, even though they had a captive mob of customers. Madame Tussaud’s shut down. Even churches canceled their Sunday services. Leaving Da Marino after an excellent dinner, served to customers reported to be more numerous than at any time in the restaurant’s history, my friend and I looked down Broadway from 49th Street to Times Square. The lights were on, but there was no crowd, no life, no business. A few people drifted across the street, in posses of two or three. Official vehicles could be seen in the distance, idling and flashing their lights. A faint drizzle of rain came down. That was the Great White Way on Saturday evening, August 27.

And why? Because the official class decreed that there should be a disaster.

Back in our hotel, we turned on the disaster reports on TV. Local news was enthusiastic about a picture of Grand Central Station standing empty except for cops who were there to fend normal people off. “No reason why you should go there anyway,” the news anchor said. A young newsperson, standing on location amid a few drips of water, predicted that soon, very soon, the neighborhood in which he stood would be hopelessly flooded. Anchorpeople advertised the fact that 4,000 people were now without electricity in New Jersey, New York, and Pennsylvania, not stating how many of the millions who live in those areas are without energy at any normal time. The electric company, prompted by the mayor, threatened to cut off energy “preemptively” to large areas of New York City, allegedly to protect its equipment against flooding. And to make matters worse, yet another of Bloomberg’s constant news conferences was threatened.

My friend and I fell asleep. When we awoke at 10 on Sunday morning, the rain had gone; the sun was shining; and people were walking the streets, sans umbrellas, hunting for places to eat. Places to enjoy. Places to honor with their business. Places that had survived the onslaught of paternalism.

Soon we will hear how many billions of dollars Hurricane Irene cost the nation. But remember: the hurricane itself was responsible for virtually none of those losses. This was a manmade disaster.






More on Government Motors


Earlier this year, President Obama went on one of his gloating tours, touting the wisdom of his nationalization of General Motors and Chrysler. Theirs was a corrupt bankruptcy that strongly rewarded the UAW, one of Obama’s major financial contributors. The new GM then posted a few months of improved sales, leading to much crowing by all the corrupt cocks.

But lately, the road for what is derisively termed “Government Motors” has become rather bumpy, as a recent story illustrates. The report describes how the New GM is trying desperately to win dismissal of a class action lawsuit filed on behalf of 400,000 Chevy Impala owners.

The suit, filed by one Donna Truska, argues that the Impalas — made between 2007 and 2008 — had defective rear spindle rods, leading to rapid tire wear. The plaintiff claims that GM has breached its warranty, and demands that GM fix the cars.

But the New GM argues that since the cars were made by the Old GM, it is not liable for the repairs, and the 400,000 Impala owners should therefore go to hell. Of course, the New GM was only too happy to take over the Old GM’s losses — which it can use to offset taxes on its future profits, stiffing other taxpayers — but it doesn’t want to assume any liabilities.

And of course, back in March of 2009, as GM headed toward bankruptcy, Obama promised that in any action he took to “save” GM, consumers would have their warranties honored. As he trumpeted at the time, “Let me say this as plainly as I can. If you buy a car from Chrysler or General Motors, you will be able to get your car serviced and repaired just like [sic] always. In fact, it will be safer than it has ever been. Because starting today, the United States will stand behind your warranty!”

Another Obama lie, of course. He stood behind GM warranties about as much as he has stood behind the American dollar . . .

Meanwhile, shares of the New GM hit a new low of $22 a share.






Seen and Unseen


Recently, President Obama stumbled through a poorly conceived bus tour of several states in the Midwest. The object of the junket seems to have been to counter media coverage of the GOP presidential candidates who’d gathered in Ames, Iowa, for the first major straw poll of the 2012 election cycle. Instead, Obama made comments about car- and truck-manufacturing that reminded listeners of his central-planning mindset. He treated a group of Tea Party leaders in a haughty and condescending manner. And, most damning, he stammered through the following self-justification:

“We had reversed the recession, avoided a depression, gotten the economy moving again. But over the last six months, we’ve had a run of bad luck.”

The man is not good at improv. And he’s not well-read. Numerous pundits (not all of them right-leaning) noted that the president’s excuses reflected this famous quote from the great Robert Heinlein:

“Throughout history, poverty is the normal condition of man. Advances which permit this norm to be exceeded — here and there, now and then — are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from creating, or (as sometimes happens) is driven out of a society, the people then slip back into abject poverty. This is known as ‘bad luck.’”

How could Obama, a man who trades on being seen as smart and articulate, make such a boneheaded gaffe?

I think this has to do with the ignorance and insular nature of American statists. They operate under a simplistic notion of politics — call it “Manichean,” if you feel like being generous about their philosophical grounding, “infantile” if you don’t.

Their adversaries are “enemies”; and their enemies are “terrorists,” “extreme,” and “crazy.” (The quoted terms in that last sentence are from a few recent articles posted on dailykos.com — but Obama and his underlings have used them, too.)

These mutterings reflect a shallow worldview. American collectivists haven’t read the books that define and expand on free-market philosophy; most justify their ignorance by dismissing Hayek, Mises, Rand et al., as “evil.” Instead, they seem to skim some magazines and websites. Mostly, though, they watch TV. And they focus on the personal manners (and lives) of limited-government advocates, rather than the substance of the positions.

Obama’s supporters focus on (and equate themselves with) the weakest and least rational of the president’s critics — a motley crew of bigots and conspiracy mongers. As a result, Obama’s supporters weaken themselves. They can’t understand that there are rational criticisms of a president who has done so much damage to the philosophical and political foundations of the United States.

They just don’t see.

This blindness has rendered Rep. Ron Paul — the most effective advocate of real limited government among the recognized presidential candidates — something of an invisible man.

Because you read Liberty, you know more about Dr. Paul and his latest campaign for the White House than do most Americans. But, for a moment, put yourself in the shoes of an ordinary salt-of-the-earth citizen or even an impassioned Obama supporter. You’d probably have only the vaguest sense of who Ron Paul is. And you wouldn’t understand how many people share Paul’s perspective and beliefs. So when Paul finished a razor-close second in the aforementioned Iowa straw poll, you’d fall back on your epithets. Or just deny the whole thing.

Rep. Debbie Wasserman Schultz (whose retiree-heavy Florida congressional district gobbles up more than its share of federal benefit dollars) took the first option. Here’s some of the invective she hurled about the Iowa straw poll results, via CNN:

“In previous presidential campaigns, we might have chalked extreme fringe-type candidates like Michele Bachmann and Ron Paul as an anomaly. . . . But we’re looking at the core of the Republican Party now. The heart of the Republican Party is the extreme right wing.”

No surprise. A woman who counts on scaring pensioners to maintain her livelihood is bound to vilify people who talked about benefit cuts. I’d just like, once, to read a quote from the wretched Ms. Wasserman Schultz that didn’t include the word “extreme.”

Most statists, though, have simply chosen to pretend Paul doesn’t exist. He and Rep. Michele Bachmann finished in a near-tie for first place in the Iowa straw poll, separated by less than 1% of all votes cast. Mrs. Bachmann won but, if the vote had been an actual election, many jurisdictions would have called for an automatic recount. The next Sunday, Bachmann appeared on all five of the so-called “major” weekend TV news programs; Paul appeared on none.

Bachmann — whose public persona strikes some as addled — serves as a stand-in during this election cycle for the absent Sarah Palin. Perhaps that’s why the establishment media revels in making Bachmann look ridiculous, as a recent and unflattering cover picture on Newsweek magazine proved. Media outlets that still favor Obama seem to be following a strategy of portraying Bachmann as “crazy” and, therefore, any Republican challenger to the president as crazy by association.

Paul is included in this scheme. But, mostly, he’s simply ignored. And this isn’t just a left-wing phenomenon. Fox News Channel’s top-rated host Bill O’Reilly, a statist of a nominally “conservative” stripe, goes out of his way to ignore Paul. And, when pressed, O’Reilly dismisses Paul’s chances of winning even the GOP nomination as “zero.”

In the days after Bachmann’s media blitz, a slight shaft of light — from an unexpected source — cut through the willful darkness. TV talk show host and topical comedian Jon Stewart ran a humorous segment pointing out the media’s obvious denial of Paul’s presence and popularity. Stewart referred to Paul as “the 13th floor” of the presidential news coverage and took cable TV reporters to task for blatantly ignoring the congressman’s close second-place finish in Iowa.

The New York Times, the Associated Press and U.S. News (yes, it still exists as an online news site . . . but has dropped “and World Report” from its name) followed Stewart’s satire with semi-serious articles that discussed the media’s dismissal of Paul, in Iowa and in general.

The Associated Press piece acknowledged that Paul has raised enough money to stay in the presidential race for a long time. And that his supporters are more dedicated than most. But it concluded:

“Still, Paul finds himself outside the bounds of traditional Republicans. His opposition to the wars in Iraq and Afghanistan defines him as a dove. His skepticism toward the Federal Reserve has spooked Wall Street. And his libertarian views on gay rights draw the ire of social conservatives. He also tweaks Republicans on foreign policy, arguing it isn’t the United States' role to police Iran's nuclear program or to enforce an embargo with Cuba. ‘Iran is not Iceland, Ron,’ former Sen. Rick Santorum told Paul during Thursday's debate.”

An article on presidential politics that quotes Rick Santorum as an authority on anything is suspect, in my opinion.

The U.S. News piece concluded lazily by quoting an establishment media hack to characterize Paul’s candidacy:

“ 'He’s got a very dedicated cadre of people,’ says Larry Sabato, director of the University of Virginia’s Center for Politics. ‘And they're very intense, but they’re relatively few in number . . . It’s ridiculous talking about him getting the nomination.’ ”

Prof. Sabato’s record on political predictions is no more reliable than former Sen. Santorum’s.

I have no idea how well Ron Paul will do in the coming presidential primaries. But I know that he has the money and the organization in place to campaign until the GOP convention. And I know that he’s announced he won’t seek reelection to his seat in Congress, so that he can dedicate himself to this presidential run.

I hope he lasts long enough to force more of the establishment media and the GOP powers-that-be to acknowledge he exists. And that his arguments for limited government are as mainstream as anything the rent-seeking Ms. Wasserman Schultz has to say.

Now, if we can only get them to acknowledge Gary Johnson . . .






The Best of the Alien Films


Our summer of the aliens ends with the best alien encounter movie of the decade. Attack the Block has it all: mysterious creatures crashing out of the sky; kids on bicycles pedaling to save the planet; a mass of hairy apes climbing up buildings; and avowed enemies uniting against the invaders. Add to this a truly libertarian hero who learns that "actions have consequences," and enough blood to paint an elevator. What more could you want from a summer movie?

You might not have heard of Attack the Block, but you probably know its pedigree. It's a British film produced by Edgar Wright, who made Shaun of the Dead (2004) and last year's Scott Pilgrim vs. the World. It's directed by Joe Cornish, who was also involved in Hot Fuzz (2007) and the upcoming Tintin. I have to admit, these films are an acquired taste, but I think they are a taste worth acquiring.

The story takes place in a neighborhood of high-rise apartment buildings in the poor part of south London. As Sam (Jodie Whittaker) walks home from her job as a nurse, she is mugged by a gang of threatening young men in ski masks. Their crime is interrupted by an alien falling out of the sky and into a car right next to them, and Sam is able to run away. The rest of the film follows the young thugs as they first try to make money from the beast and then run for their lives as the creature's larger pals come looking for it.

One of the unexpected delights of this film is the way we get to know the boys themselves. These are not hardened criminals but novice thugs on bicycles who strut down the street to impress each other while surreptitiously calling home to reassure their parents that they will be back by ten. Interestingly, Joe Cornish says he was inspired to write this film by being mugged by a gang of boys who seemed as scared as he was. They are led by a young tough with the unlikely name of Moses (John Boyega), who turns out to be quite the leader — almost like the preacher in The Poseidon Adventure.

Moses recognizes that they can't rely on the police to help them, or even to believe them, so they must rely on themselves to escape the aliens and save the block. They don't seem to feel it is their responsibility to save the world, just their own little corner of it. As a libertarian, I like that. And then there are the unexpected side characters: the crazy drug dealers who get involved, the little wannabes who call themselves Probs (Sammy Williams) and Mayhem (Michael Ajao) . . . and the rich kid wannabe . . . and the crazy weapons . . . and clever lines . . . Just trust me. It's a great movie. And the less you know in advance, the better.

This is the best kind of sci-fi horror movie. Early encounters with the aliens take place off screen or behind walls, with sudden quick bursts of teeth or fur that don't let us focus enough to see what they look like. We just know they are terrifying. We see them creeping through the shadows, with occasional glimpses of their neon-bright teeth, but we don't have a full view of the creatures until at least halfway through the film. To be sure, there's enough blood and gore to warrant the R rating, but the violence is brief and somehow fun.

Give Attack the Block a try. You'll be laughing with horror and screaming with delight.


Editor's Note: Review of "Attack the Block," directed by Joe Cornish. Studio Canal, 2011, 87 minutes.





No, Really — Why Does He Do That?


Have you ever noticed that President Obama runs whenever he sees a set of stairs?

He always (check this out, and you’ll find I am right) runs up and down the steps to his plane. If there are steps to the platform where he’s going to speak, he runs up and down those steps. I mean, even when he’s speaking in the White House, where he lives, he thinks he needs to run the two steps to his podium.

Now he’s touring the Midwest on a gargantuan bus that is supposed to make him look like a normal Midwesterner. Good luck. But the vehicle has the usual three steps between the ground and the body of the thing. So after every speaking engagement, Obama hauls off and runs up the steps of the bus. He runs up three steps.

I don’t mean that he walks fast. I mean that he runs like a junior high school kid doing his first competitive sprints. And he doesn’t just run the steps, he takes off from ten feet away, as if he needed to win third place at the Kalamazoo County JV meet and knew that you’ve gotta show your hustle if you wanta win. His arms are pumping, his legs are striving to please the coach, and his demeanor is like, “I can do it! I can run up these steps.”

For God’s sake, what kind of person is this?






"The Help" Deserves the Buzz


The Help is the film everyone has been talking about this week. Based on the bestselling novel of the same name by Kathryn Stockett, it has been eagerly awaited by book club members and sensitive readers nationwide since it was published two years ago. The film provides an intimate look at the often-demeaning relationship between white women in Mississippi and the black maids who served them during the turbulent 1960s.

During this time, women up north were beginning to recognize the vast career options available to them. But in the Deep South, women were still staying at home with their children, joining the Junior League, hosting bridge clubs, and criticizing "the help" — and each other. In this story, Hilly Holbrook (Bryce Dallas Howard) is the "queen bee" whose opinion matters to everyone, black or white. She controls the social life of the town by voicing her opinions firmly and then leads the shunning of anyone who dares to disagree with her. Her kind of female has always existed, of course, and not just in the South. She has been immortalized in such films as The Women and Mean Girls, and can still be found controlling social groups, PTA meetings, cheerleading squads, and even board rooms, with a raised eyebrow and a withering look. No one likes her, but no one dares to cross her.

In the story, Hilly has been leading her group of friends since grade school. All of them are now married with children, except Skeeter (Emma Stone), who has chosen to finish college and wants to become a writer. She lands a job at the local newspaper as an advice columnist answering questions about house cleaning. Ironically, of course, Skeeter has never polished a spoon or scrubbed a bathtub ring in her life. So she turns to "the help" for help, in the person of Aibileen (Viola Davis), her friend Elizabeth's maid. Eventually she convinces Aibileen and a dozen other maids to share their stories, and a book is born.

Aibileen is what Skeeter ought to be. Like many white college graduates, Skeeter simply "wants to be a writer." She doesn't have a burning topic just itching to come out. She wants the title of "writer" as much as she wants the occupation. When she applies for a job at Harper & Row, the editor (Mary Steenburgen) tells her, "Write about something that disturbs you, particularly if it bothers no one else." Skeeter looks for a topic that will allow her to become a writer, rather than using her writing to expose a problem she cares deeply about. Aibileen, by contrast, is simply a writer. She writes every night for an hour or two. She writes what is in her soul. She writes her prayers.

In many ways, Viola Davis as Aibileen carries the show and at the same time embodies the central conflict of the story. I say this because, although Davis is one of the finest actors in Hollywood, with an Oscar to her credit, you will seldom see that accolade in print without the modifier "black actress." As a nation we are proud of how far we have come in terms of civil rights: our schools and neighborhoods are fully integrated. We have a black president in the White House. But we still notice racial differences and often act accordingly. I would love to ask Davis how she feels about the roles she has been offered.

Equally impressive is Octavia Spencer as Aibileen's best friend, Minny Jackson, an outspoken maid who has lost so many jobs because of her sassy back talk that she now works for the last woman in town who will hire her — Celia Foote (Jessica Chastain), who is shunned by the ladies because of her "white trash" background. Celia doesn't know the rules of maid-employer relationships. Ironically, Minny teaches Celia the boundaries she and the other maids are trying to expose with Skeeter’s book. Spencer's large liquid eyes alternately shine with sharp-witted laughter and melt into pain-filled tears. If Aibileen is the soul of this black community, Minny is its heart.

Having read the book, I wasn't pleased to learn that the beautiful Emma Stone had been cast as the tall, skinny, unattractive Skeeter, since her gangly appearance is such an important part of her character. But somehow Stone manages to look like a plain Jane in this film — her eyes are too big, her lips are too thin, her hair is too curly, and her face is too pale. In short, she is perfect.

Despite having grown up in Jackson, Skeeter really doesn't fit in with her snooty friends. She is disturbed by Hilly's insistence that Elizabeth install a separate bathroom for Aibileen. In fact, Hilly wants a law mandating separate facilities in private homes, "for the prevention of disease." This prompts Skeeter to examine the way maids are treated by the women who employ them. "Colored women raise white children, and twenty years later these white children become the boss," she muses. "When do we change from loving them to hating them?" Aibileen observes the same dilemma: "I want to stop that moment coming — and it come in ever white child's life — when they start to think that colored folks ain't as good as whites."

Toilets, and the material that goes into them, become the strongest recurrent image in this film. From diapers and potty training to vomiting and pranks, toilets are a symbol for what was wrong with the "separate but equal" policy in the South. The facilities were separate, but they most assuredly were not equal. Aibileen's bathroom is a plywood closet located in a corner of the garage with a bare bulb hanging from a wire, and toilet paper resting on a bare 2x4. The symbol, which emphasizes how badly blacks could be treated by whites in those days, provides moments of both shame and laughter.

However, the film misses the richer, darker, and more sinister tone that underlies the book. For black women to write about their employers was no joke, and the book makes it clear that its women are risking real dangers when they decide to tell the truth. Permanent job loss, physical violence, and even jail are real threats in a society where the mere accusation of a crime can lead to vigilante justice with lifetime consequences. By showing this clearly, the book gains a tension and suspense that is missing from the film.

Strangely, I found it more difficult to enter the minds and lives of the maids while watching the film than I did while reading the book. The story is told through the three voices of Aibileen, Minny, and Skeeter, who narrate alternating sections of the book. These voices are strong and rich, and I could enter their worlds, empathizing with their experiences vicariously. In the film, however, I was merely an observer. I often felt defensive, rather than empathetic, about what I was seeing, as though I were somehow responsible for the actions of those women long ago, simply because I am white. If we learn anything from our battle for civil rights, however, it is that each person should be judged individually, and not collectively as part of a race.

The most important question asked by The Help is this: how did these southern women go from loving the black maids who reared them as children to degrading them in adulthood? Stockett, who was reared in Mississippi by a black maid whom she says she loved, suggests that they learned it from their mothers, by example as well as by instruction. To quote Oscar Hammerstein in South Pacific, racism "has to be carefully taught." But books like this also suggest that children can be carefully taught not to be judgmental. Every day Aibileen tells Elizabeth's little girl, "You is smart. You is kind. You is important." She says nothing about little Mae Mobley's appearance, good or bad. Knowing that she will likely be fired or retired before Mae Mobley reaches her teen years, Aibileen hopes desperately that these words will be enough.

As is often the case, the film is good, but the book is so much better. Don't take a shortcut this time. Read The Help first, and then see the movie. You will enjoy both so much more if you do it that way.


Editor's Note: Review of "The Help," directed by Tate Taylor. Dreamworks, 2011, 137 minutes.





Global Warming Updates


Two recent stories concerning the theory of Anthropogenic Global Warming (AGW) caught my eye and are worth noting.

The first is the news from Forbes that a recent study of NASA satellite data from the last decade (2000–2010) shows that far more heat is escaping the earth’s atmosphere than has been predicted by AGW computer models. This in turn means that there will be far less global warming than predicted by those models, which were used by the UN climate science panel that took a dire view of the planet’s “warming.”

As the study’s co-author, Dr. Roy Spencer — a climate scientist at the University of Alabama at Huntsville and Science Team Leader for the Advanced Microwave Scanning Radiometer on board NASA’s Aqua satellite — put it, there is a “huge discrepancy” between the empirical, observational data coming from NASA’s Terra satellite and what has been predicted by the climate warming crowd.

If you still have the quaint and antiquated notion that scientific theories ought to comport with observed data, this gap is, to say the least, disconcerting.

But the Terra satellite data are consistent with earlier data from another NASA satellite (the ERBS satellite) from an earlier period (1985 to 1999), which showed that vastly more long-wave radiation (and therefore heat) escaped the atmosphere than was predicted by the global warming models. We now have a quarter of a century of data from two different satellites, pretty much saying the same thing.

The problem for the computer models seems to be that they predict that the increase in CO2 will cause an increase in atmospheric humidity and cirrus cloud cover, which in turn will trap heat, but the data seem at variance with the prediction. Curious, no?

The second story is about the scientist — one Charles Monnett, to be precise — who published an influential article in the journal Polar Biology in 2006 urging the claim that polar bears were drowning in the Arctic Ocean, presumably because the ice had melted from global warming. The article was based on Monnett’s observations, and this “peer-reviewed” article became an instant hit in the world of environmental activists. The article helped bring the polar bear to the forefront of the worldwide enviro movement. For example, the allegedly beleaguered animal figured into Al Gore’s movie, An Inconvenient Truth, which showed sad polar bears — oh, so cute and cuddly! — swimming desperately in search of ice.

That research was then cited in 2008 when the Department of the Interior decided to put the polar bear on the endangered species list. And it is frequently used as part of the evidence that global warming is an imminent threat to animal life, so we need massive policy changes, with potential costs in the trillions.

Now, this particular bit of “science” should have aroused some scrutiny before, because it reeks of tendentious incompetence at work. The observational base of the study allegedly consisted of four (count ’em, four) polar bear carcasses floating in the ocean, observed from a plane flying at an altitude of 1,500 feet, on a research expedition studying — whales! No autopsy was done on the bears to see if they had drowned; their drowning was just “inferred.”

Note: the internal “peer review” panel included Monnett’s wife! “Yeah, Honey, your paper looks super! Please pass the pasta . . .”

Monnett is now under investigation for scientific misconduct by the Department of the Interior’s Inspector General’s Office, and has been placed on administrative leave from “the federal agency where he works.” Researchers are talking to him and his research partner about their work. Monnett’s career may wind up looking like . . . well . . . a dead polar bear from 1,500 feet up.





© Copyright 2013 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.