Kashmir: The Constant Conflict


On February 26, 2019, the Indian Air Force conducted a raid inside Pakistan — its first since 1971 — allegedly hitting a terrorist training camp and killing more than 250 terrorists. Pakistan showed photographs of damage to a tree or two. According to Pakistani officials, no one died and no infrastructure was damaged.

It is hard to know the truth, for India did not provide any evidence, nor did Pakistan allow journalists access to the site. Both governments blatantly lie to their citizens, retailing falsehoods so hilarious that even a half-sane person could see through them. But drunk on nationalism, Indians and Pakistanis normally don’t.

India’s intrusion was in response to a suicide car-bombing on February 14 in Kashmir, a bombing that killed 45 troops. The Indians were moving a convoy of 2,500 personnel, who were traveling in buses, not in the armoured vehicles officially claimed. Challenging the army is sacrilegious, so no one asks why the movement was so badly planned, or why the troops had not been airlifted, which would have been far cheaper and easier.

Both governments blatantly lie to their citizens, retailing falsehoods so hilarious that even a half-sane person could see through them.

In all the ramping up of emotions in the aftermath of the suicide bombing, it became very clear that the Indian Prime Minister, Narendra Modi, would lose the next elections, due in a couple of months, unless he retaliated. Sending India to war was a small price.

Soon after India’s intrusion, Pakistan closed its airspace. Tension at the border rose significantly, and remains high.

A day later, Pakistan attempted airstrikes in India. In the ensuing engagement, one of India’s MiG-21s — planes known as flying coffins because they are so old and outdated — was shot down by a Pakistani missile. The Indian pilot parachuted into Pakistani territory. India claimed to have downed a Pakistani F-16; Pakistan denied the claim.

TV stations in both countries were singing songs about the valor of their troops, who are mostly uneducated rural people with no other job opportunities and no clue what they are fighting for. These troops act as gladiators for the spectacle of the bored, TV-watching masses, who feel vicariously brave while munching their chips. Of course, the social media warriors know that it is not they who would be at the frontlines in any serious conflict.

It became very clear that the Indian Prime Minister, Narendra Modi, would lose the upcoming elections unless he retaliated. Sending India to war was a small price.

It is not in the culture of the Third World masses to feel peace and happiness. Either they are slogging away in the field and going to sleep a bit hungry — which helps to keep them focused and sane — or, if they have time on their hands, they become hedonistic and graduate to deriving pleasure from destructive activities. The latter becomes apparent as soon as they have enough to eat. This feedback system in their culture applies entropic force to take them back to Malthusian equilibrium.

Pakistan’s raison d’être is to obsess over Kashmir and the human rights violations the Indian army inflicts there, oblivious to the much worse tyranny inflicted by its own army and fanatics, particularly in Baluchistan. Once Pakistan’s social media had put the people into a trance of war, officials had no option but to retaliate.

Both armies are thoroughly incompetent and disorganized, and extremely corrupt. (Troops in India actually double up as house-servants of their bosses — something that would be inconceivable to a well-organized and truly nationalistic body of soldiers.) The tribal societies of Pakistan and India merely posture; they have no courage to go into a real war. But alas! Posturing can become reality.

On this occasion, threats of nuclear bombing were made. The bombs would probably have failed to explode, but it was obvious that the United States could not be a bystander. Despite the fact that Trump was busy in Vietnam with another nuclear-armed country, North Korea, he had to make a few calls. He had to interfere, as an adult does when two kids are fighting. Those of us who complain — quite rightly — about the US military-industrial complex should consider the unseen, unrecognized good that the US does in helping to avoid a nuclear holocaust.

Once Pakistan’s social media had put the people into a trance of war, officials had no option but to retaliate.

The cause of such a war — the stated point of contention between India and Pakistan — is Kashmir. Both want Kashmir. And, just to complicate things, some Kashmiris want full independence. But it must be said: the approach of everyone involved is grossly stupid.

Kashmir (including Jammu, “the gateway to Kashmir”) has a GDP of US $22 billion. It has only 1% of India’s population, but it gets 10% of federal grants. India’s defense budget is US $52 billion, with Kashmir the primary justification; and because of Kashmir, large additional sums are spent on internal security, including on the 500,000 Indian troops positioned there.

Kashmir is a bottomless pit for India, and the money does no good for Kashmir, either. Kashmir must exist under the tyranny of terrorists and of Indian forces, who under the law do not face accountability in the courts. Kashmir has no resources of value or any economy of substance; its populace is inward-looking and fanatic. There is no reason for India not to kick Kashmir out of the federation.

Pakistan, with a fraction of India’s economy, spends money comparable to India’s to try to take over Kashmir, occupy the one-third of Kashmir it holds right now, train terrorists, and, as a consequence, destroy itself economically and socially. Were Kashmir to join Pakistan, it would offer only negative value, dragging down Pakistan’s per capita GDP. There is no rational reason for Pakistan to accept Kashmir, let alone fight for it.

Threats of nuclear bombing were made. The bombs would probably have failed to explode, but it was obvious that the United States could not be a bystander.

Kashmir as an independent country would be landlocked and not much different from Afghanistan. No sane Kashmiri would want to be independent from India. Although India is backward and wallows in poverty and tyranny, in relative terms it is the best hope for Kashmir. Moreover, Ahmadi Muslims who went to Pakistan after the separation of 1947 are deemed non-Muslims by mainstream Pakistanis and by Pakistan’s constitution. The same fate awaits Kashmiris if they join Pakistan.

In a sane world, there is nothing to negotiate. As you can see above, I could be on any of the three sides of the negotiating table and accept the demands of the other two without asking for anything in return. Unfortunately, my compromises would not be seen as such. In keeping with Third World proclivities, they would be seen as signs of weakness, and new demands would soon be made, ceaselessly generated by superstition, ego, expediency, tribalism, and emotion. This, not Kashmir, is the primary problem, and this is the reason why there is no solution, ever.

Muslims are not the only culprits — it is merely that talking about them post-9/11 is politically more acceptable. In the Indian subcontinent, the Middle East, and Africa, Christians, Hindus, Buddhists, and Muslims are all caught up in the cycle of tyranny and irrationality. If Islam comes across as worse, it is mostly because in these places it has institutionalized irrationality, fed on it, and been self-victimized by it.

Kashmir is a bottomless pit for India, and the money does no good for Kashmir, either.

Since the inclusion of the sharia in Pakistan’s constitution in the 1980s, Pakistan, which was until then richer than India on a per capita basis, has taken a rapid slide downwards. Today, freedom of speech is so constrained that any accusation of having said a word against the “holy” book or the army can result in capital punishment — if, that is, one avoids getting lynched before reaching the courts.

A Christian woman, Asia Bibi, was sentenced to death in Pakistan in 2010 for the crime of drinking water from a cup reserved for Muslims. After nearly a decade in prison, she was released — not because the supreme court saw the case as utterly stupid, which it should have, but because it did not see clear proof that she had committed the “crime.” Pakistan erupted in civil chaos as millions walked the streets, demanding her blood. On my totem pole of values and consequences, Pakistan is 25 years ahead of India in self-destruction.

I arrived in India last week. Corruption these days hits me soon after I land. It has now become customary for the toilet-caretaker at the airport to demand a tip. With his dirty hands he offers tissue paper to me and tries to make me feel guilty if I don’t accept it.

In the Indian subcontinent, the Middle East, and Africa, Christians, Hindus, Buddhists, and Muslims are all caught up in the cycle of tyranny and irrationality.

The Indian government has tried to control corruption through the demonetization of 86% of its currency in 2016 and the imposition of a nationwide sales tax a year later. While these measures haven’t controlled corruption, they have managed to seriously harm the economy by destroying the informal sector, which employs 82% of Indians. And without the informal sector, the formal sector will falter.

Financial corruption is not even the real problem. Were bribery to stop, India would rapidly become North Korea or Eritrea. I say that because financial corruption is a necessary safety valve in overregulated societies. When such backward societies do manage to control bribery in isolation, they create extremely suffocating environments. North Korea and Eritrea have actually controlled bribery by getting their citizens to snitch on one another and by imposing extraordinary levels of punishment. Backward societies like these are necessarily subdued and stagnant, lack of skills being the real reason for their backwardness; and the lack of the safety valve of bribery constricts whatever potential they have. But financial corruption, a symptomatic problem, is seen as the prime problem by politically correct kids who go to study at Ivy League colleges and then to work for the IMF, the World Bank, etc., without real-life experience. They see financial corruption being removed from one place, only to find it reappearing in another; they don’t understand what is happening.

India is an ocean of corruption, but it’s not just financial. More importantly, it’s cultural. The real corruption is cultural irrationality, the irrationality of people who operate not through honesty, pride, compassion, or honor, but through expediency. Trying to control bribery in such societies does not work, because bribes are just a part of the whole package of social corruption and irrationality.

Financial corruption is a necessary safety valve in overregulated societies. When such backward societies do manage to control bribery in isolation, they create extremely suffocating environments.

As the economy has grown, India has been on a path to increased fanaticism and violent nationalism. These days, if you are found to be in possession of beef, you risk getting lynched. Nationalism is rising, rather rapidly. You are forced to stand for the national anthem before the start of movies in cinema halls. Complaining about the Prime Minister on social media can land you in prison; opposing his policies can get you beaten up. India’s constitution remains secular, but the country is trending in the same direction Pakistan has gone.

The World Bank, the IMF, etc. continue to report that India is among the fastest growing economies in the world, perhaps growing even faster than China. These numbers are completely erroneous; but even if they weren’t, the Indian subcontinent has been institutionally rudderless since the British left. All economic growth since so-called independence has come from the importation of Western technology.

But what about the fact that India has one of the largest numbers of engineers and PhDs in the world? It is easy to get a degree without studying — and not just in India — and the results are obvious. In the age of the internet, when a competent engineer can work remotely for a Western client, Indian “engineers” work as taxi drivers, deliver Amazon products, or get jobs as janitors. Their degrees are just degrees on paper.

India has been on a path to increased fanaticism and violent nationalism. These days, if you are found to be in possession of beef, you risk getting lynched.

Moreover, education is a tool; so is technology. They must be employed by reason. Without reason, “education” and technology serve the wrong masters: tribalism and superstitions. No wonder that with increasing prosperity, “educational” achievement, and better technology, India is regressing culturally.

India is massively lacking in skills. As I write, sitting in India today, I ask my maid, who will soon enroll in university, not to put the dusty carpet on my bed. But I must remind her of this every day. She struggles to write her own name. Very simple algebra is beyond her grasp. Her case might be an extreme one, but most Indians are completely unprepared for the modern economy. This is the reason you hardly see anything in Western markets that is made in India, despite India’s having more than one-sixth of the world’s population. It is virtually impossible to form a company of five people in India and expect it to work with any kind of efficiency.

People often blame China for copying Western technology. While that is true, one must recognize that copying takes a certain amount of skill that people in some other economies simply don’t have. India’s situation has worsened as the best Indians now increasingly prefer to leave for greener pastures, including even Papua New Guinea. Lacking leadership, post-British India is rapidly becoming tribal, fanatic, and nationalistic. We must remember that India holds together as a union only because of inertia from the days of the British. When the inertia is gone, India will fall into tribal units, as will Pakistan and much of the rest of the Third World.

Without reason, “education” and technology serve the wrong masters: tribalism and superstitions.

A horrible war will one day break out between India and Pakistan. It will not be because of Kashmir, which is just an excuse, but because irrational people always blame, envy, and hate others. They fail to negotiate. They have no valor, but constant posturing will eventually trigger something. There is no solution to their problems. Every problem that the British left behind has simmered and gotten worse.

As soon as India reaches the stage where imported technology can no longer drive economic growth, its cultural decline will become rapidly visible. Though India is 25 years behind Pakistan, both are walking toward self-destruction, toward a tribal, medieval past.

As for the US, the job of any rational US president is to help ensure that destruction stays within the borders of India and Pakistan.






The Economics of Compassion


America’s elite class would argue that the US economic policies of the past half century have been, overall, quite successful. They produced strong economic growth while supporting the ever-expanding role of governments (federal, state, and local) in American society. Adhering to an ideology of compassion adopted by the Democratic party of the Great Society era, these policies reeked of concern for the poor, the marginalized, the little guy — often including poor, marginalized little guys in other countries. Yet, even under the enormous fiscal drag of their profligate benevolence, the US economy performed as desired. Consumption and corporate profits soared, and GDP growth tripled. Our rulers no doubt admire their work.

But admiration from American workers should be withheld. For only the expression of compassion has reached the American working class. The revealed compassion for the average worker has been wage stagnation, and for many millions of displaced workers and their families immeasurable harm. Until 1973, wage rates rose with productivity gains; workers were rewarded for their efforts. From 1973 to the present, however, while worker productivity increased 77%, wages for the average worker increased only 12.4%. Gains for the top 1% of wage earners exceeded 150% during this period; those of the bottom 90% increased by a meager 21%. The reason, explains the Economic Policy Institute: “the fruits of their labors have primarily accrued to those at the top and to corporate profits.”

Prior to 1973, those at the top (let’s call them the old aristocracy) paid workers a wage dictated by the limited supply of American workers. Ironically, the old aristocracy had few, if any, compassionate policies toward its workers. In those days, compassion was the responsibility of individuals, families, churches, local charities, and community organizations. Nevertheless, wages kept pace with productivity gains. By 1973, those at the top (let’s call them the new aristocracy) had shifted the responsibility for compassion to government and large corporations. Through War on Poverty programs, hiring quotas for women and minorities, education and welfare reforms, and many other measures, and through the decline of labor unions, the new aristocracy emerged as the champion of the working class — while stanching its wage increases. It began paying workers a wage dictated by a large supply of foreign labor. Outsourcing jobs to, and importing labor from, Third World nations suppressed the wages of American workers and threw millions of them into chronic unemployment.

From 1973 to the present, while worker productivity increased 77%, wages for the average worker increased only 12.4%.

After 50 years of ignoring wage stagnation and its effect on American workers and American families, the champions are prepared not only to ignore its continuation but also to ignore the ongoing, frenetic wave of automation. Even as numerous studies predict that by 2030 automation could reduce the demand for US workers by tens of millions, the new aristocracy pursues policies (e.g., higher fertility and more immigration) to increase the supply. It would be hard to find a better prescription for suppressing wage rates and widening the income inequality gap. A study by the Council on Foreign Relations, hardly an anti-immigration or anti-globalist group, warns that “automation and artificial intelligence (AI) are likely to exacerbate inequality and leave more Americans behind.”

Leaving Americans behind is a core economic principle of the new aristocracy. In The Revolt of the Elites and the Betrayal of Democracy (1995), liberal historian Christopher Lasch described the new aristocracy as a globalist, professional-managerial class that is cosmopolitan in its world view and that, unlike the aristocracy it replaced, holds only a weak sense of civic responsibility to its local and regional communities. Happy to internationalize the division of labor, it follows policies that have diminished middle-income America and condemned low-income America to a permanent lower class. Lasch lamented its rise to power, chastising its members for “turning their back on the heartland and cultivating ties with the international market in fast-moving money, glamour, fashion and popular culture. It is a question whether they think of themselves as American at all. . . . Theirs is essentially a tourist view of the world.”

This tourist view has shaped economic policies that prioritize the growth of GDP over the welfare of those who produce it, including the welfare of their families and communities and of American society. The new aristocracy professes its concern for American workers but treats them with disdain. In his 2012 book Coming Apart, libertarian Charles Murray discussed what he called the New American Divide, in which the common civic culture once maintained by the old aristocracy has been unraveled by the new aristocracy. In Murray’s account, one side of the divide lives in upper-middle-class suburbs, statistically represented by a fictitious neighborhood called Belmont. Its inhabitants have “advanced educations, often obtained at elite schools, sharing tastes and preferences that set them apart from mainstream America.” Its most powerful residents, our new aristocracy, run the country: “they are responsible for the films and television shows you watch, the news you see and read, the fortunes of the nation's corporations and financial institutions, and the jurisprudence, legislation and regulations produced by government.”

This tourist view has shaped economic policies that prioritize the growth of GDP over the welfare of those who produce it.

In contrast, the fictitious neighborhood of Fishtown represents working class America. Its inhabitants “have no academic degree higher than a high-school diploma.” They hold blue-collar jobs, low-skill service jobs, or low-skill white-collar jobs, if they work at all; the work ethic, along with the institutions of marriage and religion, plummets. Of the unraveling, Murray writes:

If you were an executive living in Belmont in 1960, income inequality would have separated you from the construction worker in Fishtown, but remarkably little cultural inequality. You lived a more expensive life, but not a much different life. . . . Your house might have had a den that the construction worker's lacked, but it had no StairMaster or lap pool, nor any gadget to monitor your percentage of body fat. You both drank Bud, Miller, Schlitz or Pabst, and the phrase "boutique beer" never crossed your lips. You probably both smoked. If you didn't, you did not glare contemptuously at people who did.

Little has changed since the publication of Coming Apart. If anything, the new aristocracy’s sense of civic responsibility has weakened. It glares even more contemptuously at Fishtown.

In his article “The Working Hypothesis,” Oren Cass asks, “What if people’s ability to produce matters more than how much they can consume?” and offers the hypothesis “that a labor market in which workers can support strong families and communities is the central determinant of long-term prosperity and should be the central focus of public policy.” Policies that have catered to “marginalized” identity groups and the cheap-labor demands of corporate America have failed to bring true prosperity to mainstream America. Without meaningful work at a wage that rewards the worker with dignity and respect, families suffer (if they are formed at all) and communities crumble. Who would marry a man who could not find a job, or one whose wages are so low that it’s not worth building a robot to replace him? Who would want to live in a decaying community overrun with unemployed and unmarried men, pacified by drugs and videogames, and unencumbered by civic responsibility? Writes Cass, “In a community where dependency is widespread, illegality a viable career path, and idleness an acceptable lifestyle, the full-time worker begins to look less admirable — and more like a chump.”

Yet this is the society that has emerged from the policies of the new aristocracy — an unraveled, divided culture in which any desire to rehabilitate the citizens of Fishtown has long since left Belmont. Equally shameful, it is a society that can find no productive use for tens of millions of its working-age adults. These citizens constitute an immense, chronically unemployed underclass that has been omitted from the political arithmetic of GDP growth because they have been deemed unsuitable for work: criminals, alcoholics, the homeless, the disabled, the suicidal, not to mention the victims of welfare dependence, family disintegration, and opioid addiction. Not that many members of this forlorn cohort are without responsibility for their predicament, but there are effectively no procedures to redeem their productive value. It makes better economic sense to replace them with clever machines and cost-effective foreign labor.

Who would marry a man who could not find a job, or one whose wages are so low that it’s not worth building a robot to replace him?

Indeed, mere replacement may not be enough. In a tasteless satire called “Only Mass Deportation Can Save America,” New York Times opinion columnist Bret Stephens recommends that to spur GDP growth “complacent, entitled and often shockingly ignorant” Americans should be deported. Why go through the trouble of making underclass Americans better citizens? Simply ship them out and order “new and better” upgrades: immigrants from the Third World. America’s criminals, academic underachievers, the unhealthy, infertile couples, and shockingly ignorant mothers (of out-of-wedlock children) are contemptible enough, laments Stephens, to jeopardize his deportation plan. “O.K.,” he comments, “so I’m jesting about deporting ‘real Americans’ en masse. (Who would take them in, anyway?)”

But aristocratic preference for younger, more fertile, harder-working foreign labor could fade — as new and better immigrants are replaced by new and better machines. According to a recent Pew Research report, immigrants will constitute 100% of the increase in the US labor force between now and 2030. To a very large extent, however, the jobs that immigrants perform are the very jobs that, by 2030, automation will eliminate. For example, a 2016 Obama administration study found that automation-induced job destruction will be “highly concentrated among lower-paid, lower-skilled, and less-educated workers,” noting that “83 percent of jobs making less than $20 per hour would come under pressure from automation.” And the elite upper class will come under pressure to express its compassion for the millions of hastily invited immigrants who will then be herded into the underclass.

The idea that meaningful work might be important to the worker, and to American society, has escaped the new aristocracy, especially its members who inhabit America’s centers of economic power — hulking citadels for millionaires, hipsters, and tourists. In terms of consumption and GDP, they are the only cities that matter. Yet these cities are smothered by dense low-income populations, immigrant and native-born, and their diverse miseries. The new aristocracy is as oblivious to these miseries as it is to life in America’s heartland. It is not concerned that the economy that has enriched it is headed toward a lopsided state in which the number of unemployed exceeds the number of employed — the underclass outnumbering the chump class. Nor is it concerned that its policies have produced citadels such as San Francisco, where homelessness is rampant, “poop patrols” clean human feces from the sidewalks, and injection drug addicts outnumber high school students.

Aristocratic preference for younger, more fertile, harder-working foreign labor could fade — as new and better immigrants are replaced by new and better machines.

The new aristocracy should worry that working-class America will discover the hoax of liberal compassion. American gilets jaunes might take to the streets of Washington DC, the wellspring of policies that have relegated the working class to what French writer Christophe Guilluy would call “peripheral America.” In Guilluy’s view, while the globalist economic model produces much wealth, “it doesn’t need the majority of the population to function. It has no real need for the manual workers, labourers and even small-business owners outside of the big cities.” In France, as in America, the model is embraced by “celebrities, actors, the media and the intellectuals,” who are unconnected with life outside New York, Boston, Chicago, Los Angeles, and other economic juggernauts.

Chic liberal thinkers see no downside to the ongoing wave of automation. New compassionate policies, they assume, will surely be developed to help the many millions of workers — citizens of Fishtown — whose jobs will be eliminated. Displaced workers will be retrained; they will go back to school; and surely this group will do better than the shiftless, slothful underclass that has already been left behind. But the new aristocracy, warns Guilluy, “needs a cultural revolution, particularly in universities and in the media. They need to stop insulting the working class, to stop thinking of all the gilets jaunes as imbeciles.” America’s imbeciles should heed Lasch’s warning that, with the liberal elite, “compassion has become the human face of contempt.”






The Second Machine Age and Worker Creativity


The United States is the most advanced industrialized nation in the world. Its economy produces the largest GDP (total output of goods and services) and does so with among the highest productivity (GDP per worker). Rapid productivity growth through technological advance has been, and continues to be, the story of American prosperity.

The present wave of technological progress (aka automation, which includes advances in software, semiconductor devices, artificial intelligence, and robotics) is accelerating the pace of productivity increase. Automation is a tremendous gift that allows the US economy to produce more goods and services with less human labor — but it also means that millions of workers will be displaced. It is incomprehensible, therefore, that the solution to a shrinking demand for workers is, apparently, to increase the supply.

As early as 2030, 39 to 73 million jobs could be eliminated by automation.

This shrinkage became evident at the turn of the century, when job growth, particularly among lower-paid, lower-skilled, and less-educated workers, fell into stagnation. According to MIT researchers Erik Brynjolfsson and Andrew McAfee, US job growth has now become decoupled from US productivity growth. They believe that productivity will keep increasing relentlessly, as it has since the end of WWII, but that US employment, which had risen with productivity until 2000, will remain flat, as it has since 2000. The authors attribute the Great Decoupling to the rapidly declining price of digital labor relative to human labor. The dramatic economic and cultural implications of such automation are examined in their bestselling book, The Second Machine Age.

A recent McKinsey Global Institute study estimates that, by as early as 2030, 39 million to 73 million US jobs could be eliminated by automation. There are numerous other studies with similarly ominous predictions. The jobs that are most susceptible to automation are ones that involve routine procedures that can be performed by computerized algorithms. These include machinery operation, vehicle operation, fast food preparation, data collection and processing, financial services, and customer services. Well-paying jobs that can be obtained today with only a high school education will be the first to go. The jobs that will survive, and thrive, are those involving cognitive tasks, favoring occupations where skill and education are important. They will include engineers, scientists, medical professionals, and teachers. Luckily, they will include jobs for those who apply high cognitive ability and skill to actual work, such as carpenters, electricians, mechanics, plumbers, and welders.

Future jobs, the jobs that will be created by the Second Machine Age, will no doubt favor those educated in the STEM (Science, Technology, Engineering, and Mathematics) disciplines. Beyond these job types, it’s anyone’s guess. In a Harvard Business Review interview, Brynjolfsson and McAfee discuss three job categories that will survive automation. The first is “high-end creativity that generates things like great new business ideas, scientific breakthroughs, novels that grip you.” The second is deciphering body language: people who excel at “emotion, interpersonal relations, caring, nurturing, coaching, motivating, leading, and so on” will be in demand. Dexterity and mobility comprise the third category, because “it’s unbelievably hard to get a robot to walk across a crowded restaurant, bus a table, take the dishes back into the kitchen.”

Well-paying jobs that can be obtained today with only a high school education will be the first to go.

But it’s a pretty good guess that the jobs-of-the-future market will be much smaller than the markets that followed previous waves of technological advance. The number of high-end creativity jobs will be about the same. That is, the number of elites who currently hold such lofty positions is not likely to grow; even if it did — say, doubling — it would still be a statistically insignificant portion of the labor force. Similarly, it’s not likely that we will need a greater number of body language decipherers than we need today. Robot controllers will obviously be needed (e.g., to control fleets of driverless trucks and flocks of pilotless drones). And, of course, the demand for robot repairmen will skyrocket.

Elsewhere, things look bleak. Robots that serve up to 360 (basic to gourmet) sandwiches per hour are looming, and they don’t have to navigate crowded restaurants. Machine operator jobs will no doubt survive; after all, the machines of the future will need human operators. So the guy who ran the old machine that replaced ten workers will be retrained to operate the new machine that replaces a thousand workers. And even in occupations that experience an increase, the gains will be short-lived. For example, AI will eventually replace the vaunted robot controllers. Technological advance is simply destroying jobs faster than it is creating them.

Why is it, then, that with a rapidly shrinking demand for labor, we need more workers? In an era of stagnant wage growth, an increase in the labor supply can only cause a decrease in wage rates. The standard answer is that the US economy must be infused with millions of young workers to support America’s aging population in retirement. That is, the working age tax base must be expanded to keep our intergenerational Ponzi schemes (aka Social Security and Medicare) from going broke. America’s declining birth rate, warns Lyman Stone, of the American Enterprise Institute (AEI), “will likely have far-reaching negative economic consequences.” To stave off insolvency, Mr. Stone proposes a “fertility dividend” policy, where the benefits paid to retirees would be based, in part, on the payroll tax contributions of their children. Without some such scheme, “our long-term obligations will have to be financed with substantially fewer people (or, perhaps, substantially more immigrants).”

Technological advance is simply destroying jobs faster than it is creating them.

Our political leaders prefer the latter option: substantially more immigrants, including about 20 million working-age adults projected to enter the US by 2030. It’s the only way to increase US GDP growth, say Republican stalwarts such as Paul Ryan, Marco Rubio, Jeb Bush, and Larry Kudlow, and every Democrat worth his open-borders salt. As to the jobs of the future, education reform is their answer. They’ll simply transform the system that has for many decades failed many tens of millions of working-class Americans into a system that prepares students and retrains displaced workers for the millions of yet-to-be-determined jobs that they hope will eventually materialize.

There is growing evidence that, over the next decade or two, tens of millions of jobs will be eliminated through automation. Many experts believe that technology has advanced to the point where an advanced industrialized nation can produce economic abundance with a paucity of human labor. Given this possibility, what responsible policymaker can conclude that the US economy needs more fertility and more immigration? Is he willing to bet that tens of millions of new jobs of the future will be created — jobs that will be accessible to working-class America? Or that our education system can be reformed to prepare students and retrain workers for jobs whose only currently known characteristic is that they will generally require greater technical knowledge, cognitive ability, and creativity than the jobs that they will eliminate?

Policymakers’ time would be better spent in contemplation of how American society can benefit from the prosperity that, thanks to automation, can be produced by a smaller and smaller labor force. For there is no evidence that a shrinking population is inimical to GDP growth in advanced industrialized nations. A recent study, “Secular Stagnation? The Effect of Aging on Economic Growth in the Age of Automation,” published in the American Economic Review, found “no negative association between aging and lower GDP per capita.” As one might expect, advanced industrialized economies with aging populations are the most likely to adopt automation. Indeed, “when capital is sufficiently abundant, a shortage of younger and middle-aged workers can trigger so much more adoption of new automation technologies that the negative effects of labor scarcity could be completely neutralized or even reversed.”

Is any reasonable policymaker willing to bet that tens of millions of new jobs of the future will be created — jobs that will be accessible to working-class America?

In the US, where capital is abundant, automation relentlessly propels GDP growth. If the projections of the McKinsey Global Institute et al. are accurate, then by 2030 the US will produce more GDP with as few as 118 million, possibly as few as 84 million, workers — a truly remarkable feat of efficiency. The problem will be that the US working-age population will then number some 225 million adults. Those who clamor for more fertility and more immigration should explain their plan for the 39 to 73 million who will be waiting for the new jobs to open.
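
For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python. The only number not given above is the implied baseline labor force of roughly 157 million (since 118 + 39 = 157 and 84 + 73 = 157); treat it as an assumption, not an official statistic.

```python
# Back-of-envelope check of the figures above (all values in millions of people).
# Assumption: a baseline US labor force of about 157 million, the figure
# implied by the quoted ranges (118 + 39 = 157 and 84 + 73 = 157).
labor_force_today = 157

# McKinsey Global Institute range of jobs eliminated by automation by 2030.
jobs_lost_low, jobs_lost_high = 39, 73

# Workers still employed in 2030 under each scenario.
workers_best_case = labor_force_today - jobs_lost_low    # 118 million
workers_worst_case = labor_force_today - jobs_lost_high  # 84 million

# Working-age adults projected for 2030, per the article.
working_age_2030 = 225

print(f"Workers employed in 2030: {workers_worst_case}-{workers_best_case} million")
print(f"Working-age adults in 2030: {working_age_2030} million")
print(f"Waiting for new jobs to open: {jobs_lost_low}-{jobs_lost_high} million")
```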

It had better be creative. At minimum it should include stipends for interim living expenses. One hopes it will include dexterity and mobility training. Nothing will be more important to tens of millions of former truck drivers, burger flippers, and farm workers than being defter than a robot. And those who are not drawn to body language deciphering should bone up on their STEM skills.

Finally, no plan would be complete without staunch support for our best and brightest. These include many millions of students and displaced workers who will settle for nothing less than the envied high-end creativity jobs — the high-paying ones where you get to work on scientific breakthroughs and novels that grip.






Pachacuti’s Revenge


“Check your premises!”

Ayn Rand’s admonition still rings true, in spite of her abrasive personality and cultish intellectual heirs. The admonition, and the name of her philosophy, Objectivism, encapsulate the core of her obsession: epistemology.

A few years ago I had the good fortune of landing a job as a consulting anthropologist for a pair of Canadian Cree Indians, Wendy Bigcharles and Laurie Gauchier, who were filming a documentary about traversing — on foot — the entire Royal Inca Road, a prehistoric engineering marvel stretching well over 3,000 miles from Quito, Ecuador to Santiago, Chile. The work forced me to examine some of my premises in ways I’d never considered. The project’s sponsors included the Canadian Broadcasting Corporation and Statoil, the Norwegian state oil company.

The challenge, for a rational empiricist — me — was to explain belief systems without offending believers or pulling my punches.

I first met Laurie at the 14,000-foot mountaineers’ base camp on Aconcagua in Argentina. He was pursuing his attempt to climb the Seven Summits, the highest peaks on all the continents. He’d become a hero to thousands of First Nations children and toured Canada’s Native Reserves, lecturing and inspiring. He liked my understanding of aboriginal cultures. More importantly, we shared a libertarian outlook and a similar sense of humor, laughing a lot together.

The Crees’ novel conceit was to explore aboriginal attitudes toward, and beliefs about, the earth, in hopes that these might have modern applications for conservation and sustainability. My job, among many other tasks, was to provide context for the project. The challenge, for a rational empiricist — me — was to explain belief systems (inevitably including the evolution of religion) without offending believers or pulling my punches. Ideally, I wanted to propose a paradigm everyone could enthusiastically embrace. What follows is my effort. Did I succeed? You be the judge.

The Empire Strikes Back

In 2001 Alejandro Toledo legitimately became the first Native American head of state in the Andes since Huascar Inca was murdered by his half-brother, Atahualpa, in 1532. The Stanford-trained economist proved honest and competent. By continuing the sound economic policies instituted under the previous Fujimori regime, he provided Peru with high growth and low inflation. However, for some unknown reason — perhaps his lack of personal pizzazz — he was dismally unpopular and declined to run again.

The 2006 elections pitted Ollanta Humala, another Native American — this time of a leftist, populist bent — against Alan Garcia, a disastrous ex-president, previously ideologically indistinguishable from his opponent but newly converted to liberal, free-market policies. Garcia narrowly defeated Humala and, against all expectations, stuck to his promises. In 2009, Peruvian economic growth proved to be the third highest in the world, after China and India. For the 2011 elections, Humala moved to the center-left with a promise of continuity; even Mario Vargas Llosa, the Nobel Prize-winning, libertarian-leaning writer and one-time presidential candidate, endorsed him. For the time being, economics trumped ethnicity, with the percentage of those in poverty dropping from 55% in 2001 to 21% in 2016.

For some unknown reason — perhaps his lack of personal pizzazz — Toledo was dismally unpopular and declined to run again.

In next-door Bolivia, Juan Evo Morales, an Aymara Indian and head of the indigenous coca growers union, was elected president in 2006. Morales never finished high school. To date, his rule has been marked by incompetence, confrontation, expropriation, power grabs, and the near destruction of the republic. Local councils, little more than mobs, bypass the courts and mete out justice with gasoline-filled tire-collars. Yet he retains a measure of popularity.

The Inca heartland is in the throes of a native revival. Though Ecuador hasn’t yet elected a native head of state, the Indians have gone on the warpath. In 2000, the Confederation of Indigenous Nationalities of Ecuador, in alliance with junior military officers, overthrew President Jamil Mahuad and formed a new political party named after Pachacuti Inca Yupanqui (1438–1471), ninth emperor of the Inca Empire. Pachacuti is still revered as pre-Columbian America’s most accomplished leader, combining the military prowess of Alexander the Great with the sagacity of the Roman Emperor Justinian.

Tahuantinsuyu, as the Inca Empire called itself, did not go gently into that good night. Though Francisco Pizarro’s band seemed to make quick work of the conquest — mostly through luck, disease, and the confusion of an Inca civil war of succession — serious resistance to the new order continued for nearly 250 years. The last Inca rebellion, led by Tupac Amaru II, culminated in the siege of Cusco in 1781 and the execution of the last Inca pretender. Ironically, the only firsthand account of the Spanish conquest and its initial aftermath was written by an Inca lord, Titu Cusi Yupanqui; it is still in print.

Evo Morales' rule has been marked by incompetence, confrontation, expropriation, power grabs, and the near destruction of the republic. Yet he retains a measure of popularity.

Native Andean resurgence has not been limited to politics or, for that matter, to South America. Over the past 20 years a quiet and low-key enthusiasm for Andean epistemology has burgeoned in Europe, North America, and South Africa, gaining prominence through the teachings of the Cusco school. It has been elaborated by Americo Yabar, a paqo (or, loosely, a mystic-cum-management consultant); by the writings of Joan Wilcox and Diane Dunn, now a paqo herself; by anthropologists Inge Bolin and Catherine J. Allen; and by Oakley Gordon, a psychologist who has tried to reconcile Yabar’s teachings with non-Andean epistemologies. Not that, from a certain perspective, there is any conflict — after all, if “Western” epistemology can accommodate chaos theory and postmodernist deconstructionism, it is truly an all-inclusive tent.

Epistemology?!

Nothing induces instant boredom in a reader more acutely than a word like epistemology in the title or first sentence of a casual read. And well it might. It’s not an easy concept to grasp, and is at least one or two categories of abstraction away from concrete reality. Epistemology is not what most people would consider as fun or relaxing as, say, Canadian politics or Chinese opera.

Webster’s Third New International Dictionary eye-glazingly defines it as “the study of knowledge systems,” or “the theory of knowledge.” Dr. Oakley Gordon elaborates (and I paraphrase slightly) that it encompasses not only the formal fields of science, philosophy (of which it is also a subcategory), and religion, but also “patterns of assumptions, beliefs and behaviors” that are tacit, subconscious, or otherwise unexamined. Simply put, epistemology is the study of how we know what we know.

If “Western” epistemology can accommodate chaos theory and postmodernist deconstructionism, it is truly an all-inclusive tent.

The word is cobbled together by academics from the Greek “episteme” (knowledge) and “ology” (study of) because they had no run-of-the-mill word at hand to describe the convoluted concept they’d come up with. So they coined a new word, modeled on words like geo-logy (study of earth), bio-logy (study of life), philo-logy (study [or love] of words), psych-ology (study of the mind) and many other words ending in “ology.”

What makes epistemology so much more difficult to grasp is that it is twice as abstract as these other “ologies.” Whereas most “ology” words denote the study of the concrete things that precede them, epistemology is the study of those studies — in other words, the hows and whys of what we know.

But epistemology can attain an even more rarefied and abstract level. According to the Oxford English Dictionary, it is also “the study of the validity of knowledge.” So judgments can be — and often are — part of this philosophical endeavor.

Hector Sizes Me Up

On our first trip together down to the Andes, one of my employers, Wendy, introduced me to Hector, a Peruvian labor organizer who had spent 13 years in a Peruvian prison during the Fujimori and Toledo regimes. While there, he had dedicated his time to studying the Inca road system, over 15,000 miles in its entirety (including the Royal Road, its main trunk). It has been described as the Inca’s “dialogue with the land.” Wendy characterized Hector as having “the sort of knowledge that can’t be put into a book.”

Well, to a professional writer and dyed-in-the-wool rational empiricist, that phrase made my neck hairs bristle and set my BS antennae quivering apoplectically. I couldn’t wait to engage this sage of the unwritten.

Nevertheless, when I first met Hector I was struck by his modesty and transparency. Figuring that full disclosure was the best strategy for establishing rapport, I leveled with him about what Wendy had told me. Hector looked at me thoughtfully and, after a minute’s reflection, said that he could size up a person’s character within minutes of meeting him or her — and that that sort of knowledge he could not convey in writing.

He had a point.

To a professional writer and dyed-in-the-wool rational empiricist, that phrase made my neck hairs bristle.

Still, though it might take thousands of words to describe the process one goes through when sizing someone up, it can be done. Malcolm Gladwell did it in his book Blink: The Power of Thinking Without Thinking (2005). And the skill is not uncommon. Research by Jefferson Duarte of Rice University suggests that one of a person’s most telling moral features, his creditworthiness — as well as his sexual orientation, socioeconomic status, teaching ability, and personality — can be seen in his face, often after only half a minute’s exposure to even a years-old photograph.

On the other hand, effectively teaching the art of successful snap judgments is much more difficult. But I digress. What Hector’s lesson suggested to me is that well-honed intuition is not fundamentally in conflict with rational empiricism; both are ways of grasping and understanding the world around us — both are the proper study of epistemology.

Not unrelated is kinesthetic knowledge — physical, as opposed to intellectual, knowledge. Remember learning to ride a bike or play a musical instrument? No matter how long you leave it, your torso, fingers, and lips never lose that knowledge.

What Hector’s lesson suggested to me is that well-honed intuition is not fundamentally in conflict with rational empiricism; both are ways of grasping and understanding the world around us.

Timothy Leary, the late guru of LSD, and Carlos Castaneda, author of The Teachings of Don Juan: A Yaqui Way of Knowledge, would expand epistemology’s territory even further — much further. They argued that the judicious use of psychotropic drugs yields an entirely different understanding of, and approach to, reality. For some people, psychoactive drugs have the unusual effect of suspending the perception of time. To these users, past and future disappear and all reality is concentrated into the here-and-now. This extraordinary sensation intensifies events experienced during the “trip” and alters the user’s perception of reality.

But Back to Chinese Opera

Have you ever read a poem or listened to a piece of music and had one of those “aha!” moments, when a light bulb goes off in your mind and suddenly your understanding changes? Art, literature — even music — are media that help shape our view of reality. These, too, are the proper territory of epistemology: the study of how such transcendental endeavors create new insights. The problem is, one man’s Chinese opera is another’s cat-scratching-on-a-blackboard. Why the difference?

Different people, brought up in different cultures and traditions, respond differently to different sensory stimuli. Exactly why this might be is not only the subject of anthropology and psychology but also, ultimately, of epistemology. Language is a case in point.

Benjamin Whorf, in Language, Thought, and Reality, argues that since much of our conscious thought is in the language(s) we speak, its conventions affect our perception and interpretation of reality. Modern English grammar with its built-in verb tenses inadvertently forces us to think in terms of past, present, and future. Mandarin Chinese, on the other hand, has no tenses; a Chinese speaker must consciously include a reference to time if he wants to convey tense. Perhaps that’s why Chinese has such an aphoristic inscrutability.

The judicious use of psychotropic drugs yields an entirely different understanding of, and approach to, reality.

Quechua grammar, like that of many indigenous American languages, has a construct that would benefit Anglo-Saxon jurisprudence and science no end: referential reliability. This requires the speaker to attach suffixes that clarify his relationship to the information being conveyed. While English has three basic tenses — past, present, and future — built into its verbs, Quechua has three levels of validation built into its grammar.

When the content of a sentence is directly experienced by the speaker, he uses the first level of reliability indicator, called a witness validator suffix. When conveying information learned secondhand, the speaker uses the hearsay validator suffix. Finally, when speculating without evidence, or when uncertain, the speaker employs the conjectural validator, indicating that the reliability of the information is a complete crapshoot.
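
For readers who think in data terms, the construct is easy to model. Here is a minimal sketch in Python — a hypothetical illustration, not a linguistic tool — of what it would mean to require every statement to carry one of the three validator levels just described:

```python
from dataclasses import dataclass
from enum import Enum

class Validator(Enum):
    """The three Quechua-style reliability levels described above."""
    WITNESS = "directly experienced by the speaker"
    HEARSAY = "learned secondhand"
    CONJECTURE = "speculation, without evidence"

@dataclass
class Statement:
    content: str
    validator: Validator  # every claim must declare how the speaker knows it

# Hypothetical examples of tagged claims.
claims = [
    Statement("The bridge over the gorge is out.", Validator.WITNESS),
    Statement("The next village has food for sale.", Validator.HEARSAY),
    Statement("It will rain tomorrow.", Validator.CONJECTURE),
]

for claim in claims:
    print(f"{claim.content!r} -- {claim.validator.name}: {claim.validator.value}")
```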

How might this structure affect a Quechua speaker’s perception of reality? How does it affect his social relationships and what he knows about people? Does our English grammatical structure, with its tenses, make us particularly vulnerable to the way psychedelic drugs stop time? Epistemology seeks to answer these questions and to explore many other, even more subtle relationships between the world, our senses, and our minds.

Have We Lost our Minds?

Just how does one actually do epistemology? Is it anything like doing biology or geology? Let’s take a (short) look at epistemological investigations into the nature of consciousness.

For centuries, philosophers, biologists, social scientists, bartenders, and every other sort of investigator of the human condition have been clawing away at that black box and, more importantly, at one of its central underlying assumptions, what is known as the mind-body problem: what is mind, and how does it emerge from physical and chemical processes?

Quechua grammar, like that of many indigenous American languages, has a construct that would benefit Anglo-Saxon jurisprudence and science no end.

The investigations have yielded few answers but have shed much light on the nature of the problem. On one hand, diehard advocates of mind still harbor faith in the existence of a “thing” that is mind and attribute our failure to unlock its secrets to insufficient hard work. At the other extreme are those who have thrown up their hands in frustration and declared that we’ve done all possible research, that we still haven’t found “mind,” and that therefore “mind” does not exist.

Into this quagmire jumped some researchers who decided that perhaps part of the problem lay in how we approached it. Traditional Victorian scientific technique attacked a complex problem by analyzing its many parts separately, gaining individual insights, and then summing the parts in order to understand the whole — a very productive methodology in most instances. In some cases, however, the results were a bit like the Indian parable of the blind men and the elephant: the particulars didn’t add up to the whole.

These investigators decided that perhaps it might be better not to analyze certain phenomena into their parts but rather to investigate complex entities — such as sentient beings — as wholes: that, in order to understand the nature of sentient beings, they should be approached not as mind-body dualities but as single entities — a monistic approach. Ironically, this “new” approach mimicked “primitive” animistic philosophies, according to which the interrelationship of everything is impossible to separate (a subject further investigated in the following subsection).

Zindler suggested that mind is better perceived as a process, a dynamic relation, an emergent property, and not a thing.

In the spirit of this new approach, Professor Frank R. Zindler — both a biologist and a linguist, late of the State University of New York — and others decided to take a fresh look at “mind.” They observed that, through pure historical accident, the word mind is grammatically a noun in all European languages. Because of our grammar and the hidden assumptions of our language, we tend to think of nouns as substantive “things” such as tree, table, or brick, despite many nouns being quite insubstantial, e.g., truth, beauty, and velocity.

Zindler suggested that because mind was a noun, it was conceived to be a thing. Because it was thought to be a thing, it was thought to have existence apart from the brain. Neurobiological studies offer no supporting evidence for these ideas. Rather, mind is better perceived as a process, a dynamic relation, an emergent property, and not a thing. If we change the processes of the brain, we change the mind. If nothing else, psychedelic drugs have taught us that fact.

For perspective, Zindler posits that to wonder what happens to the mind after the brain decays is as silly as asking where the 70-miles-per-hour have gone after a speeding auto has crashed into a tree:

Now that scientists recognize mind as a process rather than a thing, they are making rapid advances in understanding the specific brain dynamics that correspond to the various subjective states collectively known as mind.

Foremost among these advances is the discovery (at Goldsmiths College, London, and the University of Houston) that the previously mentioned “aha!” moments of insight are detectable with an electroencephalograph up to eight seconds before an individual’s mind is aware of them. More startling, Dr. Allan Snyder at the University of Sydney has been able to induce savant skills through repetitive transcranial magnetic stimulation. Clearly, epistemology is useful.

John Dewey, in the early 20th century, was one of the first Western philosophers to minimize the mind-body duality. He argued that philosophers’ obsession with creating a problem of the relation between the mind and the world was a mistake. He retorted that no one had ever made a problem of the relation between, for example, the hand and the world. As Louis Menand puts it in The Metaphysical Club:

The function of the hand is to help the organism cope with the environment; in situations in which a hand doesn’t work, we try something else, such as a foot, or a fishhook, or an editorial . . . They just use a hand where a hand will do. Dewey thought that ideas and beliefs are the same as hands: instruments for coping. An idea has no greater metaphysical stature than, say, a fork. When your fork proves inadequate to the task of eating soup, it makes little sense to argue about whether there is something inherent in the nature of forks . . . or soup that accounts for the failure. You just reach for a spoon.

Pachacuti vs. Plato: The Nature of Religious Faith

With modern man facing a slew of problems ranging — according to some — from overpopulation, pollution, and global warming to the recent economic-fiscal crises, many people are questioning our approach to, and use of, knowledge. That questioning has led to the recognition of a distinct “Western” epistemology as opposed to epistemologies that developed outside the modern tradition — epistemologies that bear examination in hopes that they might be mined for some nuggets of wisdom. This was Wendy and Laurie’s project.

“Aha!” moments of insight are detectable with an electroencephalograph up to eight seconds before an individual’s mind is aware of them.

Western, or modern, epistemology has its roots in a movement that emerged independently throughout centers of high culture all across the Old World in the first millennium BC. Robert N. Bellah, a sociologist of religion, has characterized it as “an extremely negative evaluation of man and society and the exaltation of another realm of reality as alone true and infinitely valuable.” This dualistic perception emerged in Greece with Plato’s classic formulation of a realm of the ideal that mirrors mundane reality; in Israel with the conception of a transcendent god in whom alone there is any comfort; in India with the Buddha as the only refuge from a world of chaos; in China with the Taoist admonition of withdrawal from human society.

It is hardly necessary to cite comparable Christian sentiments, while the Koran stresses that the life to come is infinitely superior (virgins or no virgins) to present reality. Even in Japan, usually so innocently world-accepting, Shotoku Taishi declared that the world is a lie and only the Buddha is true, and in the Kamakura period the conviction that the world is hell led to orgies of religious suicide by seekers after paradise.

This world-rejection zeitgeist is unlike anything that came before or after. Prior to this radical shift in perception, epistemology was concerned with the maintenance of personal, social, and cosmic harmony and with attaining the specific goods — rain, harvest, children, health — that men have always sought. Salvation in an afterlife was virtually absent; the intellectual focus was on getting along with the natural forces that ruled existence — the sun, the earth, water, wind, thunder, fertility, game, plants, etc. — in a word, animism.

Animism is a holistic approach that reveres natural phenomena without separating the whole into constituent parts — vital essences, soul, energy, their physical manifestations, or their subsystems — while still including them. It is a monistic perception as opposed to the later revolution’s dualism. Animism seeks integration while world rejection seeks transcendence.

This world-rejection zeitgeist is unlike anything that came before or after.

The revolution that took place in the first millennium BC, by creating a schism between the real world and an ideal reality, started an intellectual movement that helped humankind look at the world differently. If reality could be broken down into its constituent parts — real and ideal — couldn’t everything else? Might there be more than two parts? Now that natural phenomena could be analyzed into building blocks, they became much easier to understand and, ultimately, to control. It was the first step toward a philosophical reductionism that later culminated in Victorian science.

With the existence of an ideal world theorized, comparative analysis between the real and ideal worlds also opened the possibility of reform on this earth — of the self, society, government, technology, etc. — as separate entities. There was nothing that couldn’t bear some improvement. Even the ability to separate itself became a tool of reform. Power could be split into political-military and cultural-religious spheres, as could social classes and economic specialties. Doubtless, this new paradigm greatly aided mankind’s adaptation to growing population pressures worldwide by facilitating technological innovation.

Still, the revolution was not democratic. Control was still vested in authority: church and state. It took another revolution, some 2,000 years later, for epistemology to undergo a change as radical as the one world rejection had wrought — the Protestant Reformation.

The Protestant Reformation made man’s relationship with — and understanding of — reality much more democratic. Prior to the Reformation, only the religious elite had access to God’s revelation, the Bible. They, in turn, interpreted God’s word for the rest of society. In some European countries it was actually a crime to own a Bible in the vernacular.

There was nothing that couldn’t bear some improvement. Even the ability to separate itself became a tool of reform.

The second revolution that the Protestant Reformation wrought was simple: the concept that everyone had the right to own, read, and interpret God’s word for himself. This principle of individual autonomy and the devolution of ultimate authority spread to governmental and economic realms where it presaged the end of monarchy and the rise of capitalism. In the meta-sphere of epistemology, it was the beginning of the end for the earlier, world-rejection, revolution. Western epistemology diversified. It was an age of enlightenment — in science, philosophy, and even religious thought, where hundreds of new sects flowered.

Jerusalem, Rome, Mecca, Cusco

Whatever its benefits, the Protestant Reformation had little effect on the divide-and-conquer dualistic-reductionist approach that was still yielding so many insights. If anything, Victorian scientists were predicting mankind’s imminent omniscience. By first identifying the constituent parts of whatever was being studied (dualism), investigators could more easily analyze each part individually and finally re-assemble the lot into a complete explanation (reductionism).

However, there was a problem. With time, the dualistic-reductionist approach proved inadequate when faced with highly complex, integrated systems such as climate, ecosystems, economies, and living organisms. A new, more holistic approach was necessary because understanding constituent parts didn’t fully explain the whole.

Philosopher and ecologist Robert Ulanowicz has said that science must develop techniques to study the ways in which larger scales of organization influence smaller ones, and the ways in which feedback loops create structure at a given level, independently of details at a lower level of organization. In an attempt to address this difficulty, James Lovelock, an exobiologist for NASA, postulated the Gaia Hypothesis, named after the Greek goddess of Earth.

With time, the dualistic-reductionist approach proved inadequate when faced with highly complex, integrated systems such as climate, ecosystems, economies, and living organisms.

The Gaia Hypothesis proposes that the biosphere and physical components of the earth — atmosphere, cryosphere, hydrosphere, and lithosphere — are closely integrated to form a complex interacting system that maintains the climatic and biogeochemical conditions on earth in a preferred homeostasis.

Today we are in the throes of a new revolution. If a more holistic approach has been productive in studying the mind, perhaps other fields of inquiry might also benefit from a less dualistic-reductionist approach. The door has been opened for a re-evaluation of the animistic approach.

A New Age (literally) dawned. The democratization of religious belief, initiated so violently during the Reformation, flowered into a thousand blooms. Now that each person could find his own road to enlightenment, people scoured the world for alternative spiritual approaches, particularly holistic, natural phenomena-centered (as opposed to world-rejecting) beliefs.

In popular culture, that search probably began with the Beatles’ discovery and popularization of Transcendental Meditation as practiced by the Maharishi Mahesh Yogi. Though world-renouncing in its transcendentalism, it did not renounce worldly involvement or pleasures.

The democratization of religious belief, initiated so violently during the Reformation, flowered into a thousand blooms.

More recently, the search for a new approach has focused on indigenous American animist epistemologies, particularly Andean epistemologies — not only because of their geographical isolation from world-rejection creeds but also because of the sophistication and success of Inca civilization, a civilization that absorbed virtually its entire known world, maintained a burgeoning population more or less peacefully and without famine, and displayed a reverence for its environment that attracts many today.

A Formicable Society

If nothing else, anthropological theory has taught us that increasing population density tends to beget technological innovation in a continuous feedback loop but is always associated with increased governmental power, regimentation, and social stratification.

Inca society was the antithesis of a liberal ideal. Tax freedom day didn’t fall until September in most years; subjects worked for state and church two-thirds of the year. Even those too old, too poor, or too incapacitated to pay anything else had to render their tribute in lice — a token of participation that maintained the integrity and fairness of the system. There was no private property beyond one’s clothing, tools, guinea pigs, and such. Organized individual trade was not permitted; the government monopolized it. Nuclear family autarky was the rule, and the economy was rigidly centrally planned. Beyond the firstborn, a family’s children were at the disposal of the emperor and were sometimes subject to human sacrifice. Inca and subject populations were resettled at the whim of the emperor to discourage revolt. Inca society was the human equivalent of a bee or ant colony.

Food supplies, information, justice, and armies moved between what is today Colombia and southern Chile along a road system that rivaled the Roman Empire’s. And it was fast.

Still, it had its attractions. Tahuantinsuyu was the first successful, corruption-free welfare state in the world. Few starved, lacked a home, or suffered arbitrary injustice at the hands of local administrators. Corvée labor kept state granaries full to supply the army and areas hit with failed harvests. A sophisticated network of spies and informants provided instant polling to the Inca. Local chiefs and administrators guilty of malfeasance suffered swift punishment. Food supplies, information, justice, and armies moved between what is today Colombia and southern Chile along a road system that rivaled the Roman Empire’s. And it was fast. Trained relay runners with sophisticated mnemonic devices could cover 140 miles in one day and could go from Quito to Cusco in 7 days. (For comparison, the US Pony Express could cover 250 miles per day, or about 10 days from Missouri to California.) Giant llama caravans carried supplies and the armies’ kit over the paved, bridged, and accommodation-replete Royal Roads. And medical care was free, with Inca cranial surgery second to none.

Best of all, Inca rule created peace. The Pax Incaica was achieved with a minimum of bloodshed. Inca ambassadors would regale prospective conquests with the benefits of Inca rule while massive Inca armies camped outside the walls of the target city. Generous bribes were offered both to the chiefs — who would retain their positions, if accommodating — and the populace. Siege tactics could last for years but were always punctuated with truces during planting and harvest. But ultimately, if the peaceful incentives failed, Inca armies were known to inflict hideous torture and annihilate entire male populations.

The Cusco Creed

Inca theology reflected the sophistication of such an advanced civilization. Like the Romans, the Incas allowed their subjects religious freedom but required a few minimal doctrinal concessions. During his reign, Pachacuti convened the Council of Curicancha, which, like the Council of Nicaea, set out to standardize doctrine. It integrated all the beliefs of the newly conquered trans-Andean subjects with Inca theology. Pachacuti started by recognizing three contradictions in traditional simple sun worship:

  1. Inti — the Sun — cannot be universal if, while giving light to some, he withholds the light from others.
  2. Inti cannot be perfect if he can never remain at ease, resting.
  3. Nor can he be all-powerful when the smallest cloud may cover his face.

Therefore, the Council of Curicancha postulated the existence of Viracocha, an all-powerful creator meta-deity that it imposed on all Inca subjects. Viracocha was more or less simply grafted onto what was becoming an extremely complex doctrinal edifice, without conflicting with the minor local deities or the still-intact Inti.

Generous bribes were offered both to the chiefs — who would retain their positions, if accommodating — and the populace.

Not only was Andean animism suddenly flirting with monotheism, but its theological subtleties and elaborations — too numerous and esoteric to describe here, but including much practical wisdom in addition to abstruse arcana — were becoming every bit as mysterious and complex as the Christian creeds of transubstantiation and the Trinity. But instead of rejecting the world, Inca religion embraced and revered it. Alongside the more traditional (at least from a Judeo-Christian-Muslim perspective) — and challenging — coming-of-age ceremonies, fasts, and trials of endurance, there were feasts, sex, drugs (such as ayahuasca), alcohol, and music. Faith, commitment, and pleasure were celebrated.

An important part of Andean animism is its self-help aspect. Practical nuggets of wisdom that promote character, conviviality, and satisfaction — akin to combining the teachings of Dale Carnegie, Stephen Covey, and Deepak Chopra with the Bible — are part of its canon.

Heady stuff.

But probably Andean epistemology’s biggest draw today is its reverence for Pachamama, mother earth, and her myriad natural constituents — soil, water, wind, plants, animals, and even rocks. In today’s enlightenment-seeking, ecology-conscious, hedonistic modern world, Andean epistemology seeks to provide a holistic balance — in living with oneself, society, and the earth.

Animism vs. Atheism? Hector Weighs In

One evening after dinner Wendy sought Hector’s advice on a matter that had been troubling her. She asked me to translate.

She recounted that the Inca Road Project, to her great surprise, had not been well-received by certain members of her Cree Band. In fact, she’d been attacked by a shaman who tried to stab her between the shoulder blades with a porcupine quill. At the last moment, Laurie, her husband, deflected the thrust. Still, the medicine man succeeded in casting a spell. Ever since, Wendy had been troubled by headaches, chest pains and neurasthenia. Could Hector recommend an Andean shaman who might lift the spell?

Instead of rejecting the world, Inca religion embraced and revered it.

By this time I was internally cringing and questioning my involvement with such people and this project. I braced myself for Hector’s response.

Hector looked at both of us thoughtfully and, after some moments’ reflection, said that Wendy had undertaken a big project with tremendous responsibility that generated a lot of envy and jealousy amongst a close-knit clan in an Indian reservation setting. She was no doubt stressed out and weary. Hector suggested rest, a positive attitude, and forging ahead with a clear conscience. When Wendy insisted on a referral to a medicine man, Hector recommended a more conventional doctor.

The contrast struck me. Wendy, a modern Canadian in every sense, had been raised in a fundamentalist Christian home. She had discovered her tribe’s animist Cree beliefs as an adult, and had embraced them — in my opinion — in a New Age-ish sort of way, as a reaction to the hypocrisy that she believed she had encountered as a child. The dream catchers, medicine wheels, frequent allusions to Indian beliefs, and other paraphernalia that were now a part of her seemed necessary reinforcements for a new-found faith.

In fact, she’d been attacked by a shaman who tried to stab her between the shoulder blades with a porcupine quill.

Hector, on the other hand, had been raised in an Andean animist milieu that subtly permeated his practical character. His animism wasn’t a compensatory reaction. Their different approaches got me to thinking about my own epistemological evolution.

Keeping the Faith

Humankind’s oldest and most widely held religious beliefs are animism and ancestor worship — both almost always found together (a linkage that begs to be investigated) — and both labeled a bit too glibly by condescending observers with a monotheistic background, little regard for translation difficulties, and therefore scant appreciation for the depth, complexity, or subtlety of alien beliefs (especially the often slippery concept of hierarchical divinity).

Animism literally refers to a belief that everything — living or inanimate — has an essence — a soul, or anima, if you will — and that this soul need not be a spirit or ghost-like being with a potentially independent existence. Animism usually regards human beings as on a roughly equal footing with (other) animals, plants, natural forces, and even objects — all deserving respect. Humans are considered a part of nature, not superior to or separate from it. However, it is not a type of religion in itself but rather a constituent belief or virtue — analogous to polytheism, monotheism, or even filial piety — that is found in many belief systems.

In the Aristotelian version of animism, all things are composed of matter and form — the essence — the latter being the defining characteristic and corresponding to a “soul,” albeit one that is neither immortal nor deserving of any sort of worship. It was merely an expedient Aristotle proposed to account for things such as the butterfly, whose “matter” undergoes radical transformations during its life cycle but whose “form” ostensibly remains a butterfly. This concept was appropriated and greatly elaborated by the early Christian church into the modern “soul” most of us are familiar with.

There is a constellation of “transformational” retreats and festivals in the United States that cater to New Age seekers. But to keep away the riff-raff, many charge hefty fees.

Likewise, ancestor worship is not a religion but rather a practice, one that is a part of nearly all religious traditions. And it is better rendered as “ancestor veneration,” a more accurate description of what practitioners actually do, which is to cultivate kinship values such as filial piety, family loyalty, and family continuity, often with rituals such as visiting graves, offering flowers and grave decorations, burning candles or incense, reciting genealogies, or simply displaying photographs in special locations. Prayer, actual worship, belief in the transformation of dead relatives into deities, or communication with them may or may not be present.

Today’s shift from a world-rejection zeitgeist to a more holistic, New Age approach to spirituality is by no means limited to Andean epistemology. The August 25 issue of The Economist (“With Spirits Kaleidoscopic”) reports on a constellation of “transformational” retreats and festivals in the United States that cater to these seekers: the Beloved Festival, SoulPlay, Sonic Bloom, Kinnection Campout, Stilldream, Wanderlust, Symbiosis, and, yes, Burning Man. But to keep away the riff-raff, many charge hefty fees. Tickets for the Beloved Festival start at $265.77.

Not Keeping the Faith

Sometimes, when pressed for my religious beliefs, I’ll respond — as a heuristic device with a less polarizing tendency than, “I’m an atheist” — that I’m an adherent of reformed ancestor worship-animism. In other words, I venerate the DNA line that I now represent; and my habits are respectful and frugal — about life and the things that support it. Is there a conflict between these values and atheism? I don’t think so. What do you think?

Epilogue

After a few years, the Royal Inca Road Project was put on hold indefinitely. Wendy and Laurie, the Cree principals behind the project, waged a (so far) losing battle to maintain adequate funding from corporate and government donors while holding on to full-time jobs. But their efforts continue. They moved to Stavanger, Norway, where Laurie worked as a pilot for a subsidiary of SAS airlines and Wendy managed a branch of Statoil’s international office for indigenous affairs, a position that allowed her to pursue their quest.






A Train to Nowhere


The other day I watched Snowpiercer. I have a taste for post-apocalyptic science fiction, and also for stories that illustrate political ideas, and Snowpiercer is both of those. Co-written and directed by South Korean filmmaker Bong Joon Ho, the 2013 movie also has a strong flavor of anti-capitalism. Wondering who had picked up on that, I googled “Snowpiercer, Socialism,” then “Snowpiercer, anti-capitalism.”

The socialists had picked up on it. On a web page called Socialist Action (“In Solidarity with Workers and the Oppressed Everywhere”) writer Gaetana Caldwell-Smith calls Snowpiercer “an original, inventive, futuristic work” that pictures “what might happen in the future if the outmoded and anarchistic capitalist system goes on unchecked for much longer.”

On another socialist web page, Jacobin writer Peter Frase calls Snowpiercer an “action-movie spectacle” with “a message of class struggle” that “evokes some of the thorniest dilemmas of socialism and revolution, in the twentieth century and today.”

In an attempt to reverse global warming, humanity overdid it and froze the planet.

Your Film Professor, the highbrow lefty, praises Snowpiercer’s “incredible capacity to cuttingly capture — or ‘cognitively map’ — how our current and future dystopian milieu is informed by our (globalized) capitalism system. . . . The reason this film is just SO important is because it cuts through the fog of ideological distractions (e.g., consumerism, status quo/reformist [capitalistic] rhetoric, patriotism, nationalism, etc.) and didactically spells out the REAL of ruling class ideologies in a way that is to my mind almost miraculous.”

I don’t know about all that. It does tell you how some on the Left think (and write).

Snowpiercer is a science fiction story set on a frozen earth. In an attempt to reverse global warming, humanity overdid it and froze the planet. But a capitalist named Wilford, who was fascinated with model trains as a kid, had put his corporate fortune into a high-speed train with an enclosed ecosystem: tanks of fish, hooches of chickens, an engine to propel the train and keep the contents warm. For 17 years, Wilford’s shinkansen has been rushing over the world’s continents, one full loop each year, pushing through the wasteland of snow and frozen machines around it. Every human alive is a passenger on this train.

After the police-state cars with hooded goons wielding truncheons and automatic pistols come the lumpenproletariat at the tail end.

It doesn’t make a lot of sense — but cut some slack for surrealism. The story of Snowpiercer is from a graphic novel, in other words, a comic book. A French socialist comic book. The film is quite well made, and on the Internet Movie Database (imdb.com) is rated 7.1. That’s not quite up to the 8.2 rating of V for Vendetta, another political tale based on a comic book, but well above the 6.2 for Waterworld (1995).

Snowpiercer’s train comes with a recognizably Marxist class structure. Wilford, the egoistic owner played by Ed Harris, is the deity at the train’s head. Next in line are train cars of sybarites with their club music, dancing, and drug-fueled orgies, then the genteel with their classical music and handmade sushi, then the obedient workers tending the orange groves, tanks of fish, and hooches of chickens, and the smiling teacher (Alison Pill) in the grade-school car of fresh-looking kids. After the police-state cars with hooded goons wielding truncheons and automatic pistols come the lumpenproletariat at the tail end. In Marxist terms, you might think of them as “workers,” but they mostly just suffer. They live in rags and squalor and are terrorized by goons. For food they are issued “protein bars” made from pulverized cockroaches.

And they are the folks the movie is about.

Wilford’s mouthpiece to them is an unctuous woman played by Tilda Swinton, who was the White Witch in The Chronicles of Narnia: The Lion, The Witch and the Wardrobe (2005). Early in Snowpiercer she instructs the rabble in a style that parodies Margaret Thatcher. “Order is the barrier that holds back the frozen death,” she declares. “We must all of us, in this train of life, remain in our allotted station. We must each of us occupy our preordained particular position.”

The society around us has a “one percent,” but its membership is not fixed. People go in and out of the “one percent” all the time.

She holds a man’s shoe and puts it on the head of one of the proles. “A hat belongs to your head,” she bellows. “A shoe belongs to your foot. I am a hat. You are a shoe. I belong on the head. You belong on the foot.”

And again: “I belong to the front. You belong to the tail. When the foot seeks the place of the head, a sacred line is crossed. Know your place.”

Here is the message of the movie. Society — capitalist society — is a hierarchy of assigned privilege.

Well, the society around us is surely a hierarchy, just as its Canadian defender, Jordan Peterson, allows, though he calls it a hierarchy of competence. And it is mostly that, else today’s world would not work. It has a “one percent,” but its membership is not fixed. People go in and out of the “one percent” all the time. Margaret Thatcher was part of the political one percent, but she famously started out as a shopkeeper’s daughter, in what the Marxists call the petite bourgeoisie.

Capitalism is an economic system of private workers and owners who buy and sell in a market, making their own decisions. In Snowpiercer there is no market. Wilford’s chickens produce eggs, and one of his men wheels them in a cart and gives them away. He doesn’t sell them. There is no buying or selling in Snowpiercer and no money. There is no property other than Wilford’s. The supposed “Wilford Industries” cannot buy or sell anything, because there is no other entity to sell to or buy from.

As an ideological venture, a kind of leftist Anthem or Animal Farm, Snowpiercer does seem to be part of something.

Watching Snowpiercer, you can’t help but identify with the lumpen heroes (especially the characters played by John Hurt and Octavia Spencer) who disobey the faux Margaret Thatcher and refuse to remain “shoes” on the godhead’s foot.

But why care about a five-year-old movie that had only a limited release in the United States? Worldwide it did better; in its first year, Snowpiercer brought in $87 million, more than half of what V for Vendetta did. As a business venture Snowpiercer did all right. As an ideological venture, a kind of leftist Anthem or Animal Farm, it does seem to be part of something.

There has been a small upsurge of socialism in the United States. So far it is a pale image of the leftist tide of the 1930s, when private investment had collapsed and millions were out of work. Then it looked to many as if capitalism was finished. In the 1930s socialism was a relatively new thing, and intellectuals might be excused for not knowing what a defective product it was.

Now my hometown, Seattle, has a socialist on its city council. Her supporters are raucous and young, full of resentment of the billionaire rich. Maybe they believe because they read Karl Marx and Thomas Piketty, but more likely because they have imbibed their history and politics from left-wing teachers, or maybe from graphic novels and movies like Snowpiercer.

In the 1930s socialism was a relatively new thing, and intellectuals might be excused for not knowing what a defective product it was.

But socialism — really? Like Peterson, I want to yell at them: Did you miss the 20th century? And a lot of them did. They are that young.

I lived through the last third of that century as an adult. I saw socialism collapse in Europe, abandoned in China, and decaying in Cuba. Now it is collapsing again in Venezuela. It’s time for the socialists to give up.

All the Left’s bellowing about hierarchies and social classes makes me think of the guys I grew up with. The son of a small-town optometrist became an airline pilot. The son of an aerospace engineer became a sheet-metal worker, then lost it all when he married a crackhead. We are all of retirement age now, though some of us are still working and one, a teacher, has been retired on a fat pension for more than ten years. I have a grade-school friend who lived for years in a ruined trailer and a former colleague who lived for seven years in his truck. Both have now been put in decent housing, courtesy of the welfare state.

The kids I grew up with did not achieve equality, at least not as the Left defines it. They weren’t promised it, didn’t aim at it, and didn’t get it. They went in all different directions. None was assigned his position in life, and most of them, over time, changed what that position was. No doubt some of their paths were shaped by “power relations” under capitalism, and I know some were touched by luck. But where each one ended up depended mostly on the decisions he made, the sort of work he did and how diligently it was done, how much present satisfaction was sacrificed for the future, and, crucially, on whom he married.

Why would anyone think his world is like Snowpiercer?

A software man of the new generation predicted that robotics will extinguish so many jobs that the government will have to offer a universal guaranteed income.

I think back to when I was 20, and a student at the university. I used to go on long walks through a city neighborhood with big houses, many of them brick, built in the early 20th century. I was bunking in a rental house shared with other students, eating meals of hot dogs and ramen and working a part-time job for $1.75 an hour. That neighborhood of big homes was a foreign country to me. It would have been easy to think I was looking at the brick walls of an impenetrable class, and that I was doomed to a life of instant noodles. But I wasn’t.

There is another thought, which I heard recently from a software man of the new generation. Noting the divide in his fellows between those with brain work and those living in parents’ basements, he predicted that robotics will extinguish so many jobs that the government will have to offer a universal guaranteed income.

I do see the loss of jobs. My health clinic, which used to have a row of clerks checking in patients, has replaced them with touchscreens. The local superstore (which would have been called a department store, years ago) has replaced half of its checkers with touchscreens. Several downtown parking garages that used to employ Ethiopians to collect the money have replaced them with card scanners.

Then again, three blocks from my house is a shop that concocts such fluffy desserts as Mexican chocolate pie. A pie from that shop costs $36; a slice, $6. That sort of pie was not available here two decades ago. Nor was nitrogen-infused ale. Or black sesame ice cream. My neighborhood now has artisan bread, artisan ice cream, artisan chocolate, artisan beer and, more recently, artisan spirits. Within a few miles are stores offering artisan cannabis.

A young man I know, the son of a bank vice president, has chosen to be an organic farmer. He shares an old house on a muddy farm and produces organic vegetables and free-range, grass-fed beef. He sells his artisan hamburger for $6 a pound.

Capitalism can be about much more than efficiency.

A century ago, a middle-class family here might have had a Swedish girl to cook, clean, tend the children, and mend the holes in socks. Now we have au pairs, housecleaning services, gardening services, and (I can hardly believe this) dog-walking services. I even know of a poop-removal service for people who keep dogs in backyards. In my neighborhood the environmentally sensitive no longer put in concrete walkways. They hire Mexican immigrants to put in brick walkways, carefully laying each brick by hand, using no mortar, so that rainwater can soak sustainably into the earth.

Back in the 1970s, my university professor of marketing predicted that the future of consumer products was Miller and Bud, two brands distinguishable only by labels and the ads on network television. All consumer markets were going to go that way, he said. I suppose it would have been the most efficient outcome. But look at the beer shelf at your grocery today. And what has happened to television? Capitalism can be about much more than efficiency.

The young man selling $6-a-pound hamburger makes far less money than a programmer at Amazon. Probably he officially qualifies as poor. But he is no serf. To him, the programmers working 60-hour weeks are the serfs.

Snowpiercer is, as the socialists say, “an original, inventive, futuristic work,” totally unlike the black-and-white TV westerns and World War II shows I grew up with. I enjoyed it. I cheered for the rebels at the back of the train along with everybody else. I just hope that most of those who saw the film took it as an artisan product of an affluent culture and not as any sort of wisdom on the world around them.






Ominous Parallels?


A congressman wrote to a friend about an argument on the floor of the House of Representatives:

I never said a word to anybody, but quietly cocked my revolver in my pocket and took my position in the midst of the mob, and as coolly as I write it to you now, I had made up my mind to sell out my blood at the highest possible price.

An historian described the atmosphere in the Capitol in this way:

Recurrently, speakers lashed out in passages that threatened to precipitate a general affray. . . . Practically all members were now armed with deadly weapons. In both chambers, Senator Hammond said, “the only persons who do not have a revolver and a knife are those who have two revolvers.” For a time a New England Representative, a former clergyman, came unarmed, but finally he too bought a pistol. A Louisiana Congressman threatened to fetch his double-barrelled shotgun into the House. Supporters of both parties in the galleries also bore lethal weapons, and were ready to use them.

I quote from Allan Nevins’ The Emergence of Lincoln (New York, 1950; 2.121, 124), the best study I know of American politics in the late 1850s. The passages I cite refer to events of early 1860. In the middle of 1861, such events and the emotions that accompanied them produced their final effect — civil war.

What produced this expansion of political and military force, much of it permanent, though unimaginable in earlier American history?

Daniel Webster (and many others) had warned that factional disputes, intensified without limit, could result only in catastrophe:

Sir, he who sees these states, now revolving in harmony around a common centre, and expects to see them quit their places and fly off without convulsion, may look the next hour to see the heavenly bodies rush from their spheres, and jostle against each other in the realms of space, without producing the crush of the universe. (Speech in the Senate, March 7, 1850)

The warnings were heard and understood; yet, as Lincoln was to say in his second inaugural address, “the war came.”

What produced this awful effect, this war in which a million people perished, and more were dreadfully wounded? What produced this war of limbs hacked off without anesthetic, of towns put to the torch, of economic and psychological devastation on an enormous scale? What produced this expansion of political and military force, much of it permanent, though unimaginable in earlier American history? And what produced the peace that followed the war, a peace in which black people, the objects of the victors’ alleged solicitude, languished in poverty and systematic humiliation, generation after generation? And this sorry peace was inseparable from the war itself.

In the second inaugural Lincoln identified what he considered the causes of the conflict:

Both parties deprecated war, but one of them would make war rather than let the nation survive, and the other would accept war rather than let it perish.

Lincoln’s words impute to the major actors more conscious choice and final purpose than most of them felt. Jostling one another in the pursuit of immediate ends, leaders on both sides employed political methods that were not intended to produce a war, yet turned out to be the best means of doing so.

Let me put it in this way. Suppose you want to effect a violent disruption of human life. Here are some things you can do.

1. Convince yourself that you and your friends are right, entirely, and no one else is right, at all, about anything, thereby creating as many political divisions as possible. Reject any speculation that other people, though wrong, may have serious reasons for being that way.

A flood of propaganda spread the idea that no one who disagreed with the latest version of partisan orthodoxy could possibly have any but immoral reasons for doing so.

2. Try to make sure that the political field is cleared of everyone but deadly enemies.

It is often said, and this is true, that before the 1830s Southerners were in general agreement that slavery was an evil, and many Southerners were more than amenable to limiting and eventually getting rid of it. There is also general agreement that the great majority of Northerners were happy enough to endorse ideas for the gradual abolition of slavery; indeed, every Northern state that started with slavery had successfully ended it. Even in the slave states, there were large numbers of free black people — by 1860, 250,000 of them.

Yet 30 years of being labeled enemies by both the partisans of slavery and the partisans of abolition progressively immobilized the ordinary, mildly well-intentioned middle range of public opinion. A flood of propaganda, emanating from each camp of zealots, spread the idea that no one who disagreed with the latest version of partisan orthodoxy could possibly have any but immoral reasons for doing so. Of the thousands of low points in this supposed dialogue, I will mention one — the political emasculation of Webster, formerly the North’s most admired public figure, at the hands of his fellow New England intellectuals, for the crime of supporting the Compromise of 1850. Thus Whittier, the supposedly gentle Quaker poet, depicting Webster as Satan in hell and Noah in his drunkenness:

Of all we loved and honored, naught
Save power remains;
A fallen angel’s pride of thought,
Still strong in chains.

From those great eyes
The soul has fled:
When faith is lost, when honor dies,
The man is dead!

Then, pay the reverence of old days
To his dead fame;
Walk backward, with averted gaze,
And hide the shame! (Whittier, “Ichabod”)

Note the instructive tone, the ecclesiastical certainty (“the soul has fled”), the moralistic comments and commands. These methods, though repulsive to almost everyone, are necessary to your purpose. You cannot be too self-confident when affixing the mark of Cain. Guard yourself: you must never become conscious of the irony involved in damning people while pretending that they are only worth ignoring.

Leaders on both sides employed political methods that were not intended to produce a war, yet turned out to be the best means of doing so.

3. Once you’ve converted potential collaborators into scorned opponents, and multiplied those opponents, do your best either to silence or to enrage them. Southerners were better at this than Northerners. In the South, the mails were censored to prevent dissemination of anti-slavery opinion, and mobs were formed to rid communities of people who gave signs of being anti-slavery; in ten Southern states, the Republican Party wasn’t even on the ballot. But in the North as well, jurists, writers, and teachers were targets of political correctness. Mobs were raised against “agents of the South,” non-abolitionists were purged from Protestant clergies, and politically active people were hounded into choosing between an official Democratic Party, directed by an incompetent president, which insisted that the Kansas-Nebraska Act be renounced and reviled, and a rising Republican Party, which insisted, for opposite reasons, that the Kansas-Nebraska Act be renounced and reviled.

4. Turn marginal positions into moral and political tests. The great issue of the 1850s was the question of whether slavery should be permitted in the Western territories, where no one but wild fanatics had ever believed that slavery could subsist. The North nonetheless demanded that it be banned by act of Congress, and the South nonetheless demanded that it be promoted by act of Congress. Sectional moralists indignantly rejected the Kansas-Nebraska idea, once favored by the South, that the question be left up to the people of the territories. Here was an issue of no practical importance, but it became the test of political viability. Emphasizing politically marginal questions makes it certain that marginal politicians will rise to the top; and if trouble is what you want, these people will give it to you.

5. Try to win, not by debate, but by definition; this is what “principled” people do. To the South and its friends, Republicans were always Black Republicans; that’s what they were. To radical Northerners, all proposals from south of the Mason-Dixon line were by definition products of the Slave Power, which was attempting to spread chattel slavery throughout the North, and ultimately to rule the Western hemisphere. It followed that useful proposals, such as gradual emancipation, which had attracted great sympathy on both sides of the Ohio, were by definition entering wedges of the opposition’s Satanic schemes, to be rejected out of hand.

Emphasizing politically marginal questions makes it certain that marginal politicians will rise to the top; and if trouble is what you want, these people will give it to you.

6. Do your best to promote identity politics — the quest for power considered as a right derived from group membership. Southern partisans applauded the Supreme Court’s bizarre decision in the Dred Scott case, asserting that the Constitution governed everyone but protected only persons of non-African descent, while the cultural leaders of the North assumed that the Constitution was of no effect whenever it contradicted the will of God, which was effectively the will of Northern clergymen.

7. Render yourself blind to your own hypocrisy. The goal of hardcore abolitionists was (hold on to your hat) the secession of the North from the South, an act that would relieve the North of any possible association with slavery. To say that this idea expressed maximal concern for the tender consciences of abolitionists and minimal concern for the welfare of the slaves would be a pathetic understatement. As documented by such historians as Edward Renehan (The Secret Six, 1997), few abolitionists (John Brown was an exception) had any respect for actual, living African-Americans. Distinguished leaders of the abolition movement spoke of them in terms I do not wish to quote. Most hardcore abolitionists were also pacifists, advocates of “non-resistance.” Yet when secession happened, they became fervent advocates of violence as a means of crushing the other section’s suddenly illegal and immoral rupture of the union. Southern publicists cultivated a similarly gross hypocrisy — a growing emphasis on the Christianizing and civilizing effects of slavery, amid increasing attempts to criminalize the education of black people and curtail their practice of religion.

8. The fact that you can’t perceive your hypocrisy doesn’t mean that other people can’t; to prevent its public disclosure, you must therefore remove from positions of influence everyone who sees you as you are. Any pretext will do. You can follow the example of the religious proponents of slavery who removed honest preachers from the pulpit, as punishment for being divisive. Or you can take your cue from the religious opponents of slavery, who attacked all who differed with them as foes of Christian love.

Few abolitionists had any respect for actual, living African-Americans. Distinguished leaders of the abolition movement spoke of them in terms I do not wish to quote.

9. Flirt with, encourage, and finally idealize violence. In 1856, Charles Sumner, Republican of Massachusetts, delivered a speech in the Senate that was so insulting to a Southern senator, a person who had aided and befriended him, that Stephen Douglas, listening, muttered to himself, “That damn fool will get himself killed by some other damn fool.” The candidate for other damn fool was Congressman Preston Brooks, Democrat of South Carolina. He didn’t try to kill Sumner, only to humiliate him, but he went to the Senate chamber and assaulted him with a cane. Once he had started, he became more enthusiastic and wounded Sumner so badly that he might have died. The response of Southern partisans was to celebrate Brooks’ achievement, often with souvenirs of model canes, as if caning your political foes were an act of Arthurian virtue. In 1859, John Brown’s attempt to abolish slavery by inciting a servile insurrection — a campaign in which the first enemy slain was Heyward Shepherd, a free black man — sent Emerson, Thoreau, and other Best People of the North into paroxysms of idolatry. Their celebrations of Brown were immediately followed by a wave of Southern lynchings of people erroneously suspected of being in league with him. The participants seem never to have regretted their mistakes; it was all in a good cause.

When things have gone that far, what’s left but war? It’s true, few people, North or South, black or white, wanted a civil war; comparatively few people in the South actually wanted secession, and none of them would have wanted it if they’d had enough sense to visualize its consequences. But when zealots who hold political power cannot stand to be in the same room with one another, except when they are armed — physically or rhetorically — with weapons of destruction, the only choice remaining is the choice between peaceful dissolution and civil war. And few people of that kind will settle for peaceful dissolution.

Once Brooks had started, he became more enthusiastic and wounded Sumner so badly that he might have died.

So much for the events and feelings of the mid-19th century. Do they have anything to say to us, about our own time?

You can answer that question as well as I can. The idea of “ominous parallels” is basically a joke — nothing is really parallel in history, and the most ominous thing about purported parallels is probably the strength of people’s belief in them. But alleged parallels can suggest real similarities, however distant — and important dissimilarities, too.

When I compare 1860–61 with 2018–19, one dissimilarity seems especially important: the difference in intellectual culture, historical knowledge, and capacity for complex political thought between the leaders of then and the leaders of now. Seward, Lincoln, Crittenden, Davis, Benjamin, Douglas, Stephens, Houston, and immediately before them, Webster, Benton, Clay . . . We can discuss their delusions, their false perspectives, their sacrifices of long-term to short-term benefits, their strange errors of judgment. But please show me a list of equally intelligent, capable, knowledgeable, or even personally interesting political leaders in America today.

You can’t? That’s what I call ominous.






Beer, Bikes & Brexit


“Ride left, die right!”

Our mantra — often a cheerful, running admonition, sometimes shouted in panic, but mostly a silent meditation — accompanied us as we pedaled our bikes from the toe of Cornwall to Scotland’s sagittal crest during June and parts of May and July. The Land’s End to John O’Groats quest has become something of an obsession, not only in the UK but among a cross-section of aficionados worldwide — a British version of the Way of St. James, if you like. Like the Camino de Santiago, it has many alternates, the shortest of them 874 miles. Our choice, the Sustrans route on the National Cycle Network, is 1,200 miles long.

One aspirant, who with his wife runs a B&B in Bodmin, Cornwall (in which we overnighted), was leaving for John O’Groats the following day to run one variant. Yes, run. Or as proclaimed by the placard on the back of the ungainly plastic box that contained his essentials (including a sleeping-rough kit) and was duct-taped to the tiny day-pack he’d strap to his back:

Colin is running for ShelterBox disaster relief charity
1 man 3 Peaks
1 ShelterBox
1,000 miles, a marathon a day

The 3 Peaks were Ben Nevis, Scotland’s highest; Scafell Pike, England’s highest; and Brown Willy, a small hill atop Bodmin Moor, Cornwall’s highest point (and one over which Tina, my wife, and I biked on our adventure). The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon, as these inconceivably long long-distance runs are called.

He wasn’t the only eccentric adventurer we encountered. Another runner, whom we met at John O’Groats just as we were finishing, was just starting out. Unlike Colin, he’d packed his gear into a two-wheeled trailer attached to his chest with a harness. As we watched him start, he jogged into the arm swing and curious gait that ultra-marathoners affect to make it look as if they were running when in fact they proceed little faster than a fast walk, or about four miles per hour. We never found out his raison de run. One tiny, 73-year-old man with a backpack the size of a manatee and a pace that rivaled varve deposition in Loch Lomond (where we encountered him) was doing the walk to commemorate the Queen’s longevity. He presented us with his card. It requested donations to cover his expenses.

The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon.

Ian Stocks was bicycling a 20-day, 1,500-mile variant that included the UK’s easternmost and westernmost salients, for Alzheimer’s Research UK. At a projected 75-mile-per-day pace he ought to have been much further south than where we met him. I noticed that his gear — bike, panniers, electronics — all looked new, and my BS antenna began to quiver. The received wisdom in the classical-liberal view is that as welfare programs take over the functions of private charities, the latter tend to atrophy. Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

I’d once been solicited for a contribution to “raise awareness for breast cancer” by a group of breast cancer survivors who were planning to ski across the Greenland ice cap. They were all seasoned adventurers. I knew what they were up to. Contributions would pay for gear and transportation first; any money left over would go to the “raise awareness” bit.

At this point, let me clarify a popular misconception concerning folks who participate in extreme sports, objectives, deeds, adventures, and such for charity. Their shtick is to persuade the public that they are willing to undergo extreme exertion and privation for a good cause. But nothing could be further from the truth. They do what they are doing because they are addicted to adventure, unique accomplishments, firsts, personal challenges, transcendence of the quotidian, making their mark, even adrenaline or drama; in a word — they love what they do. But extreme adventures are costly, so many fund their objectives by invoking the charity label. I told Trish, the leader of the Greenland expedition (who, years before, had taught me to cross-country ski), that I needed my money to fund my own adventures and that I wished them luck. She didn’t take that well.

Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

So I checked out Ian Stocks’ website. What a surprise! All contributions go directly to the charity; nothing passes through Ian’s hands. Ian’s motivation is his father’s dementia. As of this writing, Ian is still behind schedule, mile-wise, but he has raised over 100% of his targeted contributions.

To me the more fundamental question is why this whole charade is necessary. If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment? Perhaps, in a gladiatorial sort of way.

My wife Tina and I had decided to tackle the end-to-end ride for purely selfish reasons: beer — unlimited traditional cask ales (more on them later) — single malt whiskies, and daily feasts of delicious UK food: the full breakfast fry — bacon, sausage, egg, baked beans, fried mushrooms and tomato, black pudding, hash browns, and fried toast; the best curry restaurants in the world (Kashmiri, Bengali, Pakistani, Bangladeshi, South Indian); fish and chips to die for; carveries featuring four roasts with all the trimmings, including Yorkshire pudding; and meat pastries that defy counting: steak and ale and steak and kidney pies, sausage rolls, Cornish pasties, shepherd’s and pork pies, and many local, flaky variants. And, of course, the chance to explore a country in low gear, meet the people — prone to extreme civility with a subtle but outrageous sense of humor — and get our fair share of exercise in order to continue such adventures well into our senility.

If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment?

Many end-to-end bikers from a variety of countries crossed our path. Unlike motorists, long-distance bikers always stop to pass the time of day, to inquire about one another’s journey, objectives, provenance, etc. Nearly all who were heading north targeted John O’Groats. Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow. Only the Brits got the joke.

Two separate couples had come all the way from New Zealand. I asked them why they’d come halfway around the world for this biking adventure when they lived in a country famous for its natural beauty and low population density — a country that would seem to be a biking paradise. Both couples shook their heads and looked at each other. Each responded — separately — that New Zealand is a relatively new country and so lacks a well-developed network of old or secondary roads crisscrossing the two main islands. Only primary highways, mostly two-lane, bind the country together. These have narrow shoulders, when they have them at all, and drivers are not sensitive to bikers.

The Road Less Traveled

The Sustrans route we chose uses traffic-free paths and quiet single-lane roads, hence its 1,200-mile length. Those quiet single-lane roads have their own quirks. Nearly all are bordered by 6–9’ hedges, perfectly vertical and maintained by vertical mowers. They are so narrow that planners have installed “passing places” and “lay-bys” about every 100 yards. A car meeting an oncoming or overtaking vehicle — or a bike — must wait at one of these for the other to get by. However, the Sustrans route also seems to go out of its way to stitch together every hill top, traverse watersheds cross-wise instead of following drainages, and generally adhere to Mae West’s observation that “the loveliest distance between two points is a curved line.”

Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow.

England’s myriad roads, in plan view, mimic the pattern formed by cracked tempered glass — an intricate grid twisted and crabbed beyond any recognizably geometric shape and resembling a Voronoi tessellation. They started out that way and only got more complex as time went on. When Rome conquered England, according to Nicholas Crane in The Making of the British Landscape, “the web of footpaths and tracks serving settlements were an ill-fitting jigsaw of local and regional networks which were difficult for outsiders to navigate.” The bends and salients in England’s roads had evolved over hundreds (or even thousands) of years to link settlements, sources of raw materials, strongholds, religious sites, and so on. These evolved organically before the invention of bulldozers and certainly of modern road engineering with road cuts and fills that reduce gradients and straighten out unnecessary curves. Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

The Hills Are Afoot

Cornwall, the forgotten Celtic country, was a disheartening start to a very challenging ride. Not only does the Cornish section of the route gain little in a northerly direction — and sometimes even trends south — but its ride profile also resembles tightly clustered stalagmites, with significant climbs over Bodmin Moor, the St. Buryan and St. Columb Major summits, and a queue of lesser hills. Our old bodies required two rest days in quick succession — at Truro and Bude — if we were to have any chance of reaching Scotland pedaling.

Cornwall might seem forgotten because it’s bedeviled by an identity crisis. Although Celtic in origin, distinct in language and separate as an entity from England, with many unique cultural traits, it somehow missed Tony Blair’s devolution revolution in 1997. Rob, our host at the very modest Truro Lodge, told us that Truro, a cathedral city, was the capital of Cornwall. Since he’d been Truro’s last mayor, I asked him if that made him First Minister of Cornwall. He smiled wryly, admitting that Cornwall had had such an influx of English settlers that there wasn’t much enthusiasm for Cornish devolution, much less independence.

Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

But there is some ambivalence. The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard. Outside Truro Cathedral, a couple of buskers with fiddle, guitar, and microphone played traditional tunes to an enthusiastic audience. And in Penzance, along the waterfront promenade, a Cornish band led by a baton-waving, tails-wearing drum major marched in front of our hotel evenings at dusk playing Cornish Morris-type music (I later found out that the all-volunteer ensemble was short on musicians and was soliciting participants).

In 2007, David Cameron promised to put Cornwall’s concerns “at the heart of Conservative thinking.” However, the new coalition government established in 2010 under his leadership did not appoint a Minister for Cornwall. Although Cornwall only holds the status of a county in Great Britain, as recently as 2009 a Liberal Democrat MP presented a Cornwall devolution bill in Parliament (it got nowhere), and advocacy groups demanding greater autonomy from Westminster have been waxing and waning over the years.

On June 5 we left Cornwall and entered Devon, the western reaches of Thomas Hardy country, complete with irresistibly cute, white-washed, thatched-roof cottages. Though every bit as hilly as Cornwall (1,640’ Exmoor, the Mendip Hills, and the Blackdown Hills to the fore), it welcomed us with a roadside sign declaring: Where there are flowers there is hope.

The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard.

“Oh, how much fun!” Tina declared — her enthusiastic response to any novelty, serendipitous triviality, unanchored excess of exuberance, or even the prospect of another 20% uphill grade. Up these hills our odometers would sometimes only display zero miles per hour, even though we were making progress. To pass the time on the slow pedal I recounted the libertarian themes in Hardy’s Jude the Obscure, a novel she’d never read: his depiction of marriage as a crushing force, his belief that organized religion complicates and obstructs ambition, and his critique of the Victorian class system.

At Glastonbury (another rest day) our route finally turned resolutely north. The famous abbey town and final resting place of the legendary King Arthur has become a bit of a Wessex Sedona, with crystal shops, goddess centers, metaphysical bookstores, vitamin, herb, and natural food shops, and a vibrant cast of street characters in a variety of stages of mendicancy, sanity, and hygiene, exhibiting extremes of sartorial flourish from total nakedness through purposeful dishevelment to natty eccentricity. Even our B&B hostess had a claim to fame: Sarah Chapman held the Guinness Book of World Records women’s record for walking five kilometers upright on her hands! But the ruins of the abbey are the town’s saving grace. Legendarily founded by Joseph of Arimathea in 63 AD and associated with Saints Columba, Patrick, and Bridget, the abbey was sacked and burned under Henry VIII after he broke with Rome and it refused to submit to him instead of the Pope.

By the time we reached Bristol we were deep in the bosom of Old England. Bristol, once England’s doorway to the world, is a thriving, lively, modern city. In its harbor docks the Matthew, a loving replica of John Cabot’s ship. A plaque next to his oversize statue reads: In May 1497 John Cabot sailed from this harbour in the Matthew and discovered North America. The only drawback to being a port city is the seagulls, loud giant avian dive-bombers. They are brazen and incorrigible in their quest for food. Early mornings reveal overturned trash bins throughout the city. Gulls have been reported snatching burgers out of hands and even whacking a pedestrian eating a snack on the back of the head so that he drops it and the gull steals the tidbit. One municipal mayor complained that gulls are a protected species.

On to the Midlands

Past the moors and fens, the landscape turned to rolling farm and toft country dotted with rhododendron copses. Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste. One evening at a pub, a morris troupe, performing traditional English music and dance dating from before the 17th century, enhanced our after-ride pints. The all-male troupe, wearing bells — perhaps the original source of the phrase “with bells on their toes” — and accompanied by a squeeze box, was delighted to entertain foreigners familiar with morris dancing. We stayed in an old Tudor building with buckled floors, absurdly low pass-throughs, and narrow winding stairs, whose commemorative plaque read: Crown Hotel: Rebuilt in 1585 on site of a much earlier inn destroyed by the fire of 1583. A coaching stop on the London-Chester run.

By now Britain’s schizophrenic weights and measures standards were beginning to puzzle us. Road distances were in miles, temperatures in centigrade, beer and milk in pints, and folks gave their weight in “stone,” with large weights measured in Imperial tons. While the metric system may be simpler in computation, the English system is ergonomic and evolved organically, thereby rendering it more intuitive. And, most curious of all to me, a northern country that in summer experiences 19 or 20 hours of daylight and invented standard time, which it measures from the Prime Meridian of the World at Greenwich, succumbs to the idiocy of Daylight Saving Time.

Refreshingly, the government has not been able — by and large — to impose metric mandates or force observance of DST throughout the realm. When the time changes, businesses readjust their opening and closing times to GMT. With barely four or five hours of total darkness, how much daylight needs to be “saved”? As to the other weights and measures, one informant told me that, except for centigrade temperatures, all new and traditional systems coexist peacefully, with only a handful of rigid requirements such as strong spirits in pubs, which must be sold in 25ml, 35ml, 50ml, and 70ml increments.

Up these hills our odometers would sometimes only display zero miles per hour, even though we were making progress.

Worcester (pronounced Wooster) is the home of Worcestershire Sauce and site of the last battle of the Civil War, in which Cromwell decisively defeated the Royalists. Even more importantly, Worcester Cathedral holds the remains of King John, he of the Magna Carta. The mausoleum was extremely moving, not just for its considerable age and all the empty space surrounding it, but also for the immense significance of Magna Carta itself. For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty through the separation of powers and the recognition of the rights of a portion of the populace.

In keeping with Sustrans’ objective of avoiding traffic, we bypassed Birmingham, Britain’s second largest city. Not so for Manchester. Inevitably, we got lost there. Signage was poor, our map not detailed enough, and Google not up to the task. So, contrary to the clichéd stereotype of a male, I asked a passerby for directions. The lady responded, “You’re in luck, I’m a geographer. Where are you going?”

Now, asking passersby has its drawbacks — too many to detail here — but, in this instance, we weren’t going to a particular place but rather trying to find National Cycle Network Route 6 to get back on track. Never mind; an academic geographer informant — here was the gold standard! After detailing our trip to her I showed her our guidebook’s map. She was no biker and had never heard of the National Cycle Network. She wasn’t impressed by either our guidebook or our map, of which she couldn’t make sense. At once she launched into a tirade about computer-generated maps and lectured us on the preeminence of British Ordnance Survey maps.

Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste.

I responded that she was absolutely correct, except that we would have needed over 100 Ordnance Survey maps to cover our entire route, at a prohibitive cost in space and pounds sterling. Then she and Tina, interrupting their on-again, off-again chitchat between attempts to solve the riddle at hand, pulled out their smartphones — the last resort of the self-unreliant — and sought guidance from Google.

By now I was losing patience. We’d eaten up precious time getting nowhere, so I fell back on a navigator’s last resort: bracketing. I thanked our geographer for her help, gently disengaged Tina from her, and explored four separate directional salients for a mile each, starting from the roundabout where we’d stopped, in order to determine which of them led toward where we were headed. Through the process of elimination, a compass, a closer examination of the clues in our guide, and not a little intuition, we found our route. Lo and behold, we were nigh on it! A block further along the last salient explored, we encountered a National Cycle Network Route 6 sign.

The lessons: Never mistake a geographer for a cartographer: the former specializes in the distribution of the human population over the surface of the land; the latter makes maps. And . . . have confidence in your own abilities.

North by Northwest

The Yorkshire Dales, Cumbria, and the Lake District welcomed us with a smorgasbord of all-you-can-climb hills, appetizers to the Scottish Highlands. By now we’d talked to a lot of innkeepers, publicans, bikers, walkers, shopkeepers, and random strangers. With the 70th anniversary of the National Health Service (NHS) imminent on July 5, I sought out the infrequent opportunities to gather anecdotes about people’s experience with the service, especially now that Conservative governments had floated proposals to make the NHS financially more viable, most of which included increasing copays. I never brought up the subject but always managed to get folks to elaborate on offhand remarks. One lady mentioned that she’d recently broken her wrist playing cricket. So I asked her if the NHS had taken care of her (Britain has a dual — private and public — insurance and medical system).

For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty.

“Yes, they did,” she said. But then she backtracked, saying, “No, they didn’t.” So she explained. She went to the nearest hospital with her hand bent at an unnatural angle to her forearm. The staff said they had no room for her and told her to go to another hospital. So she did. The next hospital looked at her wrist and said it was broken. But they had no room for her either. “Go home and wrap it up,” they said. Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

Another B&B host, an elderly lady who had recently lost her husband and ran a very modest one-woman operation, told us she’d had a hip replacement. I asked how well the NHS had treated her. She responded that it had taken a while to get the procedure done, but only because she didn’t understand and had difficulty navigating the bureaucratic requirements. Once she mastered them she was put in the queue, awaited her turn, and was very happy with the results.

Of course, the other hot topic of conversation was Brexit. I wasn’t shy about soliciting opinions on that. Two issues determined the close vote: immigration and EU rules (trade, a third issue, was uncontentious: everyone favored trade). However, the first two are interpreted very differently along the political continuum.

Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

In the course of our five-week traverse of the island we encountered numerous resident immigrants from a very broad array of countries working in sales, labor, and the service sector. I made a point of listing the countries they hailed from: Italy, Romania, Poland, Venezuela, Eritrea, Somalia, India, France, Pakistan, Greece, Spain, Bangladesh, Hungary, Czech Republic, Ethiopia, Thailand, Russia, Germany, Argentina, China, Latvia, Bulgaria, Slovakia, Belgium, Brazil, Philippines, Ukraine, Ireland, and the USA. These were not tourists or ethnic waiters at ethnic restaurants.

Left-leaning reportage attributes the pro-Brexit, anti-immigration vote to “racism,” or “little Englanders,” the British version of chauvinist rednecks. Right-wingers claim that immigrants are taking over jobs. Neither of these glib explanations struck a chord with us or our informants. But all, regardless of whether they were “leavers” or “remainers,” expressed strong concern about Britain’s nearly limitless immigration. One Welsh AI entrepreneur — a remainer named Gareth — averred that with an unemployment rate of 4.1% there was no employment problem in the UK. He was so fixated on trade that he blithely dismissed any other concern as illusory.

As to racism, none of the immigrants we interviewed alluded to it; in fact, all expressed a great deal of mutual affection and respect among themselves, the Brits, their neighbors, and their employers (ours was a very limited random sample). And none of the Brits expressed any — even the slightest — unfavorable sentiment about foreigners. Only when riding through Muslim enclaves did we sense any, admittedly vague, tension. So what was going on?

One waitress complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat.

My sense is that the Brexit vote was a continuation of a British exceptionalism that goes back to 1066 — it’s been nearly a millennium since the last invasion. Compared to the continental countries, Britain has been uniquely stable, especially — being an island — as to its borders. In that sense, there is a nebulous perception of continental countries as entities akin to banana republics, with feuds, invasions, and shifting boundaries. To Brits, joining that club has always cost some degree of sovereignty. Margaret Thatcher personified that sentiment when she was unwilling to sacrifice the pound sterling, the world’s oldest, most stable currency (except under Callaghan and Wilson), to a committee of continental bureaucrats. Britain did not join the Euro currency, but it did join the European Union, a continuation of the aspiring free-trade policies of the earlier Common Market. The Brits want to trade but don’t want others to control them.

One Scots barmaid was in favor of leaving, but voted to remain for the sake of her children. She complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat. Gareth, our Welsh informant, thought this a red herring issue. But immigration rules are part of the broader EU rules: both require a ceding of sovereignty, and the Brits have had enough of ceding it.

Finally, there was a general concern that Britain was losing its identity — its culture, if you will — and becoming a nation of immigrants like the US. The August 11 issue of The Economist reports that “more than a third of London’s population was born abroad.”

Scotland the Heatwave

It was uncanny. As soon as we crossed the unmarked border into Scotland, the plaintive tones of a highland bagpipe filled the air. Around the corner we suddenly found ourselves in Gretna Green, once Britain’s answer to America’s Texas, where Scottish law allowed marriage between underage couples, but now a slightly pathetic tourist trap where couples with a romantic disposition to elopement still choose to tie the knot. Never mind, we were entranced and let the piper grab our souls, wrench our hearts, draw tears, and make us feel that we could transcend our limits. And, remarkably, accents turned on a penny from Yorkie to brogue.

As they say in Kentucky, “we were in pig heaven!”

On the first day of summer hordes of embarrassingly (to us, anyway) scantily clad Scots crowded along the shores of every loch, river, canal, and estuary, suntanning their albescent flesh. The unusually hot and dry weather, which had started earlier, was the cause of much comment. Tina, ever one to engage anyone in friendly conversation, asked a middle-aged lady if the unusual circumstances might be caused by global warming. The lady replied that if they were, “Bring it on!” In the 20 days we spent in Scotland it never rained. On June 29 at Pitlochry the temperature hit 89 degrees Fahrenheit, leading to a hot, muggy night with little sleep in a land where air conditioners and fans are a waste of money.

We looked forward every day to a pint or two of “real ale,” available in participating pubs everywhere but sadly lacking in Gretna Green — another disappointing aspect of the little town. I’m an avid fan of British Real Ale, a beer nearly unavailable anywhere else, and a primary reason for our trip. Real or cask ales (cask-conditioned beers) are unfiltered (they still retain yeast, though it drops to the bottom of the cask) and unpasteurized, conditioned by processes including secondary fermentation, and served from the cask without additional nitrogen or carbon dioxide pressure. They require pumping by hand to serve, and they give good head in spite of being lightly carbonated compared to bottled beers. There is nothing quite like them, whether brewed as bitters, stouts, porters, or even IPAs.

Breweries were small and local, mostly supplying only a handful of establishments — until recently. We visited one brewery in Pitlochry, the Moulin Traditional Ale Brewery, that brews only 120 liters per day of four different ales and supplies only one pub and one hotel. In the latter half of the last century corporate brewers began buying up pubs, pushing their beers and sidelining — or even eliminating — cask ales. Brits were not amused. In response, the Campaign for Real Ale was founded in 1971, and managed to convince the corporates not to eliminate cask ales. Some, such as Adnams, Greene King, and Marston’s, now even brew their own cask ales.

Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

While in Glasgow we managed to hit the Fifth Glasgow Real Ale Festival, offering over 150 different real ales from all over the realm. As they say in Kentucky, “we were in pig heaven!” We’d barely finished our first pint when the 18-piece Caledonian Brewery Edinburgh Pipe Band marched in playing “Scotland the Brave,” forcing us to freeze in place and raising the hairs on the nape of our necks. We imbibed 105 different real ales during our ride. Only space prevents me from listing them all and their creative names. As of 2014 there were 738 real ale brewers or pubs in the US. There might even be one near you.

In Killin we took a rest day and visited the Clan McNab burial grounds on Inchbuie Island in the River Dochart, along with the associated Iron Age fort and even earlier stone circle. Here in Prescott, Arizona, my hometown, David McNab books Celtic musicians who come on tour to the US. Married to a Scots lassie, he treasures his heritage. We’d be a culturally poorer town without his concerts.

As we passed Loch Tay, the Scottish Crannog Centre, an outdoor museum with a restored lake dwelling dating from about 800 BC, beckoned. The crannogs were built on stilts or artificial rock islands on the water. Villages, consisting of three crannogs, each with about 90 inhabitants, were common in Scotland and Ireland as early as 5,000 years ago and as late as the early 18th century. While Scotland has only 350–500 crannog villages, Ireland — on a much larger land mass — boasts about 1,200. Doubtless, both countries have many more crannog villages, underwater archaeology presenting considerably more obstacles (in survey and excavation) than terrestrial.

This odd dwelling pattern was first glibly explained as defensive in nature (most 19th century archaeologists being retired military men), but few weapons or other evidence of warfare have been found in association with the crannogs. The new explanation is that the dense vegetation of the Celtic countries meant that cleared land was reserved for agriculture, not mere habitation, while the riparian location facilitated extensive trade networks, evidence for which — including networks reaching all the way to mainland Europe — is abundant.

The Loch Tay Crannog Centre, near Kenmore in Perth and Kinross, isn’t just one reconstructed crannog with three dugouts. The staff has recreated the entire lifestyle of the inhabitants: foot-operated lathes; grain-grinding stones; wool spinning, dyeing, and weaving; and fire-starting by “rubbing two sticks together,” a practice often mentioned but seldom seen. It means using a fire drill. With the proper knowledge, preparation, and materials, all things are possible. The demonstrator (even his shoes and clothing were authentic) started a fire in less than a minute.

The Braw Hielands

Somewhere beyond the Crannog Centre we crossed into the political subdivision known as the Highlands and Islands of Scotland. Trees and settlements became scarcer, midges and cleggs more numerous. Heather (purple), gorse (yellow), and bracken (green) gilded the landscape. Long-haired Highland cattle and Scottish Blackface, Wensleydale, Cheviot, and Shetland sheep predominated. It is here — not in Gretna Green — that the romance of Scotland kicks in: Rabbie Burns; Bonnie Prince Charlie; Nessie; Capercaillie and Old Blind Dogs; kilts, sporrans, and claymores; haggis; the Outlander miniseries; and even Mel Gibson berserking over the moors as William Wallace come to mind.

However, my own mind gravitated to those two giants of the Scottish Enlightenment, David Hume and Adam Smith. I’d not run across any memorials, statues, or even streets named for either in their homeland. That’s more understandable for Hume, whose somewhat counterintuitive, esoteric — albeit undogmatic — thinking isn’t readily accessible. But Adam Smith, the father of economics, the Charles Darwin (or Albert Einstein) of the dismal science, is a household name. His insights are readily accessible and intuitive.

In three separate trips to Scotland, I have been struck by the lack of Adam Smith memorials.

Smith and Hume were drinking buddies (which is saying a lot in 18th century Scotland, where getting plastered to oblivion was a national pastime). One bit of Hume’s thought that was accessible — though still counterintuitive — is encapsulated in an exchange he had with Smith. The United States had failed to agree on an official religion for the new country: a first for its time. Smith, a man of indeterminate religious beliefs, bemoaned the fact, opining that the lack of an official faith would doom the country to irreligiosity. Hume, an agnostic, disagreed. He predicted that countries without official faiths would experience a flowering of religions, while the official religions of countries that had them would wither into irrelevance. Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

The anecdotal Hume was right. America soon experienced the Second Great Awakening, the birth of a multiplicity of religious sects in the 1800s. Today, according to The Guardian (September 4, 2017), more than half the UK population has no religion, while nearly 76% of Americans identify as religious.

In three separate trips to Scotland (one where I walked across the country) I was struck by the lack of Adam Smith memorials. One informant said the Scots had little affection for Smith. Public opinion inside Scotland holds Adam Smith, the father of capitalism, responsible for the Highland Clearances. And public opinion outside Scotland perceives the Scots as socialist. It’s not so simple.

In the 2017 UK elections, the Conservative Party received 28.6% of the Scottish vote, overtaking the Labour Party, the real far-left socialists, who received 27.1%, as the main opposition to the majority Scottish National Party, which got 36.9%. The Scots are nationalistic, thrifty, good businessmen who hate taxes — traits not often associated with socialism (though they abhor class and status pretensions).

But back to Smith and the Highland Clearances. Smith was a strong advocate of both private property and efficiency in production. When The Wealth of Nations came out, Scottish clan chiefs decided to reinterpret their position as not just clan heads but, on their reading of Smith’s concept of private property, also fee simple owners of clan lands. They became lairds, owners of the clan lands, instead of feudal lords. As feudal lords they’d had a complex set of rights and duties with their crofters. However, as lairds, they suddenly became absolute owners of what was now their private property. Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

The crofters were subsistence farmers. Their part of the deal was to give a percentage of their harvest to the clan chief in return for protection, leadership, dispute resolution, and so on. Advances in agronomy and a booming market for wool indicated to the new self-declared lairds that sheep grazing would enrich them much more than a few bushels of oats. Most chose sheep over oats and evicted the crofters, hence the Clearances. (This is a simplified version.) Not all lairds ignored the crofters’ feudal rights. Lairds’ responses ran the gamut from keeping the crofters as tenant farmers, to buying them out, to cruel dispossession and eviction. There was no uniform formula; the greediest landlords made the headlines. Adam Smith got the blame. Finally, however, in 2008, an elegant ten-foot bronze statue of Adam Smith, mounted on a massive stone plinth and financed by private donations, was unveiled on Edinburgh’s Royal Mile, within sight of a rare statue of his friend David Hume.

Outside Inverness, capital of the Highlands, the Culloden battlefield, site of the last battle (1746) fought on British soil, cast its spell. Supporters of the Stuart (Jacobite) dynasty fought the army of George II and the by-then established Hanoverian dynasty. The German Hanoverians had been installed as monarchs of the United Kingdom after Parliament tired of both Stuarts and civil wars. A common misconception holds that Jacobitism was a Scottish cause because the Stuarts, before being invited to rule over England, had been kings of Scotland, and most of the Jacobites were Scots. Again, not so simple.

Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

Monarchy has its own rules of succession. Under those rules, Charles Stuart (Bonnie Prince Charlie) ought to have become king of the United Kingdom. The problem was that the Stuarts were Catholics, and a Catholic, according to the Act of Settlement passed by Parliament in 1701 — the expedient to finally dump the Stuarts — could not rule over a Church of England realm, much less head that church. Adherents to the monarchy’s rules of succession did not accept Parliament’s power to overturn those rules, hence the Jacobite uprising. Scots, English, and Irish participated. The presumptive heir to the Jacobite crown today is Franz Bonaventura Adalbert Maria von Wittelsbach, who, if he were on the throne, would be known as King Francis II.

We took a rest day in Inverness, got a dose of fire-and-brimstone Scottish Calvinism, and attended a couple of ceilidhs — once both at the same time. A determined preacher in white shirt and tie stood on the Crown Road, Inverness’s high street, reading the Bible in thunderous and orotund sonority to the passersby while fiercely gesticulating with his free hand. We were entranced. Particularly when a young fellow in a t-shirt and a newsboy cap took a stance across the street, pulled a bagpipe out of its case, laid out the case to collect donations, and struck up “MacPherson’s Lament.” He completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions. As to the other ceilidhs, one particular impromptu session at a pub included two fiddles, a guitar, uilleann pipes, and a saxophone — the last two instruments a particularly innovative and sonorous combination.

North of Inverness nearly all the trees disappeared, as did fences, buildings, and power poles; even the livestock thinned. It was a magical, surreal landscape with the odd abandoned stone sheep enclosure. At Tongue, the North Sea finally came into view. When the Orkneys appeared on the horizon, our hearts skipped a beat: we knew we were nearly done. Stroma, the nearest of the islands, presented a spectral appearance. It had been abandoned in 1962. A scattering of old stone cottages, unconnected by roads, eerily dotted its landscape. Soon John O’Groats, little more than an inn and tourist shops, materialized out of the grassy plain. We’d covered 1,291 miles — according to our bike odometers — in 29 days, with an additional eight rest days.

The piper completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions.

After a shuttle to Inverness and an overnight ride on the Caledonian Sleeper we arrived at Euston Station, London. During the rides — both of them — we reflected on Britain’s virtues. It’s a country with no earthquakes, volcanoes, hurricanes, tornadoes, forest fires, or mudslides; an ideal climate with no extremes of heat or cold, aridity or rain; a varied and undulating topography of grasslands, moorland, woodland, glades, estuaries, highlands, and lowlands; hamlets, villages, towns, and cities with a minimum of sprawl; little crime and few slums or homeless; a cultured people with a generally sensible disposition (and oodles of book stores); and a separation of head of state from head of government. Finally, it’s always been Great, and, best of all, it has unsurpassed beer and whisky. What more can you ask for? Lower taxes?






Los Pollos Coming Home to Roost


When President Trump — El Jefe, as he is known to his precious few Mexican devotees — started his jihad against the NAFTA treaty two years ago, some of us predicted trouble. NAFTA — conceived by President Reagan, negotiated by President Bush the Elder, and signed by President Clinton, taking effect in 1994 — never was a “bad deal” for the US. It dramatically increased North American trade, and while we ran trade deficits with the other North American countries, they were small in comparison to our major deficits (with Germany, Japan, and China, with none of which we ever bothered to negotiate free trade agreements), and were matched by counterflows of investment.

But the protectionist populist Trump believed his own propaganda that free trade “costs” Americans their jobs. He still maintains this, as our unemployment rate approaches a minuscule 3%. And his method of negotiation was as crude as it was thuggish. He repeatedly attacked Canada and Mexico, both their leaders and — in the case of Mexico — their citizens.

There were two results.

First, he was able to get a new deal. But it is worse than the original, at least from the view of the classical liberal. In exchange for a few tariff reductions, Trump’s new NAFTA imposes regulations requiring Mexico to pay its autoworkers more — so they won’t be so competitive against US autoworkers. But while that satisfies Trump’s union supporters, it screws the rest of us, who will now have to pay more for cars.

NAFTA was never a “bad deal” for the United States.

In other words, the man who claims we need fewer regulations just jacked them up. Yes, the Boss is the Master of the Deal — the Raw Deal.

But the other effect of the Boss’s infantile bullying is to have driven anti-American sentiment through the roof in both Canada and Mexico. This sentiment helped elect the extreme leftist Andres Manuel Lopez Obrador — aka “AMLO” — to the presidency of Mexico. AMLO took office on December 1, and a few recent reports in the Wall Street Journal indicate that the chickens — los pollos! — are coming home to roost with their afterburners on.

One report is a sketch by the Journal’s Latin America expert — the estimable Mary Anastasia O’Grady — on just what this inaugural is inaugurating. For one thing, AMLO had as special guests Venezuela’s Marxist dictator Nicolas Maduro and Bolivia’s caudillo Evo Morales, both Fidel Castro wannabes.

For another thing, AMLO is reverting to his demagogic character (he was known for his fierce leftist philippics and for summoning forth his myrmidons to march in the streets). And he has already shown an inclination to disregard existing contracts and rule by diktat. For example, he opposes the new Mexico City International Airport, $6 billion in bonds for the construction of which have already been sold. He prefers to expand the existing location, which just by chance is located in the district where he has traditionally held power, thus funneling the funding to his supporters.

The new deal is worse than the original, at least from the view of the classical liberal.

As a consequence, bond investors appear to be getting ready to sue in case of default. Small wonder that three of the five worst-performing quasi-sovereign bonds in the world this quarter are Mexican. These include bonds for the airport, the Mexican Federal Electricity Commission, and Pemex, the Mexican national oil company. The drop in Pemex bonds especially indicates a fear among investors that AMLO will undercut or even repeal the game-changing 2013 revision of the Mexican constitution that allows outside — read “gringo” — investment. AMLO’s energy advisors are all long-standing opponents of the reform; they are all Mexico-first protectionists. In other words, they are all Trumps in Pancho Villa costumes.

Add to this two other AMLO policies, and investors have every right to revolt. First, he wants to spend money freely to “create” jobs — rather like Trump’s own advocacy of infrastructure spending. AMLO would start by spending nearly $20 billion on a new refinery (in his home state, of course), a new thousand-mile train in the Yucatan, and a jobs bill for the youth. Like Trump, AMLO has no fear of deficits.

Second, AMLO proposes to fight the high crime rate in his country by creating a new “National Guard” combining regular army soldiers, marines, and federal police to fight the cartels. Of course, this standing AMLO army would be a perfect SS, should he decide to go into full Hugo Chavez mode.

Lopez-Obrador's energy advisors are all long-standing opponents of free-market reform; they are all Mexico-first protectionists.

Just the thought of a toto-AMLO government has sent the Mexican IPC stock index and the peso itself down into the tank with the bonds.

The death of Bush the Elder came at an ironic time. Bush 41 was a masterful conciliator and international diplomat. He maintained our alliances while overseeing the peaceful dissolution of the Soviet Union. By contrast, Trump the Infantile has managed to alienate our former allies, and so antagonize the Mexican people that they elected a populist leftist radical.

The result is shaping up with astounding rapidity. Mexico has elected an extremist leftist, who will likely turn Mexico into a veritable Venezuela. The result of that will be a wave of Mexican immigration that will dwarf any prior waves. Consider this. So far, three million Venezuelans have fled their country to avoid the economic disaster. Mexico has four times the population of Venezuela, so if a like number flee Mexico we can expect about 12 million more immigrants. The irony is breathtaking: Trump’s policies creating a massive wave of his least favorite people moving in.

A further irony inheres in Trump’s attempt to protect American autoworkers. GM has just announced that it will be getting rid of 15% of its salaried workforce in North America, and will be shuttering five plants. The total loss will be almost 15,000 jobs. GM will focus on its more profitable cars, especially pickup trucks and SUVs.

Mexico has four times the population of Venezuela, so if a like number flee Mexico we can expect about 12 million more immigrants.

The news cheered investors but enraged Trump, who immediately blasted the company and threatened to remove GM’s continuing subsidies — which immediately lowered the value of GM’s shares by 2.6%. The anger was shared by other politicians, who remember the nearly $50 billion the taxpayers gave the company to rescue it from bankruptcy less than a decade ago — not to mention the continuing tax credit of $7,500 for each of its electric vehicles sold. The bad publicity resulted in GM’s announcing, a few days later, that it would be adding about 2,700 jobs at some plants in other states, and that some laid-off employees could apply for those jobs. But it still means a major drop in high-paying jobs.

I am not merely saying that Trump’s protectionism didn’t help GM enough to stop its layoffs, though that’s bad enough. I am saying that his actions are going to make it harder to prevent future layoffs. To avoid them, GM would have to sell more cars, but there is a limit to how many expensive SUVs and pickups it can sell. So it would need to increase sales of lower-end cars. But given the high US labor costs, this would require moving more of the supply chain to lower-cost venues, such as Mexico. By blocking that, therefore, Trump won’t protect highly paid American workers making low-end cars (using components manufactured in low-cost Mexico); he will force the automakers simply to stop making low-end cars, thus eliminating jobs.

Clearly, we cannot say whether the Trump renegotiation of NAFTA played any role in GM’s recent decision. After all, we cannot read minds. But just as clearly, the USMCA will make it hard for American automakers to save on labor costs.

I am not merely saying that Trump’s protectionism didn’t help GM enough to stop its layoffs, though that’s bad enough. I am saying that his actions are going to make it harder to prevent future layoffs.

So there you have Trump’s magisterial strongman running of the economy. He bullies two of our closest allies to get only trivial benefits in tariff concessions, but at the cost of helping elect an anti-American president of our southern neighbor. The main thing he got was a regulation placed on plants in another country to pay inflated wages, with the intention of protecting highly paid American union jobs. But the result isn’t likely to be a gain in American jobs; it’s likely to be a loss, as the heavily regulated US automakers struggle to make a reliable profit. The only probable gain will be a massive new wave of immigration, as lower-wage Mexicans get priced out of their jobs.

Why anyone would suppose that Trump or anyone else can run American industry by divine fiat is beyond me. But people crave the Strong Man who will guide the economy like a god. And that, dear readers, is in my view goddamned stupid.






Not to Praise, But to Bury


As another elder statesman dies and the nation is caught in the grip of another bout of panegyrics, it’s worth stepping back to concentrate on the individual lives such men touched during their time in the halls of power. For George Herbert Walker Bush, specifically, that means considering also the plight of Keith Jackson.

In 1989, Jackson was a high school senior in Anacostia, southeast DC, living in one of the worst zip codes in the country. Like many of his peers, Jackson was a low-level drug dealer, one of the smallest cogs in a larger machine, like the Baltimore towers in The Wire. Crucially, he had reached his 18th birthday when the federal government started setting him up for a presidential publicity stunt.

See, George Bush, seemingly desperate to prove he was man enough to live up to his predecessor, wanted a set piece to kick off his own extension of Reagan’s War on Drugs. So his staff came up with the idea of busting someone for selling crack cocaine—still the drug warrior’s enemy of choice—in the shadow of the White House.

Bush demanded more cops to arrest drug dealers, more prosecutors to seek harsher penalties for them, and more prisons to hold all the extra convicts.

DEA agents offered up Jackson as a patsy. He’d been on their radar for months—so if selling drugs in and of itself was really such a big deal, they could have grabbed him at any point (and then he’d be replaced by another young slinger with no other prospects, and then another, ad infinitum). No, he was only worth it if he could be sacrificed for a higher purpose, like making a weedy, “wimpy” Massachusetts desk-occupier look like a tough guy. That purpose in hand, the undercover DEA agent on Jackson’s case asked him to meet at Lafayette Park, promising an extra premium to lure Jackson to Northwest DC, where black residents of the city almost never went. (As a measure of how stratified and segregated DC society was at the time — not to mention how complete the failure of the educational system — when the undercover DEA agent asked Jackson to meet him in the park across from the White House, Jackson didn’t know where that was until piecing together that it was “where Reagan lives,” and he was hesitant to make the trip because one thing he did know was how much greater the police presence would be in Official DC.)

The purchase took place on September 1, and on September 5 Bush was holding up a plastic baggie of crack cocaine during a White House address, noting that it had been “seized” (not bought) just across the street. He demanded more cops to arrest drug dealers, more prosecutors to seek harsher penalties for them, and more prisons to hold all the extra convicts. He got all of those things, often in connection with mandatory minimum laws that eliminated judicial discretion in sentencing (and which perpetuated a nonsensical divide in sentencing between powdered and crack cocaine, the burden of which fell almost entirely on the black community).

If George Bush ever cared about those whose lives didn’t intersect with his, he certainly never showed it.

Keith Jackson was one of those who fell prey to a mandatory minimum. The DEA arrested him, not at the sale for whatever reason, but immediately after Bush’s speech. After his first two trials ended in hung juries, a third trial saw him convicted and sentenced to a legally mandated decade in prison without parole. The judge in the case, uncomfortable with the mode of Jackson’s entrapment, urged him to ask the president for a commutation. But Bush had almost immediately washed his hands of the matter: facing criticism from a variety of sources, including even those who had a stake in the Drug War’s continuance (like the head of the city’s police union), Bush said, “I cannot feel sorry for [Jackson]. I’m sorry, they ought not to be peddling these insidious drugs that ruin the children of this country.” And so, for the crime of selling 2.4 grams of crack cocaine to another consenting adult in a place where there had been no recorded drug busts in the past, Keith Jackson served almost eight years in prison.

What happened to him after that point is not known. One doubts that Bush ever dwelt on Jackson or any other of the thousands affected by yet another surge in the War on Drugs—young men and occasionally women losing their futures to ruthless sentencing guidelines and the economic incentives of incarceration, or often just their lives to police enforcement or to the criminal turf wars that invariably follow the artificial limiting of a highly in-demand substance. Add in the families and communities that depended on this suddenly absent and incarcerated generation, and it’s hundreds of thousands if not millions.

But if Bush ever cared about those whose lives didn’t intersect with his, he certainly never showed it, as the Iraqi people had ample opportunity to learn. In the rush to war with one-time American ally (indeed, almost appointee) Saddam Hussein over the invasion of Kuwait, Bush infamously allowed himself to be swayed by the testimony of a supposed refugee of the conflict, known only as Nayirah, who spoke of Iraqi soldiers raiding Kuwaiti hospitals, pulling prematurely born infants out of incubators and tossing them aside to die. By the time it was discovered that Nayirah was actually the daughter of the Kuwaiti ambassador to the U.S., and the entire thing had been organized by an American PR firm in the employ of the Kuwaiti government, the war was already over — though its repercussions will persist long after our lifetimes.

Between his year directing the CIA and his time as vice president, he was involved in some of the most notorious operations run through the US government: Operation Condor, the School of the Americas, the Iran-Contra affair.

An estimated 100,000 Iraqi soldiers and an unknown number of civilians were killed in that first Gulf War, with the particular highlight of the Highway of Death, in which American forces blockaded and massacred retreating Iraqi forces, as well as any civilians unfortunate enough to be within cluster bomb range. Content with this level of slaughter, Bush called off hostilities the next day—a point in his favor, perhaps, when compared to those overseeing the unceasing carnage of today’s forever wars. But Bush hardly had clean hands before this, having already orchestrated an illegal invasion of Panama. Between his year directing the CIA and his time as vice president, he was involved in some of the most notorious operations run through the US government: Operation Condor, the School of the Americas, the Iran-Contra affair; it will be decades though, if ever, before we learn just how deeply he was implicated.

There’s much else to dislike about the elder Bush and the legacy he is leaving behind, in particular his enablement of many awful people. You can draw a direct line from his campaign manager Lee Atwater and his infamous Willie Horton ad to the race-baiting scare tactics used by Donald Trump. A look at Bush’s administrative appointees reveals many of the big names — Dick Cheney, Paul Wolfowitz, Donald Rumsfeld — who would go on to botch the Iraq and Afghanistan conflicts, all the while pushing for ever more wars on ever more fronts. (Which is not even to mention his son, who, in signing off on Gulf War Redux, committed what is thus far the greatest geopolitical blunder of the century.) You could talk also about his surrender to the tax-and-spenders on budget issues, or to the Religious Right on gay rights. You could also give him credit where it’s due: for handling the end of the Cold War with flexibility and grace, for committing himself to promoting volunteerism and community service, and for not following in the footsteps of his father, Prescott Bush, and signing on to any half-baked fascist coups against the US government.

All this, at least the good stuff, or the bad stuff that various media figures want to recast as good, will be gone over ad infinitum. But when you see the footage of his funerals, when you take in the official outpouring of grief that is increasingly mandatory on such occasions, when above all you hear anyone talking about how George H.W. Bush advocated for a “kinder, gentler conservatism,” spare a thought for Keith Jackson. It’s more than Bush ever did.





The Great Anti-Climax


I do not believe in the saying that “all politics is local.” If that’s true, why are we always getting into wars in other countries? But during this election cycle I was very interested in California, which is my own locale.

As predicted, California elected as its next governor one Gavin Newsom, a wealthy former mayor of San Francisco and currently lieutenant governor of the state, who is about as smart as the average doorknob. This was a year in which handsome men were thought to have an enormous advantage; they seemed to remind people of John F. Kennedy, who, when you think about it, was handsome only when compared with Dwight D. Eisenhower. Newsom is handsome-for-a-politician, but that’s not why he won. He won because the Republican Party in this state dissolved about a decade ago, giving place to a fairly well-oiled Democracy run by the state employees’ unions. The surprise is that Newsom’s opponent, a small-government tax hawk named John Cox, received 41% of the vote despite having been considered only a remote possibility to win. Cox was an excellent campaigner and got his votes by himself, with little help from a rumored “Republican Party.”

The same amount of help was rendered by that party to the biggest ballot initiative, Prop 6, which would have rolled back a large tax increase imposed in 2017 by the Democratic legislature, supposedly to “fix the roads.” When people list the core items that they expect to see in any state budget, roads usually rank first or second. But not in California. The money that should go for roads — even money granted by the voters in previous ballot propositions — goes instead for bike lanes, parks, and other “environmental” matters, and for astronomical employee salaries. (I don’t mean that the employees are astronomers; if they were, they might actually do some work. California is a place where people often have to wait seven hours to do their business at the DMV.) Before the latest tax increase, California already had the highest gas taxes in the nation; now they are higher still. The new gas tax is one of the most regressive imaginable. It means that breadwinners have to pay the government about $400 a year extra or not be allowed to drive to work.

This was a year in which handsome men were thought to have an enormous advantage.

Prop 6 was designed to end this tax and not let it happen again. It was the brainchild, not of the Republican Party, but of a gay, hyper-energetic San Diego talk show host, Carl DeMaio, who is very good at pushing a cause. Pre-election surveys indicated, predictably, that two-thirds of voters were in favor of a proposition rolling back the gas tax. But Prop 6 went down, 45 to 55. Why? Because the Democratic secretary of state entitled and summarized it as an attack on road repair:

ELIMINATES CERTAIN ROAD REPAIR AND TRANSPORTATION FUNDING. REQUIRES CERTAIN FUEL TAXES AND VEHICLE FEES BE APPROVED BY THE ELECTORATE. INITIATIVE CONSTITUTIONAL AMENDMENT.

SUMMARY

Repeals a 2017 transportation law's taxes and fees designated for road repairs and public transportation. Fiscal Impact: Reduced ongoing revenues of $5.1 billion from state fuel and vehicle taxes that mainly would have paid for highway and road maintenance and repairs, as well as transit programs.

Note that telltale “as well as transit programs,” which clearly indicated, to anyone who read that far, that the money, as usual, would be spent on other things than fixin’ the roads. California voters didn’t read that far.

Yet while naïve voters were killing Prop 6, they were also killing Prop 10, which would have permitted and encouraged more than 500 local governments to impose rent control on the helpless population. They voted this one down by 62 to 38.

Large majorities on each side. Why? How? I don’t know. You tell me.

Note that telltale “as well as transit programs,” which clearly indicated, to anyone who read that far, that the money, as usual, would be spent on other things than fixin’ the roads.

Turning now to the nation at large: we’ll see whether there was a blue wave or a red wave when we see some kind of sophisticated, non-axe-grinding study of voters. We may wait a long time for that. In the meantime, we can say that if there was a blue wave, there was a red wave to meet it.

But remember: most congressional races in this country were decided on the yaller-dog principle: “Some people will vote for a yaller dog as long as he’s on the Democratic [or Republican] ticket.” That’s how New Jersey Democratic Senator Robert (“Bob”) Menendez got reelected, 53 to 42, despite his public repute as a crook and not a smart or likable one, either. And that’s how California House District 50 (eastern San Diego County) got decided. The Republican incumbent, Duncan Hunter, to whom nobody ever gave much credit for brains, is under federal indictment for using about $250,000 of campaign money for vacations, eating and drinking, “personal relationships,” and other fun, though basically penny-ante, stuff. His Democratic opponent was Ammar Campa-Najjar, age 29, another one of this year’s handsome young men. Until Hunter’s indictment, Campa-Najjar, a former Obama organizer and scion of a family of Palestinian enragés, was a purely sacrificial candidate for the Republican 50th. Hunter’s indictment united almost everyone in the county, Republican and Democrat, in scorning and deriding Hunter; it dried up his campaign money and unleashed a deluge of funds for Campa-Najjar, who is said to have spent ten times more money than Hunter. But it was all for nothing. Hunter’s district was safe Republican, and remained such. He was reelected 54 to 46.

Nationwide, a lot of electoral activity consisted simply of voters returning to their natural allegiance. Missouri, North Dakota, Indiana, Tennessee — these are Republican states, and it was strange that they should have Democratic senators to begin with, or (in the case of Tennessee) that they should consider having one now. In other states, where there were real contests, the vote could usually have gone either way; the outcome therefore didn’t mean much on the philosophical plane. I’m thinking of the Florida Senate and governor race, the Wisconsin governor race, the Arizona and Nevada Senate races, and even the Montana Senate race. I’m not thinking of the Texas race, where the Republican governor won overwhelmingly, while the Republican senator, Ted Cruz, won merely respectably. Cruz, who was up against another “handsome,” “Kennedyesque,” but also overbearing “young” man, is virtually the only politician in the country who is less likable than Hillary Clinton. His Democratic foe had so much out-of-state money that he couldn’t think of ways to spend it all. But Cruz won — because Texas is Texas and Robert (“Beto”) O’Rourke is not.

Most congressional races in this country were decided on the yaller-dog principle: “Some people will vote for a yaller dog as long as he’s on the Democratic [or Republican] ticket.”

In Massachusetts, voters went overwhelmingly for a politician even less likable than Cruz, Elizabeth Warren; they also went overwhelmingly for the Republican gubernatorial incumbent. Maryland also voted Democrat for almost everything except its governor. The expression “the bland leading the bland” may apply; remember that Mitt Romney is a former governor of the People’s Republic of Massachusetts as well as a former Republican nominee for president. Romney was born in Michigan, has lived mainly in California, was governor of Massachusetts, and has now been elected a senator from Utah — a remarkable career of disaffiliation. Anywhere he hangs his hat is home, for now.

I don’t know enough about the folkways of Massachusetts and Maryland to guess why they elect conservatives to the statehouse and liberals to other offices; maybe the conservatives and the liberals are both members of the Faux Party, and the electorate loves and cherishes them for that reason. I do know that there isn’t any basis for another piece of folk wisdom, just now being uttered ad nauseam — the idea that the American people split their tickets between parties because they want balanced and limited government. Chris Stirewalt, a person who masquerades for Fox News as a political analyst, said on election night that there is “a preference among Americans for divided government.” Stirewalt instanced the coming Democratic House and Republican Senate.

This is so fatuous, it's hard to find words for it. The completely safe districts that elect 80% of Congress are not populated by people who vote for a Democratic congressman and a Republican senator in order to preserve balanced government. When it comes to Congress — and usually every other office — they vote a straight party line. We have divided government only because other people vote an opposing straight party line. There are exceptions, as in the Republicans elected to the governorships of Massachusetts and Maryland, but they are just that — exceptions. Californians did not vote for big government when they turned down Prop 6 and then vote for little government when they rejected Prop 10 because they wanted to balance big and little government. They did it because they didn't, wouldn't, couldn't read beyond the title of Prop 6 but for some unknown reason sensed that Prop 10 was a danger. We don't have sheep and wolves because someone decides that sheep and wolves need to balance each other; we have sheep and wolves because sheep engender sheep and wolves engender wolves.

Mitt Romney was born in Michigan, has lived mainly in California, was governor of Massachusetts, and has now been elected a senator from Utah — a remarkable career of disaffiliation.

It seems that Trump did marginally better than most other presidents at limiting his midterm losses in Congress; he lost fewer House seats than the average, and he picked up at least one valuable Senate seat. But we can't assume that "he" was the crucial factor. He had an effect, surely; he "energized" many voters for and against him. It's my bet that the energized Democrats were going to show up and vote anyway, but many of the energized Republicans would have stayed home, had not Trump inspired them. Yet in some cases, "he" probably "won" races despite himself. Ron DeSantis, the Florida gubernatorial candidate whom Trump endorsed, probably had a harder time in the general election than his primary opponent would have had. DeSantis seems to have won the general election by only four-tenths of 1%. I doubt, however, that the (failed) Republican senatorial candidate in Montana would have gotten within three percentage points of his incumbent rival without Trump's efforts.

But speaking of the Montana election, it came within perhaps 1,000 votes of being swung by the finally unwilling candidacy of a big-L Libertarian, Rick Breckenridge, who got 2.8% of the vote despite having dropped out, late in the game, in favor of the Republican. The LP guy had been polling at about 4%, but by the time he left, many votes had already been cast. No one knows for sure, but I assume that LP votes in Montana come mainly out of the Republicans. Some Democrats in Montana assumed that too; they sent out mailers urging "true conservatives" to vote for Breckenridge instead of the Republican — tactics that led Breckenridge to endorse the Republican.

Contrary to constant press reports about the remarkable popularity of the Democratic incumbent, Jon Tester — "a rural Democrat who still connects with the people," etc. — the Libertarian Party appears to have been responsible for electing him in both 2006 and 2012. In those years Tester's margin of victory over his Republican opponent was 0.87% and 3.72%, respectively, while the LP candidate drew 2.6% and 6.56% of the vote, in each case more than the margin. It is painful to ask this question, but is it the LP's job to elect members of other parties?

When it comes to Congress — and usually every other office — most people vote a straight party line. We have divided government only because other people vote an opposing straight party line.

Donald Trump may be enjoying the prospect of the next two years. In the Senate, he has achieved a significantly more Trumpian majority — no more Flake, no more McCain (although Mitt Romney will be glad to obstruct any non-RINO programs). In the House, he has gone from a slim Republican majority, out of which he got nothing except the tax cut, to a slimmer Democratic majority. God’s gift to him is the Democrats’ custom of automatically awarding committee chairmanships by seniority, which means that most of the key positions will go to elderly and loquacious men and women elected from extremely safe districts — a recipe for disaster if the election of 2020 is nationalized, which it surely will be. Maxine Waters does not play well on the national stage. She plays a little bit better than Nancy Pelosi.

But certain it is that this election was God's gift to people who write about politics and enjoy laughing at politicians. The cast is irresistible . . . Trump . . . Waters . . . Pelosi . . . Schiff . . . Nadler . . . Warren . . . To paraphrase yet another old saw, "politics is a tragedy for those who think and a comedy for those who feel — that a lot of good jokes are coming."




