Vietnam Revisited


I never fought in Vietnam. By the time I was old enough to go, I held a high draft-lottery number and a student deferment, and was never called up. I do remember the war, though. Early on, when Kennedy sent in the advisers, I was in elementary school, and saw the pictures on TV and in Life magazine. When Johnson sent in half a million men, I was in junior high, and we argued about the war in class. When Nixon came to power I was in high school, and we debated it more. When the four protesters were killed at Kent State University, I was finishing my first year at the University of Washington in Seattle. My instructor in German cancelled classes and gave us all A’s so we could go protest. I stood aside, watching the protesters flood onto Interstate 5 and block traffic until the cops pushed them off the exit to what are now the offices of Amazon.

My sentiments on the Vietnam War, like those of most Americans, evolved. In 1975, when South Vietnam collapsed, its government appealing for help and the US Congress and President Ford offering none, I was as coldhearted as anyone. I thought, “To hell with Vietnam.” I had been reading about it, thinking about it, arguing about it since I was a kid. During that time 58,000 Americans, of whom nearly 18,000 were draftees, had been killed there, along with maybe a million Vietnamese, and for what? In economists’ terms, the mountain of corpses was a “sunk cost” — and I was ready to watch the whole damn thing sink into the South China Sea.

I was living in Berkeley, California, when the South fell. I remember standing outside my apartment on May 1, 1975, taking photographs of a parade down Telegraph Avenue welcoming the Communist victory. “All Indochina Must Go Communist,” one banner said. Well, I hadn’t evolved that much. For me the fall of South Vietnam was a day for quiet sadness.


As a kid in junior high, I had supported the war. Recall the geopolitical situation: Communists had eaten up a third of the world, with big bites in Eastern Europe in 1945–48, China in 1949, North Vietnam in 1954, and Cuba in 1959. They had been stopped in a few places — in Malaya, by the Brits — but once firmly established they had never been pushed back. The Cold War’s rules of engagement were that the Communists could contest our ground — what we called the Free World — but we dared not contest theirs. And the end of that road did not look good.

When I used that argument — and “domino theory” is not a good name for it — no one knew the Communist system was facing extinction. People knew it was a poor system for satisfying private wants, but as a foundation for political power, it did seem to pass the Darwin test: it had survived and spread.

All the old arguments came back as I was reading Max Hastings’ new book, Vietnam: An Epic Tragedy, 1945–1975. Hastings, an Englishman, is my favorite military historian; for years I have had his 1987 book, The Korean War, on my shelf, and I breezed through his 752-page Vietnam in a few days. In this book Hastings has undertaken to write the narrative of the war not only from the American side but also in the voices of South and North Vietnam. He reveals that there were arguments and worries on their side as well as ours. Many in the North hated the draft and did not want to trek down the Ho Chi Minh Trail to fight. Over the years, 200,000 Northerners deserted while in the South. The Northern soldiers also endured far more privations than the Americans or their Southern allies, living on rice and water spinach (sold in Asian markets here as on choy) and often starving. On one occasion, Hastings says, they killed and ate an orangutan.


Hastings analyzes the assumptions and the strategies of both sides. To the low-level Vietcong, the war was mostly about getting rid of Americans who looked and acted like the “long-nose” French, Vietnam’s late imperial overlords. The cadres tried to indoctrinate the VC in Marxism, but identity politics had the stronger pull.

Strength of belief and feeling makes a difference in war, and in Vietnam the advantage went to the other side. For a military historian, Hastings makes a key admission when he says that fighting was less important than “the social and cultural contest between Hanoi and Saigon.”

In that contest, the North’s standard-bearer was “Uncle Ho,” the Gandhi-like figure of Ho Chi Minh, who had kicked out the imperialist French. In the South, a society that included landowners, merchants, and bureaucrats who had worked for the French and prayed in the same church as the French, one of the icons was Vice President Nguyen Cao Ky. One observer said that Ky, an air force pilot with slick black hair and a pencil-thin moustache, looked like a saxophone player in a cheap Manila nightclub. Writes Hastings of Ky, “He was publicly affable, fluent, enthusiastic about all things American but the taste of Coca-Cola — and as remote as a Martian from the Vietnamese people.”


South Vietnam was a society rotten with corruption and ill-gotten wealth. “Again and again,” writes Hastings, “peasants were heard to say that whatever else was wrong with the communists, they were not getting rich.” History shows, though, that life is easier in a society in which some are wrongly rich than in one in which the rich are rounded up and shot, leaving everyone else poor. Hastings writes that when the North Vietnamese army rolled into Saigon, the soldiers were amazed at how much stuff the people had.

The Vietcong were terrorists. They beheaded the village chieftains who opposed them, and sometimes buried them alive. The Americans were told to behave better than that, but with their B-52s, high explosives, and napalm they dispensed death wholesale. American soldiers, Hastings writes, went to war “wearing sunglasses, helmets, and body armor to give them the appearance of robots empowered to kill.” Back at base, “Army enlisted men took it for granted that Vietnamese would clean their boots and police their huts.” And also use the bar girls for sexual entertainment.

Hundreds of thousands of South Vietnamese still fought and died for their state, and also worked with the Americans. First-generation Vietnamese in my home state are fiercely loyal to the old Republic of Vietnam, and still fly the yellow flag with the three stripes. Apparently they were not a majority of their countrymen, else the conflict would have come out differently.


As the Pentagon Papers showed, smart people in the US government saw early on that South Vietnam was ultimately not a viable cause. President Kennedy expressed his doubts, but he also believed deeply that his mission was to stop the Communists. “Nothing that came later was inevitable,” Hastings writes, “but everything derived from the fact that sixteen thousand men were in country because John F. Kennedy had put them there.”

Hastings doesn’t buy the theory propagated in Oliver Stone’s movie JFK that Kennedy was on the verge of backtracking when he was shot.

Kennedy’s successor, Lyndon Johnson, sent half a million men to Vietnam because he didn’t want to be blamed for losing it, as Truman had been blamed for losing China. Johnson’s successor, Richard Nixon, saw that the war was lost, but he took four years to pull the troops out — an indecent interval in which thousands of Americans and Vietnamese died — because he didn’t want his name on an American defeat. For each of these US leaders, the concern was his country’s prestige (a Sixties word) and his own political standing. “An extraordinary fact about the decision making in Washington between 1961 and 1975,” Hastings observes, “was that Vietnamese were seldom, if ever, allowed to intrude upon it.”

Kennedy, Johnson, and Nixon were focused on the Chinese and the Russians, and assumed they were in charge in Hanoi as much as the Americans were in Saigon. Hastings says it was not so. The Russians and the Chinese were frustrated at the North Vietnamese aggressiveness, and repeatedly advised them to cool it. Within the North Vietnamese leadership, Ho often agreed with his foreign advisors, but Hastings says that policy was set not by Ho but by Communist Party General Secretary Le Duan, “though the world would not know this.”


By Hastings’ account the Americans were not the only ones who made big mistakes on the battlefield. Militarily, the biggest Communist mistake was the Tet (Lunar New Year) offensive of 1968. Le Duan’s idea was to show the flag in all the Southern cities, spark an uprising among the people, and swamp the Southern government in one big wave. In the event, the South Vietnamese didn’t rise. In Saigon, the Vietcong breached the wall of the US embassy, and in Hue, North Vietnamese regulars occupied the town north of the Perfume River for several weeks and methodically executed all their enemies. But everywhere the Communists were driven back.

The Vietcong lost 50,000 dead in Tet and follow-on attacks, five times the combined US and South Vietnamese military deaths. Largely cleansed of Vietcong, the countryside was quieter in the following year, as the North Vietnamese Army built up forces to fill the void left by the defeated Southern guerrillas. Though Tet was a military defeat for the North, the US press played it as a Communist show of strength, thereby tipping the balance of opinion in America against the war. For the Communists, a military defeat became a political victory.

The journalists had played it the way it looked, and it hadn’t looked like a South Vietnamese victory. American journalists had learned to distrust their government’s statements about the war. They should have begun four years earlier by distrusting the Tonkin Gulf Incident, which was used in 1964 to justify the de facto US declaration of war. Of the two supposed attacks on the destroyer USS Maddox, Hastings writes, one wasn’t real and the other was “a brush at sea that could easily and should rightfully have been dismissed as trivial.”


In the case of Tet, US journalists inadvertently helped the enemy, but generally the press gave Americans a more accurate picture of the war in South Vietnam than the government did. The press did a poor job of reporting the shortcomings of the North, but it wasn’t allowed to go there. In 1966, when I was arguing with my schoolmates for the war, I repeatedly heard them say that communism would be a bad system for us, but it was a better one for the Vietnamese. If Americans had had good reporting from North Vietnam, I don’t think my schoolmates would have said things like that. We anti-communists were right about one thing: communism turned out to be just as bad as we said it was.

The question remains as to what, if anything, America should have done to stop the Communists in Vietnam. Hastings quotes CIA officer Rufus Phillips describing what America did: “We decided that we were going to win the war and then give the country back to the Vietnamese. That was the coup de grace to Vietnamese nationalism.” But if it was wrong to do that in Vietnam, it should have been wrong in Korea, and it worked there, at least well enough to preserve the Republic of Korea. It can be no surprise that Kennedy and Johnson would try a military solution again.

What was the difference? Hastings touches on this question only briefly, mentioning the obvious: Korea is a peninsula with a border just 160 miles long, while South Vietnam had a border with Cambodia, Laos, and North Vietnam more than 1,000 miles long, perforated in many spots by the Ho Chi Minh Trail, the complex of corridors through which the Communists infiltrated the South with fighters and supplies. The warfare on the Korean peninsula was conventional, with front lines; in Vietnam it was a guerrilla contest while the Americans were there, becoming conventional only after they had decided to go. The physical climate was different, too. The Koreas were divided on the 38th parallel, about the latitude of San Francisco; the Vietnams were divided on the 17th parallel, about the latitude of Belize City. All of Vietnam is in the tropics, with attendant cloudbursts, humidity, bacteria, and bugs.


And there were political differences. Ho Chi Minh was a hero of national independence; Kim Il Sung pretended to be one, but had a less inspiring story. Also, in Korea the old imperial masters were not long-nosed Caucasians but Japanese.

A penultimate thought. Hastings quotes without comment Lee Kuan Yew, the patriarch of capitalist Singapore, to the effect that if the Americans had not resisted the Communists in Vietnam, “We would have been gone.” Call this the “domino theory” if you like. It was a view I encountered in the early ’90s, when I worked in Hong Kong for Asiaweek magazine. Our founder and publisher, a Kiwi named Michael O’Neill, maintained that the American effort in Vietnam had stopped the Communists from pushing on to Thailand, Malaysia, Singapore, Indonesia, and the Philippines. Meanwhile, China had junked communist economics, and Vietnam, unless it wanted to remain poor, would have to do the same. And that, O’Neill argued, meant that the Americans had really won the Vietnam War, even if they didn’t know it.

Or maybe, I thought, we had lost the war but in the long run it didn’t matter — because the war wasn’t the decisive contest.

Twenty-one years after the war ended, I traveled to Vietnam with my wife and six-year-old son. In Danang I met a group of men who had fought for the South and suffered persecution from the victors. They weren’t bitter at the Americans, nor were the tour guides who drove us to Khe Sanh and were too young to remember. In the North, at Ha Long, I chatted up the proprietor of a tiny restaurant who said that during the war, when he had been a truck driver for a state-owned coalmine, he had lost his house to American bombing. I told him I was very sorry my countrymen had destroyed his house.

He shrugged. “I have a better one now.”


Editor's Note: Review of "Vietnam: An Epic Tragedy, 1945–1975," by Max Hastings. Harper, 2018, 857 pages.





Still Amazing After All These Years


Nearly 50 years ago I was a high school student working my first summer job as a maid-waitress-cook at a rustic lodge in the Trinity Alps of northern California. The lodge was owned and still under construction by one of my high school teachers. My sister had worked there for a week and gone home, saying it was too much for too little. I stuck it out for another three weeks, until one evening when a coworker badly mistreated me. I fled the lodge and walked three miles to the nearest phone to call my parents and ask them to come get me. I then trudged the three miles back to where I had been working so they would be able to find me (life before cellphones!). It was July 20, 1969. I looked up through the trees at the starry sky, totally unaware that Neil Armstrong and Buzz Aldrin were about to touch down on the moon. Because of my call, my father missed the moon landing on TV. He never let me forget it.

Watching First Man, Damien Chazelle’s new film with Ryan Gosling as Neil Armstrong making his way to that historic step onto the moon, I finally understood why my father was so upset. What a glorious, terrifying, awe-inspiring moment that was, and we sense it with more caution than elation as Armstrong hesitates on the final rung of the ladder before finally stepping down onto the dusty landscape. I don’t know whether Chazelle got it right, but he certainly got it impressively — the absolute quiet of space, the broad expanse and rugged terrain of the moon’s surface, the suspenseful risk of training, the view from the cockpit. And the musical score by Justin Hurwitz, who has worked with Chazelle on four award-winning films, works perfectly throughout.


For two hours the film builds to that moment when Armstrong steps onto the moon, demonstrating that it was more harrowing than glorious. So much could have gone wrong — and did, along the way. The film opens with Armstrong fighting to control his X-15 rocket plane and land it without crashing. Training requires astronauts to practice precision tasks while spinning at such dizzying speeds that it’s a race against the inevitable moment when they will pass out. As the men are strapped into the Gemini module before taking off to practice docking in space, we expect to see the excitement and jubilation of astronauts finally realizing their little-boy dreams. Instead, their faces are subdued, focused, and even a bit apprehensive. And with good reason: despite all their earthbound preparations, there was no guarantee that they would return successfully. Indeed, numerous pilots had died during the testing phase. The space race was a grim undertaking, punctuated by moments of exhilaration, performed against a backdrop of angry protesters chanting against the enormous financial and personal cost.

Armstrong is calm, almost emotionless, as he contends with the rigors of space travel, the tragedy of a child’s death, and the stoicism of his wife Janet (Claire Foy). He can roughhouse with his young sons, but he can’t tell them he loves them. As he leaves for the moon, he hugs one son but shakes hands with the other. Such passionless focus is a strength, not a weakness, for someone in his position; Armstrong’s ability to think and react impassively in an emergency is a primary reason for his success, and Gosling portrays him masterfully.

But as an audience we want our heroes to be exciting and outgoing. My husband happened to meet Armstrong, Aldrin, and Collins a few months later during their victory tour around the world. He was in Colombia at the time, and called out Armstrong’s name in his strong American English. As Mark describes it, Armstrong turned with a broad, winning smile and shook his hand vigorously before rejoining the parade. The weighty burden of weightless space had lifted for a while, and he could enjoy the gravitas of what he and the others had accomplished.


The quietness of Armstrong’s character makes the film less compelling and may be one reason for the poor turnout on opening weekend. More likely, though, the poor turnout was owing to the controversy that preceded it. You’ve probably heard the disgruntled rumblings about the flag on the moon being left out of the movie, so let me address the controversy right here: yes, the American flag does appear on the moon in the film. It’s distant, and it’s small, but it’s there. Rumors about the flag’s absence began shortly after the film’s premiere at the Venice Film Festival, when audiences waited expectantly for that iconic moment. It didn’t happen, and social media exploded with boycott-laden outrage. (I don’t know whether the film was re-edited after the festival, or if audiences were simply expecting more, but the flag is definitely in the scene now.) Gosling explained in an interview that the moon landing “transcended countries and borders [and]… was widely regarded in the end as a human achievement,” so that’s why they didn’t include the flag-raising moment.

This is pure 21st-century poppycock, of course. Competition with the Russians was the driving force behind the space program, and the reason JFK dedicated so many billions of dollars to it. The Russians were ahead of the Americans nearly every step of the way. Raising the American flag and leaving it on the moon as a reminder of who got there first was a huge deal. It did not “transcend countries and borders.” It was the reason we were there. Chazelle and company should not have minimized it into a kumbaya moment of one-world humanism. Moreover, in his zeal to turn American exceptionalism into ordinary human accomplishment, Chazelle missed a great opportunity to make his statement — with the flag. Armstrong’s biography recounts how the astronauts struggled to assemble the malfunctioning flagpole. That struggle could have been presented as a metaphor for 21st-century American politics and the difficulty of raising the flag today.

First Man is a good film with some great special effects and fine acting all around. Jason Clarke is especially good as Ed White, and Corey Stoll is feisty as Buzz Aldrin. Claire Foy, best known for her role as Queen Elizabeth II in the excellent Netflix series The Crown, is wonderful as the stoic yet passionate Janet Armstrong. And with the 50th anniversary of the moon landing just six months away, I’m happy that this film has been made.


Editor's Note: Review of "First Man," directed by Damien Chazelle. Universal Pictures, 2018, 141 minutes.





Racism


In 1979, undercover Colorado Springs police officer Ron Stallworth noticed a phone number in a local newspaper in a small ad seeking members to begin a new chapter of the Ku Klux Klan. He called the number and pretended to be a white supremacist, hoping to infiltrate the organization in order to thwart the rising violence against black residents in general and the black student union at the college in particular. Soon the KKK leader suggested that they meet in person. The only hitch? Ron Stallworth was black.

Spike Lee’s BlacKkKlansman tells the tale, and it’s a gripping, suspenseful, often humorous, and often troubling one. As the film narrates the story, KKK leader Walter Breachway (played in the movie by Ryan Eggold) eventually asks for a face-to-face meeting with Stallworth (John David Washington, Denzel’s son), so Stallworth arranges for a white undercover narcotics cop, Flip Zimmerman (Adam Driver), to stand in for him. Yes, a black cop and a Jewish cop both manage to infiltrate the hateful KKK by posing as the same white supremacist. Stallworth continues to talk with Walter by phone while Zimmerman continues to meet with Klan members in person, necessitating that their stories and even their voices match. Walter’s second-in-command, Felix (Jasper Paakkonen), grows suspicious, or perhaps jealous, and as his sadistic streak surfaces we worry for Zimmerman’s life.


During the course of his investigation Stallworth contacts David Duke himself (Topher Grace), then the Grand Wizard of the Ku Klux Klan and a future Louisiana state representative. The boyish Grace, best known for the TV series That ’70s Show, plays Duke with perfect obliviousness to his own bigotry. Lee is a bit heavy-handed, however, in his determination to connect Duke’s rhetoric with Trump’s “Make America Great Again.”

Adam Driver provides a nuanced performance as the lapsed, nonchalant Jew forced to confront his feelings about his heritage when he is threatened simply because of his genetic stew. Corey Hawkins is fiery as Kwame Ture (aka Stokely Carmichael), and Laura Harrier channels Angela Davis luminously with her big round glasses and bigger round afro as Patrice Dumas, president of the black student union. Harry Belafonte is a standout as Jerome Turner, carrying with him the weary weight of his own decades in the civil rights movement. Director Lee chooses caricature rather than character with some of his KKK subjects, particularly the slack-jawed near-imbecile Ivanhoe (Paul Walter Hauser) and Walter’s perky, overweight, frilly aproned wife Connie (Ashlie Atkinson). But after watching decades of black caricature on film, I can forgive him this hamhandedness.

While the plot of BlacKkKlansman covers just nine months in the 1970s, the story spans more than a century. It opens with a scene from Gone with the Wind, presents upsetting clips from Birth of a Nation, which celebrated the KKK, and ends with footage from the deadly riot in Charlottesville last year. And Harry Belafonte as Jerome Turner provides a soft-spoken, emotional, and tender account of the horrifying 1916 lynching and burning of Jesse Washington in Waco, Texas.


I’m always a little uncomfortable and defensive when I see films like this; it’s important to be aware of black history, and I’m glad these stories are being recorded on film. But it feels as though I’m intruding somehow, as though all whites are being accused of the same ignorant, bigoted mindset that we see on the screen. In reality, of course, white supremacists represent a tiny minority of the population, while white voters, white activists, white teachers, and white politicians have worked vigorously in the cause of civil rights.

If there is one underlying truth about racism, it is this: government is the Grand Wizard of bigotry. Government legalized slavery and enforced the Fugitive Slave Law. Government institutionalized segregation through neighborhood-based public schools and “separate but equal” policies, and governments outlawed miscegenation. Government imposed poll taxes and voting questionnaires. Government grants and welfare in the 1960s were well-intentioned, but they incentivized single motherhood, established barriers to work through public assistance programs that were difficult to relinquish for an entry-level job, and created a dragnet rather than a safety net that virtually destroyed the black family in urban neighborhoods.

Meanwhile, activists — black and white, male and female — exercising their rights to free speech and open dialogue were the catalyst for change and inclusion. Freedom of speech is the most important right we have. It’s the foundation for all other rights. Yet too many activists today are turning to government to establish hate-speech laws that limit free expression. These films seldom acknowledge the friendship and genuine concern felt by so many white Americans, or the fact that discovery of truth is a process. Lee gives a welcome nod to this idea at the end of the film, but it takes a long time to get there. Still, BlacKkKlansman is well made and well worth seeing.


Editor's Note: Review of "BlacKkKlansman," directed by Spike Lee. Focus Features and Legendary World, 2018, 135 minutes.





The Great Panic


I have long been a fan of the Panic of 1893, which is the usual name for the great depression of the 1890s. When I say “great” I mean it is comparable by all available measures (business losses, unemployment, political turmoil) to the Great Depression of the 1930s — with two exceptions. First, the depression of the 1930s lasted for more than ten years, ending only with the start of the Second World War in Europe; the depression of the 1890s lasted less than half as long. Second, in the 1930s the federal government intervened massively to try to end the depression, whereas the government of the 1890s did as little as it could.

These two exceptions are closely related. In 1893 and after, President Grover Cleveland had the political and above all the intellectual courage to allow prices to sink until recovery could begin. He devoted his best efforts to stabilizing the dollar, so that sound money and real prices could beget confidence, and confidence could beget reinvestment. This happened. But in 1929 and after, Presidents Herbert Hoover and Franklin Roosevelt were guided by the economic ignorance and sheer quackery of their times (and ours); they intervened to keep prices up and bail out bad investments — using money, of course, extorted from the people who had made good investments. Roosevelt’s subsidies extended to the destructive political ideas of his time; he encouraged political action to fulfill the borderline-crazy terms of his first inaugural address, in which he announced:

The money changers have fled from their high seats in the temple of our civilization. We may now restore that temple to the ancient truths. The measure of the restoration lies in the extent to which we apply social values more noble than mere monetary profit.

The result was not only chronic political turmoil but a failure of reinvestment caused by a persistent absence of confidence in the nation’s economic and political prospects. Money, as R.W. Bradford used to say, wants to be invested, but it wasn’t during the 1930s, when for a series of years there was actually “negative investment” in the economy.


So you see one reason why I am a fan of the depression of the 1890s — it provides clear and persuasive economic, political, and, if you will, spiritual lessons. But another reason is that the economic and political controversies of the 1890s are a lot of fun. Communism is dull stuff, no matter where it appears, and in the 1930s it manifested itself in remarkably dull, stupid, pompous, and oppressive forms. Compared with that, the nostrums of the 1890s are bright, delusive rays of sunshine. You just have to smile at Jacob Coxey’s plan to save the country by a complicated scheme for the federal government to print tons of paper money and use it to give free loans to local governments so they could create jobs in public building programs — a plan he implemented in the first of the great marches on Washington, the march of Coxey’s Army. The march culminated in Coxey’s arrest at the Capitol, for walking on the grass.

And who wouldn’t have fun trying to follow the logical permutations of the Free Silver idea, the notion that the American economy would be perfected if the federal government would simply produce unlimited quantities of silver dollars (and paper instruments representing them), coined at a ratio of 16 ounces of silver to one ounce of gold, when the market value of an ounce of gold was far more than 16 ounces of silver? This was a recipe for outrageous inflation, yet in 1896 it captured the Democratic Party and could have led to the election of the Democratic candidate, William Jennings Bryan, he of the stirring Cross of Gold speech:

You shall not press down upon the brow of labor this crown of thorns. You shall not crucify mankind upon a cross of gold.

It’s a good speech, and some of the books and pamphlets written in favor of Free Silver are immensely clever complications of an argument that is clearly wrong but has a way of starting to look right if you don’t take a step backward and remind yourself of what it’s really about.


Now comes Bruce Ramsey, author of the book I am reviewing and — all cards on the table — senior editor of Liberty and a good friend of mine. Bruce is a tireless researcher of the events, theories, and movements of the 1890s. He knows their importance. He knows they reveal important truths about the ways in which economies function, and in which people function within them. And he knows they’re fun. The only problem is that the vast majority of Americans have simply forgotten about the depression of the 1890s. They forgot about it almost as soon as it was over. (I have an essay about this in Edward Younkins’ Capitalism and Commerce in Imaginative Literature [Lexington Books, 2015].) In the popular imagination, the decade of desperation was soon transformed into the Gay Nineties.

There aren’t a lot of good treatments of national politics and economics in the 1890s. Allan Nevins’ biography of Cleveland (1932) remains the best. And there are few decent treatments of the effects of the depression on individual men and women, in their local communities. That’s the vital part of the story that practically nobody knows. And that’s what Ramsey gives us in his brilliant new book about the state of Washington during the Panic.

In writing such a book, Ramsey faced one of the hardest challenges a writer of history can encounter. A straight-line narrative of national political and economic events would capture only part of the picture. So would an exclusive concern with one particular locality, such as Bruce’s home state, Washington. So would concentration on certain personalities, as in the cheap, tangential approach to history that one sees in the Ken Burns films. What Bruce needed to present was the full tapestry of local people and local events, rippling in the strong winds of national affairs; he needed to capture not only the big patterns but the individual figures in the tapestry, and he needed to show those ripples of history too. But he was equal to the challenge.


Bruce Ramsey is a quick but colorful narrator. He provides the pungent detail and the suggestive episode and then moves briskly onward to the next significant picture, whether it’s the portrait of an interesting man or woman, an array of statistics, a sketch of political developments nationwide, or a tale of something that’s too ridiculous to be true, but is. Did you know that in 1893 the Populist governor of Kansas tried to use the state militia to oust the Republicans (who happened to be in a majority) from the House of Representatives in Topeka? (If Dorothy wanted adventure, she could have stayed right in Kansas.) This absurd drama — one of many in Ramsey’s book — offers some perspective on the absurd politics of the present era. To say that Ramsey’s political narrative is entertaining is itself absurd; it’s an absurd understatement.

Here are thousands of stories, small in the number of words that Ramsey, a thrifty narrator, allots to each, but large in drama and implication. We see people who are found talking gibberish in darkened hotel rooms because their bank deposits of $256 had been lost to the panic. We see government officials who steal money, and lose it, and then escape to Argentina, or to a place off the coast of Washington called Tatoosh Island, thence to change identities and be discovered working as mowers in Idaho. We learn of a government official who is acquitted by a jury that doesn’t believe that bribery is against the law. We listen to a contractor for the Northern Pacific railway who says he “had put white men at work at $2 and gradually raised their wages to $2.50, although there was no time when [he] could not have employed Chinamen at 80 cents” (p. 51). We meet mayors who work in shingle mills because their cities can’t pay them a salary, and unionists who resort to riot and terror to keep their salaries from being cut.

The sheer number of stories that Ramsey tells is remarkable; still more remarkable is his unfailing ability to integrate them into larger contexts of meaning. Here’s one of the general patterns he sees. Businesses and banks that made it through this great depression often did so because they backed each other up. Seattle, where the spirit of cooperation was strong, suffered many fewer losses than such competing communities as Tacoma and Spokane. Seattle’s bankers went so far as to refuse deposits from people who had withdrawn them in panic from other banks. This was individual action, but it was mutually supportive. It was a kind of spontaneous order, and it often saved the day.


Here’s another pattern. Led by President Cleveland, the federal government disclaimed responsibility for helping individuals — whether bankers or street sweepers — get out of their financial jam. Most public opinion seems to have backed him up. Newspapers in the Pacific Northwest counseled their readers to take responsibility for themselves — and above all not to hurt business by fleeing to some place with a marginally better economy. Their message was “stay here and keep pitching.” A Baptist potentate cautioned against giving money to the poor indiscriminately; this was “a selfish act, done to make the giver feel good” (p. 83). Some local governments acted in what they regarded as the spirit of community and provided employment on public works projects, and some of them went broke doing it. But charity ordinarily began at home. As Ramsey observes, very perceptively, “In a world with little free public food, people tend to be generous with their private food” (p. 93).

A darker side of community spirit was the almost universal feeling that if anyone was going to be without a job, it shouldn’t be someone white. Everywhere Asians were fired from jobs or prevented from getting any, and mobs formed to destroy Chinatowns throughout the region. It was only a temporary rescue when the wife of a local missionary faced down a mob that came for the Chinese people of La Grande, Oregon: “She appeared with a Winchester and announced that the first man to enter the house would be shot” (p. 79). Most of the Chinese left town anyway; and although 14 rioters were arrested, none was convicted. Oregon’s Progressive governor haughtily rejected President Cleveland’s request that he protect the rights of the Chinese.


Much of Ramsey’s book is devoted to racism and progressivism during the depression. It’s quite a story, and again, it’s a gift of perspective: then as now, the predominant individualism of America was too much of a burden for many Americans to bear.

Obviously, the implications of Ramsey’s stories go far beyond the Pacific Northwest. The stories of that region cannot be explained without reference to the bigger stories of the nation’s money policy, its “reform” and “progressive” movements, and its national elections. Ramsey devotes lively chapters to all these things. If you don’t know the 1890s, this is the book for you, wherever you live. If you do know the 1890s, you know a lot about America, and this book will help you learn even more.

The Panic of 1893 is beautifully illustrated, with fine contemporary pictures, and backed by years of patient research. It is a distinguished and compelling book.


Editor's Note: Review of "The Panic of 1893: The Untold Story of Washington State’s First Depression," by Bruce Ramsey. Caxton, 2018, 324 pages.





When Nobody Knew What a Dollar Would Be


The Caxton Press has just published my book, The Panic of 1893, and I can now write for Liberty about it. Its topic is the final economic downturn of the 19th century. For more than three years, my head was in the 1890s — in books, articles, personal and official papers, lawsuits, and, especially, old newspapers, chiefly from my home state. The book’s subtitle is The Untold Story of Washington State’s First Depression.

It is a popular history, not a libertarian book as such. But I have a few thoughts for a libertarian audience.

Many libertarians espouse the Austrian theory of the trade cycle, in which the central bank sets interest rates lower than the market rate, leading to a speculative boom, bad investments, and a collapse. In the 1890s the United States had no central bank. Interest rates before the Panic of 1893 were not low, at least not in Washington. The common rate on a business loan was 10%, in gold, during a period in which the general price level had been gently falling. Washington was a frontier state then, and it needed to pay high interest rates to attract capital from the East and from Europe. Credit standards, however, were low, sometimes appallingly low. Many of Washington’s banks had been founded by pioneers — optimistic patriarchs who lent freely to their neighbors, associates, relatives, and themselves. By a different road from the Austrians’ theory, the economy was led to the place it describes: a Hallowe’en house of bad investments.


The dollar was backed by gold, with the US Treasury intending to keep at least $100 million of gold on hand. But in 1890, at the peak of the boom period, Congress passed the Sherman Silver Purchase Act, obligating the Treasury to buy up the nation’s silver output with newly printed paper money. It was a sop to the inflationists, who wanted an increase in the money supply, and to the silver mining interests, who wanted the government to continue buying their silver, which it had been doing to create silver dollars. Politically the Sherman Silver Purchase Act was also part of a deal to pass the McKinley Tariff, which raised America’s already high tariff rates even higher.

The problem with the Sherman Silver Purchase Act was that the new paper money being paid to the silver miners could be redeemed in gold. The prospect of an increase every year in paper claims against the Treasury’s gold alarmed foreign investors, and they began to pull gold out. Two crises abroad also shifted the psychology of lenders and borrowers worldwide: Argentina defaulted on a gold loan from the Baring Brothers in 1890 and a real estate boom in Australia collapsed in 1893. These crises shifted the thoughts of financial men from putting money out to getting it back, from a preference for holding promises to a preference for cash.

By the time Grover Cleveland took office in March 1893, the Treasury’s gold cover had shrunk to $101 million. A run began on the Treasury’s gold — and that triggered the Panic of 1893.

In the Pacific Northwest, the four-year-old state of Washington (pop. 350,000 then) had 80 bank failures in the following four years.


Economists have listed the ensuing depression as the second-deepest in U.S. history. (One estimate: 18% unemployment.) But they don’t know. The government didn’t measure unemployment in the 1890s. And the rate of unemployment may not be the best comparison. America was less wealthy in the 1890s than in the 1930s, and living conditions were harsher. In absolute terms, the bottom of the depression of the 1890s was clearly lower than that of the 1930s.

The Left of the 1890s, the Populists and silverites, wanted cheap money. They blamed the depression on the gold standard. And gold is not an easy taskmaster; libertarians have to admit that.

The silverites wanted a silver standard. Most of them were “bimetallists,” claiming to favor a gold standard and a silver standard at the same time, with 16 ounces of silver equal to one ounce of gold. Their idea was that by using gold and silver the people would have more money to spend.

Free silver was a policy well beyond the Sherman Silver Purchase Act, which compelled the Treasury to buy silver at the market price. In the mid-1890s, silver fell as low as 68 cents an ounce. At that price, a silver dollar had 53 cents’ worth of silver in it and the silver-gold ratio was 30-to-1.
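
A quick check of those figures — my own back-of-the-envelope arithmetic, not a calculation from the book, assuming the standard dollar coin’s 0.7734 troy ounces of pure silver and gold at its official $20.67 an ounce:

\[
0.7734 \times \$0.68 \approx \$0.53
\qquad\text{and}\qquad
\frac{\$20.67}{\$0.68} \approx 30,
\]

which is where the 53 cents of metal in a silver dollar and the 30-to-1 market ratio come from.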


The bimetallists wanted 16-to-1. That was the ratio for U.S. currency set in the late 1700s when the market was at 16-to-1. Later the market shifted and Congress changed the ratio to 15 1/2-to-1. Then came the Civil War, and the U.S. government suspended the gold standard, and printed up its first “greenbacks,” the United States Notes.

The United States Notes were effectively a new currency, and traded at a discount from metallic dollars. In September 1896, the Seattle Post-Intelligencer reminded readers of those times:

There never was a time from the beginning of the first issue of greenbacks down to the resumption of specie payments when the greenback dollar was ever accepted on the Pacific Coast for anything more than its market price in terms of gold.

The greenback was discounted, sometimes by 50 to 60%.

In 1873, Congress decided to define the dollar as a certain weight of gold, but not silver. The silver people in the 1890s called this “The Crime of ’73.”

Redemption of paper money under the gold standard began in 1879. To placate the silver interests, Congress had passed a law requiring the government to buy silver at the market price and coin it into dollars — the Morgan dollars prized by collectors today. At the beginning, the silver in a Morgan dollar was worth about a dollar, but by the 1890s, the value of silver had fallen.

In 1890, the silver-dollar law was replaced by the Sherman Silver Purchase Act, which created paper money. The government still coined silver dollars, and by 1896 had more than 400 million of them in circulation.


The law did not require the Treasury to pay out gold for silver dollars, and it hadn’t. But the law declared all the different kinds of dollars (and there were five different kinds of paper money, at that point) to be equally good for everyday use except for taxes on imports. At the amounts an individual was ever likely to have, a silver dollar was as good as a gold dollar.

If you ask why a sane person would have designed a monetary system with gold dollars, silver dollars, Gold Certificates, Silver Certificates, National Currency, Treasury Notes, and United States Notes, the answer is that no one person did: Congress had designed it, one variety at a time.

Under the proposal for “free silver,” gold would be kept at the official price of $20.67 and silver set at one-sixteenth that price, or $1.29. Just as the world was free to bring an ounce of gold to the Treasury and take away $20.67 — “free gold” — the world would be free to bring an ounce of silver to the Treasury and take away $1.29. Free silver! The advocates called this the “unlimited coinage” of silver, but the aim was to create dollars, not coins. Most of the silver could pile up in the Treasury and be represented by crisp new pieces of paper.

The gold people argued that for the United States to set up a 16-to-1 currency standard in a 30-to-1 world was nuts. Essentially, the Treasury would be offering to pay out one ounce of gold for 16 ounces of silver. It would be a grand blowout sale on gold, and the world would come and get it until the gold was gone. The Treasury would be left with a Fort Knox full of silver, and the U.S. dollar would become a silver currency like the Mexican peso.
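
To put numbers on that warning — my own illustration, using the mid-1890s prices quoted above rather than anything from the book — buy 16 ounces of silver on the open market, present them to the Treasury under “free silver,” and walk away with dollars redeemable in an ounce of gold:

\[
16 \times \$0.68 \approx \$10.88 \;\text{(cost of the silver)}
\quad\longrightarrow\quad
16 \times \$1.29 \approx \$20.67 \;\text{(gold received)},
\]

a gross profit of nearly ten dollars per cycle, repeatable until the Treasury’s gold ran out.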

Surely the gold people were right about that. (And today’s ratio is 78 to 1.)

Milton Friedman argues in his book Money Mischief that two standards, with the cheapest metal defining the dollar in current use, would have worked all right. If the cheap metal got too expensive, the system would flip and the dollar would be defined by the other metal. In theory it makes sense, and apparently before the Civil War it had worked that way. But the financial people didn’t want a system like that.


In 1896, America had a watershed election, with the silver people for Bryan, the Democrat, and the gold people for McKinley, the Republican. A third party, the People’s Party, endorsed Bryan. Its followers, the Populists, didn’t want a silver standard. They were fiat-money people. But Bryan was against the gold standard, and that was enough.

In that contest, the silver people were derided as inflationists. They were, to a point. They wanted to inflate the dollar until the value of the silver in dollars, halves, quarters, and dimes covered the full value of the coin. The silver people were not for fiat money.

Here is the Spokane Spokesman-Review of October 1, 1894, distinguishing its silver-Republicanism from Populism:

Fiat money is the cornerstone of the Populist faith . . . Silver money is hard money, and the fiatist is essentially opposed to hard money . . . He wants irredeemable paper money, and his heart goes out to the printing press rather than the mint.

The Populists and silverites argued in 1896 that the gold standard had caused the depression, and that as long as gold ruled, the nation would never recover. History proved them wrong. They lost, and the nation recovered; the recovery began after the election settled the monetary question. Investors and lenders knew what kind of money they’d be paid with.

Milton Friedman makes a monetarist point in Money Mischief that starting in about 1890, gold miners had begun to use the cyanide process, which allowed gold to be profitably extracted from lower-grade ore. The result was an increase in gold production all through the decade. I came across a different story in my research. The increase in the supply of gold (about which Friedman was correct) was outstripped by the increase in the demand for gold. Prices in gold dollars declined sharply during the depression of the 1890s, including the prices of labor and materials used in gold mining. It became more profitable to dig for gold. Deflation helped spur a gold-mining boom — in the Yukon, famously, but also in British Columbia, in Colorado, and in South Africa.


Under a gold standard, a deflation sets in motion the forces that can reverse it. This is a useful feature, but it can take a long time.

The recovery from the depression of the 1890s began not with a burst of new money but with a quickening of the existing money. What changed after the election was the psychology of the people. They knew what sort of money they held and could expect. The important point wasn’t that it was gold, but that it was certain. If Bryan had been elected and the dollar became a silver currency, people would have adjusted. With gold, they didn’t have to adjust, because it was what they already had.

The writers of the 1890s had a less mechanistic view of the economy than people have today. People then didn’t even use the term “the economy.” They might say “business” or even “times,” as if they were talking of weather conditions. They talked less of mechanisms (except the silver thing) and more of the thoughts and feelings of the people. People today are cynical about politicians who try to manipulate their thoughts and feelings, and think that it’s the mechanisms that matter. And sometimes mechanisms matter, but the thoughts and feelings always matter.


Now some observations about the ideas of the 1890s.

The Populists, called by the conservative papers “Pops,” were much like the Occupy Wall Street rabble-rousers of a decade ago: anti-corporate, anti-banker, anti-bondholder, anti-Wall Street, and anti-bourgeois, but more in a peasant, almost medieval way than a New Left, university student way. Many of the Pops were farmers, with full beards at a time when urban men were shaving theirs off or sporting only a mustache. More than anti-Wall Street, the Pops were anti-debt, always looking for reasons for borrowers not to pay what they owed. On Wikipedia, Populism is labeled left-wing, which it mainly was. It was also rural, Southern, Western, anti-immigrant, and often racist. In Washington state it was anti-Chinese.

In the 1890s traditional American libertarianism was in the mainstream. In the newspapers this is very striking, with the Republican papers championing self-reliance and the Democratic papers championing limited government. Democrats, for example, argued against the McKinley Tariff — which imposed an average rate of more than 50% — as an impingement on individual freedom. Here is Seattle’s gold-Democrat daily, the Telegraph, of September 10, 1893:

If it be abstractly right that the government shall say that a man shall buy his shoes in the United States, why is it not equally right for it to say that he shall buy them in Seattle? . . . Where shall we draw the line when we start out from the position that it is the legitimate and natural function of government to regulate the affairs of individuals . . .

Our idea is that the least government we can get along with and yet enjoy the advantages of organized society, the better.

Here is the silver-Republican Tacoma Ledger of December 3, 1895:

Thoughtful men must perceive that our whole system of civilization is undergoing a revolution in its ideas; and we are in danger of gradually supplanting the old, distinctive idea of the Anglo-Saxon civilization — the ideas of the individualism of the man, his house as his castle, and the family as his little state, which he represents in the confederation of families in the state — by the Jacobinical ideas of . . . continental republicanism . . . The continental republican theory contemplates the individual man as an atom of the great machine called the nation. The Anglo-Saxon considers every man a complete machine, with a young steam engine inside to run it. The continental republican must have a government that will find him work and give him bread. The Anglo-Saxon wants a government only to keep loafers off while every man finds his own work and earns his own bread.

Contrast that with today’s editorial pages.


Here’s a final one I particularly liked. Archduke Franz Ferdinand of Austria-Hungary — the same gent whose assassination 21 years later would touch off World War I — came through Spokane on the train in 1893. Americans, fascinated with him just as they would be a century later with Princess Diana, stood in the rain for hours to get a glimpse of the famous archduke — and they were sore because he never showed himself. On October 9, 1893, here is what the Seattle Telegraph had to say about that:

Why in the name of common sense should the people of this country go out of their way to honor a man simply because he happens to be in the line of succession to a throne . . . The correct thing is to let their highnesses and their lordships and all the rest of them come and go like other people. To the titled aristocracy of Europe there is no social distinction in America.

The America of the 1890s had some unlovely aspects. But in my view, the Telegraph’s attitude toward princes is exactly right. I recalled the Telegraph’s patriotic comment during all the blather over the wedding of Princess Diana’s son.

The 1890s had its blather, but after 125 years, sorting out facts from nonsense is easier. Silly statements, especially wrong predictions, don’t weather well. It makes me wonder what of today’s rhetoric will seem utterly preposterous in the 2100s.






Hip Replacement: Lesson One


“In a soldier’s stance, I aimed my hand
at the mongrel dogs who teach . . .”
                                      — Bob Dylan, My Back Pages (1964)

English, like every other living language, constantly evolves. Every utterance holds the promise of change: a new take, a fresh twist, an old word with a new meaning, or a neatly turned phrase that nudges the language, and the people who speak it, in a new direction. This is Donald Trump in the ’80s: “You have to think anyway, so why not think big?”

New words are created all the time. The verb “mansplain,” coined less than a decade ago, describes a practice at least twice as old: when a man explains something, a new word, say, to a woman in a condescending manner. And, of course, words just disappear. I haven’t heard “tergiversation” since William Safire died. Some words are like mayflies, here and gone. A word used only once is called an “onanym,” which, appropriately, is one.

As changes accumulate, the distance between the old and new versions of the language grows and the older version gradually becomes dated, then archaic, and, eventually, incomprehensible. Read Beowulf. (Or consider that less than 60 years ago we elected a president named “Dick.”)

And, of course, words just disappear. I haven’t heard “tergiversation” since William Safire died.

The sound of English changes, too. Its phonological components, such as tone, pitch, timbre, and even melody, change. If you learned American English, the North American dialect of Modern English, scores of years ago, as I did, you have heard many such changes and, while you can probably understand the current version, provided the slang isn’t too dense, you probably cannot reproduce its sound.

This, then, is a music lesson of sorts, designed to help you, my fellow older American, replicate the sounds of what we will call Post-Modern English, or PME, the successor to American English. Not the slang, mind you, but the sound of it, the music. If you are wondering why you should bother, reflect on this: you wouldn’t parade around in public wearing the same clothes that you wore as a child, would you? Of course not, because fashion changes and we should change with it, provided that we do it in an unaffected way. Choosing to update the sound of your English is as sensible as hanging up your coonskin cap. One must make an effort to ensure that one’s outfit looks snatched, after all.

The lesson includes a short passage from a radio broadcast I heard that contains many of the phonological changes that American English has undergone during the past several decades. While I managed to jot it down, I couldn’t get a copy of the audio. No matter. You can tune into any pop radio station and listen to the banter of the DJs. They are first-rate role models for Post-Modern English. (Dakota Fanning is not. I heard her interviewed on NPR, and to my ear she sounded like nothing so much as the valedictorian at a finishing school graduation, circa 1940. To be fair, NPR features many guests, and hosts, for that matter, whose mastery of PME is just totally awesome.)

Choosing to update the sound of your English is as sensible as hanging up your coonskin cap.

Ready? There are five parts. The first reveals the essence of Post-Modern English, so that you will know how to comport yourself when speaking it. The second will help you adjust your vocal cords to the proper register. The third comprises ten exercises drawn from the transcript of that radio broadcast. The fourth alerts you to a few problems you may encounter once you have mastered PME, and suggests practical solutions. The fifth and final part will put American English where it belongs: in the rear-view mirror. Just as Professor Higgins taught Miss Eliza Doolittle to drop the cockney and pass as posh, so I will teach you to drop your stodgy American English and sound cool. By the end of this linguistic makeover you will sound like a real hep cat. (The spellchecker just underlined “hep.” Bastards.)

* * *

Part One: The Essence

As French is the language of love and German is the language of Nietzsche, Post-Modern English is the language of wimps.

(Just now, you may have jumped to the conclusion that the word “wimps” was deployed in the previous sentence as a pejorative. It was not. It was chosen because it is the word that best embodies the defining characteristics of Post-Modern English. If you’ll bear with me, I think you’ll come to agree.)

When a French woman explains a line from Also Sprach Zarathustra, she sounds as if she were flirting. When a German man puts the moves on a fräulein in a dimly lit hotel bar, he sounds as if he were explaining how to change the oil in a diesel engine. Let us stipulate that the French woman is not a flirt and the German man is not a mechanic. It doesn’t matter; their languages make them sound that way. And when a fluent speaker of Post-Modern English asks you to move your car because he’s been boxed in, he sounds like a puppy that has just peed on your white carpet. He may not be a wimp, but he sure does sound like one. It is simply the case that each of these languages, at least when heard by Americans of a certain age, creates a vivid impression of the speaker. It is no more complicated than that. So why does the American guy sound like such a wimp?

Post-Modern English is the language of wimps.

At the core of Post-Modern English are two directives that determine not just the attitude but also the moral stance that its speakers assume as they turn to face the oncoming challenges of the modern world. These two directives, sometimes called the Twin Primes, preempt both the laws enacted by governments and the commandments handed down by ancient religions. (Practically, this means that in the event of a conflict between any of those laws or commandments and either of these two directives, it is the latter that will be adhered to, not the laws of God and man, all other things being equal.) You may have heard one or both of the Twin Primes invoked when a speaker of Post-Modern English suspects a violation has occurred in the vicinity.

The First Directive is “Don’t judge.” The Second is “Don’t be a dick.”

How, you may be asking yourself, could two such sensible and straightforward prohibitions make millions of people sound wimpy? Enforced separately, they probably couldn’t, but enforced together, they lay a paradoxical trap that can make even the straightest spine go all wobbly.

When a fluent speaker of Post-Modern English asks you to move your car because he’s been boxed in, he sounds like a puppy that has just peed on your white carpet.

Step by step, now. To judge others is considered rude in Post-Modern English, especially if the judgment is thought to be harsh. A person who judges others in this way and then communicates that judgment to those being judged is often referred to as a dick. If, for example, you saw someone who was judging others and, in a completely sincere attempt to be helpful, you said to that person, “Don’t be a dick,” you would have, in effect, not only made a judgment about that person’s behavior, but also communicated it to that person in a harsh way. By definition, then, by telling this person that he has behaved badly, you yourself would have strayed off the reservation to which PME speakers have agreed to confine themselves, and would have become the very thing that you have judged so harshly: a dick.

Now, Post-Modern English speakers are not stupid. They are aware of this trap and, not wishing to be hoist with their own petards, do what any reasonable person would do. Not only do they rarely call other people “dicks,” but they fall all over themselves to avoid any communication that can be interpreted as passing judgment on others. Simple statements about mundane matters are qualified and watered down so that the likelihood of giving offense is minimized. Explanations are inflected to sound like questions, apologies, or cries for help. Commonplace opinions are framed semi-ironically, often attached to the word “like,” so that they can be retracted at the first sign of disagreement. This feature of the language is called “ironic deniability.” It also allows one to blame irony when the real culprit is stupidity.

As a result, fluent PME speakers, when compared with speakers of earlier forms of American English, sound more uncertain, unassertive, and nonjudgmental. To put it bluntly, they sound more sheepish. Not because they are, you understand, any more than the French woman was flirtatious. It is just that the rules of the language have prodded them, bleating, into the chute that leads inescapably to the waiting tub of dip. In short, to avoid being dicks, they end up being wimps.

By telling this person that he has behaved badly, you yourself would have become the very thing that you have judged so harshly: a dick.

Wake up, old son, and smell the nitro coffee. In this brave new world, wimpiness is cool.

And that, my crusty-but-benign student, is all you need to know. You don’t need a dissertation on the cultural and historical forces that forged this pained linguistic posture; all you need is to imitate its cringe as you complete the lesson ahead and go on to achieve fluency in Post-Modern English. Here’s an aspirational commandment: “Thou shalt be cool.” You can do this. It’s goals.

Part Two: The Vocal Register

Please say, “So, that’s pretty much it, right?” in your normal 20th century voice. OK? Now say it again, but make the pitch of your voice as low as you can.

How’d it go? When you lowered the pitch, did you hear a sizzle, a popping sound, like bacon frying? No? Try again, as low as it will go. Once you’ve achieved this effect, I’ll give you the backstory.

Ready? That sizzle is the sound of liquid burbling around your slackened vocal cords. As you may have noticed, this register, often called vocal fry, has been growing in popularity during the past few decades.

Fluent PME speakers, when compared with speakers of earlier forms of American English, sound more uncertain, unassertive, and nonjudgmental. To put it bluntly, they sound more sheepish.

In the 1987 movie Made in Heaven, Debra Winger played the archangel Emmett, God’s right-hand man, who was supposed to be a chain-smoker. As Ms. Winger was not, she had to simulate a smoker’s voice for the part, serendipitously producing a pitch-perfect proto-vocal fry. While this early mutation event does not appear to have lodged in the inheritable DNA of the language, it is fascinating in the same way that the Lost Colony of Roanoke is.

Vocal fry’s current run on the linguistic hit parade is more likely to have begun when Britney Spears croaked “Baby One More Time” in 1998, although it is occasionally said that the real patient zero was someone named Kardashian. Whatever.

Women tend to use vocal fry more than men. A wag on TV once said that women are trying to usurp the authority of the patriarchy by imitating the vocal register of the male. This would be in stark contrast to the southern belle or transvestite, both of whom artificially raise the pitch of their voices, sometimes into falsetto, to enhance the appearance of femininity.

Isn’t the theory that these bubbling vocal cords were repeatedly sautéed and baked less likely than the much simpler explanation of demonic possession?

Another theory holds that the phenomenon is simply the result of too much booze and marijuana. For this “Animal House Hypothesis” to be taken seriously, however, it must account for the fact that vocal fry did not present in the ’50s and ’60s (except briefly in Clarence “Frogman” Henry’s 1956 recording of “Ain’t Got No Home”). Considering that the sound more nearly resembles an audition for the next installment of the Exorcist franchise, isn’t the theory that these bubbling vocal cords were repeatedly sautéed and baked less likely than the much simpler explanation of demonic possession? The smoker’s rasp sounds much drier, anyway.

There has been an effort to dismiss the bubbling as a mere affectation. But ask yourself: what are the odds that a vocalization nearly indistinguishable from Mongolian throat singing will be adopted by millions of young people, simply to strike a pose? I’m just not buying it. The simplest explanation may be best: it was an innocently adopted, thoroughly harmless preteen fad that unexpectedly took root in adolescence and grew into a well-established, widespread adult habit, like picking one’s nose.

Don’t sizzle when you uptalk. You’ll frighten the children.

We may not know where it came from, and we may not know why it came, but we do know that vocal fry, while not quite the sine qua non of Post-Modern English, sends the loud and clear message, to anyone who happens to be within earshot, that standing here is a proud master of the 21st-century version of American English, gargling spit while speaking. (I seem to recall once seeing something similar being done by a ventriloquist.)

Learn the sounds in the lesson below; sing them with the sizzle above, while acting like a sick toy poodle at the vet’s, and your quest will be over. The Holy Grail of this Elders’ Crusade will be yours: PME fluency. (Oh, and remember: don’t sizzle when you uptalk. You’ll frighten the children.)

Part Three: The Exercises

So, in the 2016 election, Clinton was really sure she would sort of capture the states in the rust belt, but she didn’t. I mean, the turnout there was pretty much deplorable, right?

1. Discourse markers, sometimes called fillers, such as those used above (so, really, sort of, I mean, pretty much, and right), while not integral to either the meaning or the music of Post-Modern English, enhance its aesthetics, signal that the speaker is part of the linguistic in-crowd, and help the speaker sound as if his grip on what he’s saying is less than firm. This gives him wiggle room and makes him seem all squirmy: the Daily Double. Placing fillers in a phrase to best effect calls for a keen ear, rigorous practice, and a constant monitoring of how it is being done by the cool kids.

Beginning immediately, insert at least one filler into each sentence you speak. Yes, it requires self-discipline, but don’t worry; in time, it will become habitual and you will be able to dispense with self-discipline entirely.

There are fillers galore. To gain perspective, note that like, actually, and dude, while still heard, have grown slightly stale.

Yes, it requires self-discipline, but don’t worry; in time, it will become habitual and you will be able to dispense with self-discipline entirely.

About ten years ago, like was like ubiquitous. Like it was in like every sentence like three or four times. I mean, it had like metastasized. Then, over the next few years, its rate of use fell by 73%, as though it had gone into remission. Often, when a word or fad becomes a pandemic, it burns itself out. There was a sign on a Mississippi country store: “Live Bait – Nitecrawlers – Cappuccino.” It could be that the overuse of like was deemed uncool by some shadowy teen language tribunal and labeled a bad habit, like smoking tobacco. But as with that addiction, many found it impossible to go cold turkey. You’ve probably heard of Nicorette, a gum used by smokers trying to ease withdrawal. Well, the discourse markers sort of, kind of, you know, I mean, and pretty much have been the linguistic Nicorette to millions of like addicts trying to kick the habit. Some former addicts have resorted to saying kinda-sorta. They are sincere in their belief that this constitutes an evolutionary step forward.

Actually, which often sounds a trifle pompous, has largely been replaced by so in the initial position and right in the final position, as demonstrated in the lesson. It can still be used, but sparingly. Once per minute ought to do it, actually; twice, at most.

In place of dude, try bro, or brah, or bruh, or perhaps you could consider using nothing at all.

In summary, “Actually, I like know what I’m talking about, dude,” compares unfavorably to, “So, that’s pretty much, you know, how it sort of is, brah — I mean, right?” While both sets of words still appear in the lexicon of Post-Modern English, the latter reflects the more gracile stage of linguistic evolution that has been achieved, and is, therefore, preferred. It sounds more woke, too, doesn’t it, or is that just me?

They are sincere in their belief that this constitutes an evolutionary step forward.

2. The first two syllables in the word “election” should be mid-range in pitch, and clearly and crisply enunciated, while the final syllable should be lower pitched and slightly drawn out: “shuuun.” (In other applications, the terminal syllable must be uptalked. This will be covered in Lesson Two.) The increase in duration for the final “shun” is mandatory for all words ending in “-tion.” God knows why. But try it again, with a little sizzle: “elek- shuuun.” Nice.

3. “Clinton” should be pronounced “Cli/en” with a glottal stop entirely replacing the medial “nt” consonant blend. Glottal stops are a thing right now. “Mountain” is “mow/en,” and “important” is “impor/ent,” not to be confused with the mid-Atlantic pronunciation “impordent.” (Note that in the go-to example for glottal stops in American English, “mitten” becoming “mi/en,” it is only the “t” sound that is replaced, as it is in “impor/ent.” Replacing the “nt” seems to be the more recent, bolder approach, and is thus more worthy of imitation.) Practice these glottal stops in front of a mirror. To avoid embarrassment, it’s better to practice when you’re alone than to try them out in public before they’ve been thoroughly polished.

4. The word “sure” should not be pronounced like “shirt” without the “t” but rather as “shore,” rhyming with “snore,” with the long “o” and a strongly vocalized “r.” This pronunciation probably hails from Brooklyn, where it had been successfully detained for decades. Similarly, don’t say “toorist,” say, “toarist.” (By George, you’ve got it.) Again, practice. This is hot stuff. Cutting edge. Hundo P.

To avoid embarrassment, it’s better to practice when you’re alone than to try things out in public before they’ve been thoroughly polished.

5. In the word “capture,” the first syllable, “cap,” should be mid-range in pitch and clipped at the end, with a fraction of a second pause before dropping down to the second syllable, “chur,” which must be at a low pitch and slightly drawn out, so that it sounds like the endearing growl of a small dog.

This rule, first promulgated by anonymous Valley Girls back in the eighties, applies to all multi-syllabic words that end in “-ture” and most words of more than one syllable that end in “r.” The amount of fry used in this application has varied over time, and the appropriate level has been the subject of a lively but inconclusive debate. I take the view that it is a matter of personal taste. Experiment with the sizzle; go ahead. Practice with this list: rapture, juncture, fracture, puncture, rupture. Remember: Start high, go low, go long. Grrrr.

6. In “the rust belt,” “the” should be at mid-register pitch, while both “rust” and “belt” should be about five full notes higher. Yes, this is the famous sound of uptalk. The higher pitch of “rust” and “belt” suggests that a question is being asked. The goal is to create the impression that you are checking to see if the listener knows, as you are pretending to know, exactly what the rust belt is. What is desired is the illusion of a simultaneous, unspoken, empathetic interaction of mutual insecurity, something like, “Are you still with me? Am I doing OK?”, evoking at most an instant, tiny nod from the listener and a silent “Yes, I’m still with you, and you’re doing just fine, I think.” Try not to sound too needy. Aim for a subtle patina of clingy insecurity. It’s more credible. No need to ham it up.

Again, it is the legendary Valley Girls who are credited with this classic innovation. Australia recently filed a suit with the International Court of Justice disputing this claim. As if!

Aim for a subtle patina of clingy insecurity. It’s more credible.

Uptalk, like vocal fry, is used by women more than men, and is frowned upon by some, especially when it is “overused” and “exaggerated.” What crap. When it’s used once or twice per sentence, and the high-pitched words don’t pierce the falsetto barrier too often, uptalk reliably contributes to an authentic-sounding PME fluency. While I’ll grant that it may be something of an acquired taste, with practice and patience you’ll come to find its chirping high notes as precious as I do. Uptalk is cool and is likely to remain so. (I suspect that some men avoid uptalk because it makes their mansplaining hilarious.)

7. Then, after “rust belt,” comes a pause, as though the speaker were waiting for some confirmation of comprehension. This is a faux pause. The pause should not be so long that it gives the listener sufficient time to formulate and initiate an inquiry — in this instance, into the actual membership roster of states or cities in the rust belt. The duration of the pause will vary according to the speaker’s assessment of the listener’s level of expertise. Here, the assessment would involve the fields of (a) voter behavior in 2016 and (b) the deindustrialization of the non-Canadian area around the Great Lakes during the past half-century. To use the faux pause correctly, then, refer to this rule of thumb: Low expertise? Short pause. High expertise? Shorter pause. As always, the primary concern should be style, not substance.

8. The words “but she” should be two full steps lower than “belt” (from the fifth to the third), but “didn’t” should be right back at the same pitch as “belt.” That’s right, another dose of uptalk.

To master the technique, the novice should start by uptalking at least 50 times a day. When I was starting out, I kept a pencil stub and a little note pad in my shirt pocket to tally up my uses of uptalk during the course of the day with neatly crosshatched bundles of five. You might want to give it a try, as it keeps your shoulder to the wheel. I am proud to say that I uptalk effortlessly all the time now, and the surprise and sheer delight on the faces of young people when they hear an older gentleman “talking up” makes all the hours of practice worthwhile. I feel like I’m really making a difference.

While I’ll grant that it may be something of an acquired taste, with practice and patience you’ll come to find its chirping high notes as precious as I do.

A word of caution. When uptalk is employed at a very high frequency, volume, and pitch, and the whole sampler of fillers is tossed in, a critical mass can be achieved that has been known to set off a chain reaction. First your dog, then the neighbors’, then their neighbors’ — before you know it, the whole neighborhood is filled with the sound of a howling canine chorus. Once, when I overdid it, the damned coyotes even joined in. So mix fillers into your uptalk carefully. I’m just saying.

9. The word “didn’t” should be pronounced as a crisp, two-syllable “dident.” The short “e” sound should be clearly heard as in “Polident.” (Think “prissy.”) This same rule applies to “doesn’t,” which becomes “duhzent,” emphasis again on the short “e.” While “couldn’t” and “shouldn’t” also sometimes become “couldent” and “shouldent,” as one might expect, just as frequently they come out as “coont” and “shoont,” utilizing the short “oo” of “schnook.” (Thinking back, the guys I heard using this pronunciation may have been lit.) Either of these modern variants is acceptable, but eschew the fuddy-duddy standard pronunciations of the original contractions, “could/nt” and “should/nt,” which, oddly, feature glottal stops. (Yesterday, I heard “coo/ent.” Very chill.) Oh, and don’t say “did/nt.” (With all due respect, you’d sound like a cave man.)

10. The final word, “right,” should be pronounced in a way that places it at an equal distance from (a) assuring the listener that what you just said was not only correct, but cool, and (b) seeking assurance from the listener that what you just said was not only correct, but cool. In order to achieve this effect, the coloration of “right” must be subtly blended so as to become a creature of the twilight world between the declarative and the interrogative: not falling, not rising, not whining, and never, ever abrupt. With the proper inflection, “right” will hit this sweet spot, where the listener will wonder, “Wait. What? Is he asking me or telling me?”

Practice these ten exercises. Practice hard, then get out there and commence pussyfooting.

Part Four: Problems and Solutions

As you gain fluency in Post-Modern English, what you seem to lose in self-confidence, you will more than make up for with an increased capacity to appear empathetic. Your use of PME will lower the walls and build new bridges between you and the people around you. Your sacrifice of the ability to assert your will and pass judgment on others will help create a more open, tolerant, and nonjudgmental human community. You will contribute to a world in which nobody will feel the need to say “Don’t judge me,” or “Don’t be a dick,” because there will be no one judging them and no one will be acting like a dick. That’s right: no more judges and no more dicks. It will be a world of greater respect, warmth, and, yes, love.

The bad news is that you’ll have to keep an eye out for three problems that may rear their ugly little heads.

What you seem to lose in self-confidence, you will more than make up for with an increased capacity to appear empathetic.

First, there is backsliding. Although you now sound hip, as you approach your dotage you may find among your life’s baggage a few truths that you feel should be self-evident to everyone. You may even feel a need to share these truths with the people who, sad to say, have not had the pleasure of reading the self-published revisions to your personal Boy Scout Handbook. (You may also feel a constant pain in your lower back. These symptoms often occur together.) Pretending to be wimpy may have grown so taxing that, as therapy, you decide to briefly drop the Post-Modern English charade and revert to your former pre-PME self. But how do you safely remount your high horse?

To avoid unjust accusations of hypocrisy, it is best to choose the venue and target of these code-switching episodes carefully. I’ve heard that a marvelous place to engage in them is on urban motorways. I am told that it is easy to find drivers who are unaware of your exhaustive personal list of driving dos and don’ts. What next?

You may find yourself behind the wheel of a large automobile. Some knucklehead in a little Kia cuts in front of you without even signaling, missing you by mere yards. Gunning it, you pull up next to him. You lower your window. He lowers his. Then you let him have it with both barrels — figuratively, of course. You tell him, in stark Anglo-Saxon terms, in as loud and clear a voice as you can muster, the obscene fate that awaits him and his mother before their imminent and humiliating deaths. After that, spleen thoroughly vented, you brake and swerve onto the exit ramp, switch back to PME, and reassume your Oprah-like pose of nonjudgmental equanimity.

Here are a few tips. Before you switch codes, make absolutely sure that the knucklehead in your crosshairs doesn’t know who you are. Anonymity is crucial. And avoid the rush hour, when traffic sometimes grinds to a halt. Offended knuckleheads have been known to leap from their cars, screaming obscenities and brandishing revolvers. They are, after all, knuckleheads. (Good thing it’s illegal to use a wireless telephone while driving. No one will be able to post your outburst on the internet.)

The best way to keep from backsliding is, obviously, to get a grip on yourself.

Before you switch codes, make absolutely sure that the knucklehead in your crosshairs doesn’t know who you are. Anonymity is crucial.

Second, should you choose to “just say no” to the temptation to backslide, beware of unsuccessful repression. If, in order to achieve PME fluency, you have to repress the wish to lord it up over everybody, and the repression fails to keep that wish contained, you may catch it sneaking out of that darkened back room of your consciousness, where you’ve been keeping it out of public view, and exposing itself in what is popularly known as a “Freudian slip.”

Attending a lovely garden party, you might intend to say, “Oh, you’re so kind. Thank you so much,” only to find yourself saying, “Why don’t you just go fuck yourself.” Remember, you could have said this to the knucklehead who cut you off, but you didn’t want to be seen as a hypocrite.

What then? The best way to avoid Freudian slips is to keep a civil tongue in your head. If you think that you might need professional help to accomplish this, weekly sessions with a competent therapist for a year or two should do the trick. And don't be put off if the hourly fee is hundreds of dollars. Medicare covers it.

Third, and finally: As bad as that slip would be, there is the possibility of something even more embarrassing. Freud himself believed that a sufficiently strong unfulfilled wish, if locked away in some dark dungeon of the subconscious, could create intolerable internal feelings that were then projected onto an external object in the form of a paranoid delusion of the kind that motivates such modern political extremists as white supremacists and their mirror-twins, the antifas. You may find yourself on the campus of a large university, waving simplistic placards, shouting incoherent platitudes, and trading ineffectual blows with someone very much like yourself, a person who speaks Post-Modern English fluently but finds it difficult to express his opinions nonviolently. Why, he may even lack the most basic linguistic tools that are needed to engage in civil discourse.

You might intend to say, “Oh, you’re so kind. Thank you so much,” only to find yourself saying, “Why don’t you just go fuck yourself.”

The solution? Just pull yourself together, man. Snap out of it, for the love of God.

Given your age, maturity, and ability in archaic English, spotting these pitfalls early on and avoiding them should not be difficult. If, however, you find that you’re experiencing uncontrollable urges to play the pontiff, convert the heathen, or some such, and you feel the need for relief, there is a category of medications called anti-androgens that lower the testosterone levels often associated with judgmentalism. Most of the side effects are limited to the secondary sexual characteristics and are not life threatening. If this sounds right for you, you should consult your health care provider.

Should the medication prove ineffective and your symptoms persist, there is a growing body of evidence indicating that immediate and lasting relief can be achieved through gender reassignment surgery, provided that you are a male. While this has become a relatively safe and routine procedure, boasting a single-digit mortality rate, a small percentage of otherwise qualified candidates hesitate to “go under the knife.” But if you count yourself among these reluctant few, take heart. There is one more glimmer of hope: the experimental treatment protocol called “strategic relocation.” While there is insufficient data to conclusively prove the treatment’s therapeutic efficacy, the available anecdotal evidence suggests that, at the very least, more research is warranted.

Ferris T. Pranz, a postdoctoral fellow in the Department of Applied Metaphysics of Eastern Montana State University at Purdie, has been observing a band of people living with judgmentalism. These people were individually tagged and released over the past decade by the Montana Department of Behavior Management (MDBM) outside Fertin, a farming town near Lake Gombay, just south of the Missouri River. In his unpublished 2017 field notebook, Pranz records his painstaking efforts to gain the trust of this strategically relocated band at their watering hole, a smoke-filled bar called “Grumpy’s.”

There is one more glimmer of hope: the experimental treatment protocol called “strategic relocation.”

Pranz’s observations have raised some eyebrows in the judgmentalism community in Montana. Despite the Fertin band’s characteristically opinionated and aggressive communicational style and constant abuse of both alcohol and tobacco, they seem to share a gruff good humor while playing at pool, darts, and cards. Interestingly, they often refer to themselves as “blowhards,” apparently without shame or irony, and then laugh loudly. When Pranz would ask the group to explain the laughter, they would invariably laugh again, more loudly. Pranz has recommended that further research be conducted to discern the motives behind this laughter, possibly utilizing a double-blind design.

More broadly, Pranz and his colleagues at EMSUP have proposed a major longitudinal study to explore the incongruity of the seemingly upbeat ambience in “Grumpy’s” by designing instruments to quantify (1) the specific characteristics of these Fertin people and the effect that such characteristics may have on their communicational dynamics; (2) the effects of the complete absence of treatment by means of any of the experimentally proven therapies for people living with late-stage degenerative judgmentalism. These effects can then be compared with therapeutic outcomes in matched groups receiving such treatments. Pranz has also recommended that the proposed longitudinal study be completed prior to authorization of an expanded “strategic relocation” program to include areas beyond Fertin. In October of 2017, the Board of Directors of the Friends of Judgmentalism in Bozeman passed a resolution in support of Pranz’s proposal. Pranz plans to apply for a grant through the MDBM in June of 2018.

Part Five: Looking Backward

American English is the language of our past, already dated and quickly becoming archaic. As will be shown, the impression that it makes when spoken is not good. More importantly, it conveys an aggressive smugness that is out of step with today’s world. Even the founding documents of the United States, written in American English, sound absolutist, judgmental, and harsh.

By now, you must have asked yourself: “If French is the language of love, and German is the language of Nietzsche, and Post-Modern English is the language of wimps, then what the heck is American English?” Well?

American English is the language of our past, already dated and quickly becoming archaic. It conveys an aggressive smugness that is out of step with today’s world.

As a native speaker of American English, I am not qualified to answer. To find a place to sit outside one’s own language and culture, and then to sit there and listen to that language being spoken in order to gather an impression of the speaker, using only its sound and not its meaning, is like trying to street-park a Class A RV on the Upper East Side: it may be possible, but I’ve never seen it done. No, this question should be answered by people who don’t speak the language.

American English began in 1607, when the first British colonist stepped on the shore of the James River. How do you suppose American English sounds to British ears today? I’m told there are three main impressions. First, it is spoken more slowly than British English, giving the impression that the speaker is also a little slow. Second, it is spoken more loudly than British English, and with more emotion. As both of these characteristics are associated with children, the impression is that the speaker is somewhat immature. Third, American English is rhotic, meaning that “r” is pronounced both at the end of a word and before another consonant. As this pronunciation is normally associated with Scotland, Ireland, and remote rural areas, the impression is that the speaker is a bit rustic.

Taken together, then, to British ears American English is the language of dim-witted, childish yokels. One might call it the language of knuckleheads. That is not to say that Americans are knuckleheads. It simply means that our language makes us seem that way.

Post-Modern English, while less given to the glacial John Wayne drawl or the grating Jerry Lewis bray of American English, retains the rhotic accent, even doubling down on it with the vocal fry. Still, in two of the three categories, it constitutes an evolutionary step beyond its parent language. Even British children have begun to pick up Post-Modern English from Netflix, much to the delight and amusement of their parents.

To British ears American English is the language of dim-witted, childish yokels. One might call it the language of knuckleheads.

I was once told by a friend who spoke only the Arabic of the Nejd that French sounded like someone saying, “Loo, loo, loo, loo, loo,” and English sounded like someone saying, “Raw, raw, raw, raw, raw.” That was just one Bedouin’s opinion, of course. It seemed funnier in Arabic, somehow. “Loo, loo, loo.” We had a good laugh.

In 1776, less than 200 years after that first colonist was beached, Thomas Jefferson wrote the Declaration of Independence. What a marvelous symbolic moment in the evolution of English! He had to write it in American English, of course, because the Post-Modern branch wouldn’t emerge for two centuries. While this does not excuse him, it reduces his level of culpability. Listen:

We hold these truths to be self-evident, that all men are created equal.

Can you hear his certainty? Why, the phrase simply drips with self-confidence. To assert that a truth is self-evident is an act of rhetorical bravado that positively swaggers. (“Because I said so.”) Note the absence of fillers to dull the sharp edges. He seems to have missed the lesson that explains how “you have your truths and I have mine.” He seems to be saying that “all truths are not created equal,” pardon my French. And what is this nonsense about “men”?

So Jefferson was sure of himself, and assertive. But was he judgmental? Ask yourself: What is this Declaration of Independence, at its core? Is it a celebratory birth announcement, like a Hallmark card? (“Please welcome…”)

It seemed funnier in Arabic, somehow. “Loo, loo, loo.” We had a good laugh.

Far from it. This is Thomas Jefferson leveling a public and harsh judgment against one King George III. It spells out George’s crimes, like a rap sheet or an indictment. It is clear: Tom is judging George. Tommy is calling Georgie a dick. Listen:

A prince, whose character is thus marked by every act which may define a tyrant, is unfit to be the ruler of a free people.

This white, male, rich, privileged, powerful, slaveholding “founder” of America is writing in the scathingly self-righteous tones of archaic American English. The sound of Jefferson’s voice is clear. He is cocksure and in-your-face. He is your judge, jury, and executioner. The music of his American English is a march being played by a big brass band oompahing down Main Street on the Fourth of July, snare drums rattling like assault rifles. Courage is needed to follow the facts, no matter where they lead. It pains me to have to say it, but Thomas Jefferson was a dick.

Your final assignment is to translate the first of the two fragments above (the one with the “self-evident truths”) from American English into Post-Modern English. You have five minutes. Begin.

OK, time’s up. Answers will vary, of course, but it might be useful to compare your translation with the following:

So, some of us were sorta thinking? that a coupla of these like, ideas? or whatever? we had were, oh, I don’t know, kind of, you know, well, not so bad? I guess, right? And, uh, oh yeah, that all men, I mean, like women, too, kind of like, everybody? I mean, are pretty much, I’m gonna say, created? you know, like, equal? right. or whatever, so...

It sounds vaguely Canadian, eh?

Yes, it is time to put American English out to pasture. Post-Modern English is not just cooler; it is more in keeping with the zeitgeist. It is the language best suited to the more equitable, inclusive, and nonjudgmental world that we are building together.

It pains me to have to say it, but Thomas Jefferson was a dick.

It is time to hang up that coonskin cap.

* * *

All living languages are continuously evolving — as are we, the species that speaks those languages. Do these two forms of evolution influence each other? Of course they do. Through millennia, the evolutionary pas de deux of our species on this earth has been and continues to be shaped by, and to shape, the words and music of our languages. To the extent that there is intent in this choreography, it is ours. We are making ourselves. The changes we make to our languages have consequences beyond the merely aesthetic. They affect the choices we make that determine our destiny. We should, therefore, make changes to our languages with the same caution we exercise in rearranging the molecules of our genome. Are we good?

“. . . Fearing not that I’d become my enemy
in the instant that I preach.”
                          — Bob Dylan, My Back Pages (1964)






Cuba, Race, Revolution, and Revisionism

 | 

When Cuba’s serial and multiple African military interventions began in 1963 with Guinea-Bissau’s war of independence from Portugal, Fidel Castro selected black Cuban soldiers and conscripts to man his liberation regiments. Dead black bodies in Africa were less likely to be identified as Cuban, according to Norberto Fuentes, Castro’s resident writer and — at the time — official biographer, confidant, and a participant in the later Angolan wars.

Cuba’s African — and Latin American — adventures were made possible by agreements reached among the USSR, Cuba, and the United States to end the Cuban Missile Crisis of 1962. One of those protocols was a promise from the US that it would respect Cuban sovereignty and refrain from invading the island. To Castro, this was a green light to build Cuba’s armed forces for the liberation of the world’s downtrodden instead of having to concentrate his resources for the defense of the island.

Ochoa was the only subordinate who could speak uninhibitedly with, and even kid or tease, the humorless, haughty, and overbearing Fidel Castro.

However, when it came to deploying his black brigades, Castro found himself short of black commanders. Enter Arnaldo (“Negro”) T. Ochoa Sánchez.

Ochoa had been part of Castro's 26th of July Movement ever since its creation, and by March 1957 he had joined Castro's guerrilla army in the Sierra Maestra, fighting against the Batista dictatorship. It was then that Ochoa and Raúl Castro forged a close friendship, one that also led to a certain intimacy with Raúl’s brother, Fidel. According to Fuentes, in his book Dulces Guerreros Cubanos, Ochoa was the only subordinate he knew who could speak uninhibitedly with, and even kid or tease, Fidel Castro — a humorless, haughty, and overbearing caudillo.

Ochoa, of humble Oriente peasant origins, had distinguished himself in the Revolution and during the Bay of Pigs fiasco, subsequently attending the Matanzas War College and Frunze Military Academy in the Soviet Union and rising to the Cuban Communist Party’s Central Committee. But he really distinguished himself in the Ethiopia-Somalia conflict. Cuba aided Ethiopia in this USSR vs. China proxy war, since both boasted Marxist regimes. Ochoa brilliantly defeated the Somalis in the tank battle of the Ogaden. For that he was dubbed “the Cuban Rommel.”

The problem was that Ochoa wasn’t really “black,” a racial classification that could apply to almost anyone in Cuba, especially if one uses the rule of thumb once common in the United States: that anyone with any black ancestry, no matter how distant or dilute, is black. (This author’s DNA test reveals a 1–3% West African ancestry, a detail not noticeable in his phenotype.) Ochoa was very swarthy, in a Mediterranean sort of way; yet his phenotype failed to show any classic “Negroid” features. It was Raúl Castro who nicknamed him Negro (black) by bestowing on him a promotion to “Black” General. The Armed Forces Minister wanted a black commander for the black troops he sent to Africa because he lacked a qualified, real black general who would realize both his political and his military objectives.

Ochoa brilliantly defeated the Somalis in the tank battle of the Ogaden. For that he was dubbed “the Cuban Rommel.”

Now, Cuba’s armed forces actually did include black commanders, among them General Víctor Schueg Colás (see below) and Juan Almeida Bosque. Almeida was a veteran of the assault on the Moncada Army barracks that launched the 26th of July Movement. Along with the Castros, Almeida was caught, imprisoned, amnestied, and exiled to Mexico after that defeat. He was on the Granma yacht as it landed survivors in Cuba, and he fought against Batista in the Sierra Maestra mountains. Later he was promoted to head of the Santiago Column of the Revolutionary Army. Wikipedia, without any sense of irony, says that “he served as a symbol for Afro-Cubans of the rebellion's break with Cuba's discriminatory past.” In his book Como Llegó la Noche, Huber Matos, third in command of the Revolutionary armies after Fidel and Raúl — though later to be purged — describes Almeida as unsuited for military command, a “yes” man. He says that Fidel kept him purely for his loyalty and as a symbol of the Revolution’s inclusiveness of Afro-Cubans. Almeida was the only black commander during the Revolution. He was Fidel Castro’s token black.

Ochoa took the nickname Negro in stride and probably even affectionately, fully understanding the political rationale behind the dubbing. In this author’s opinion, his attitude towards race (and by extension, Fuentes’ attitude) is pretty representative of one general streak of Cuban racial attitudes. Here is my translation of Norberto Fuentes’ description of Ochoa’s reaction to the moniker:

Ochoa, besides being mestizo, was very obstinate. When anyone alluded to Raúl’s reason for the nickname — that the Minister didn’t have any competent, real black generals — Ochoa would begin to vigorously shake his head. And he would continue this stubbornness even when reminded of General Víctor Schueg Colás — el Negro Chué — as he was generally known: a black Cuban general.

Ochoa responded that “el Negro Chué was not a negro who was a general.”

“And what kind of BS is that, Arnaldo?” asked a member of the group.

“He is a general who is black, and that’s not the same thing as a black who is a general.”

For a second I [Fuentes] thought Ochoa was about to write a second volume to Alex Haley’s Roots. My mind reviewed the list of black Cuban generals.

“And what about Kindelán? And Silvano Colás? And Moracén? And Calixto García? And Francis?” I challenged him.

“None of those are either generals or black,” he declared.

“But then what the fuck are they, Arnaldo?”

“Fictions, my friend. Nothing more than nonsense,” he blithely answered.

If you, dear reader, can’t make sense of that, don’t worry. It’s Ochoa’s way of saying that race doesn’t matter, that race is irrelevant, that concerns about race are nonsense. One Cuban-American academic, quoted in Guarione Diaz’ The Cuban American Experience: Issues, Perceptions and Realities, averring that humor is an essential trait of the Cuban personality, describes the archetypal Cuban as “one who jokes about serious matters while taking jokes seriously.” In that vein, there is a deeper intent in Ochoa’s flippancy that Fuentes, in a stream of consciousness rant, then goes on to elaborate.

The Castros were recapitulating the trans-Atlantic slave trade in reverse: shackled by the ideological chains of a monomaniacal dictator and sent back to Africa.

His idea is that Ochoa, in his own irreverent way, was seeking redemption for the tragedy of Cuba’s “stoical, forced, brave, sweet and immense blacks” who had to carry — since 1965 — the full brunt of the Revolutionary Armed Forces’ guerrilla campaigns in Africa, because the Castros believed that dead black bodies in Africa couldn’t really be traced back to Cuba. They didn’t contemplate any POWs.

In Fuentes’ view, the Castros were recapitulating the trans-Atlantic slave trade in reverse: two centuries ago, in physical chains across the Atlantic to the Americas; in the late 20th century, shackled by the ideological chains of a monomaniacal dictator and sent back to Africa.

To Ochoa, race was a trivial issue; to the Castros it was an essential component of their revolutionary tool kit in their struggle for universal social justice. When, according to Diaz, Cubans began leaving the island in droves to escape the repressive regime, “the revolutionary government denied exit visas to Blacks more than to Whites to show the international community that Cuban Blacks supported the revolution and did not flee Cuba.”

Castro himself, coming down to Girón, interrogated the black prisoners — just before their sham execution — accusing them of treason both to their country and to their race.

The Castros’ revisionist racial attitude reared its ugly head again during the Bay of Pigs fiasco when the invading members of Brigade 2506 surrendered or were captured. Black prisoners were singled out for extra abuse. They were perceived as traitors since, in the Castro calculus, the Revolution had been fought — in part — for them. Haynes Johnson, in his book, The Bay of Pigs: The Leaders’ Story, adds that “of all prisoners, Negroes received the worst treatment.” They didn’t fit Castro’s Revolutionary narrative, and their presence on the invasion force infuriated him. He himself, coming down to Girón, interrogated them — just before their sham execution — accusing them of treason both to their country and to their race. Osmany Cienfuegos, a Minister in Castro’s government and brother of Revolutionary Commander Camilo Cienfuegos, second in popularity only to Fidel, lined them up against a wall and told them: “We’re going to shoot you now, niggers, then we’re going to make soap out of you.”

One notable exchange during the prisoners’ trial was with Tomás Cruz, a paratrooper of the 1st Battalion. “You, negro, what are you doing here?” Castro asked, reminding Cruz that the Revolution had been fought for people like him, and of the swimming restrictions at some tourist resort hotels before the Revolution (a pathetic concession to attract American tourists).

Cruz, with all the dignity he could muster, responded, “I don’t have any complex about my color or my race. I have always been among the white people, and I have always been as a brother to them. And I did not come here to go swimming.”

Black is White and White is Black

Broadly speaking, in Cuba, race — in this context meaning skin color — is a relatively unimportant issue, on par with other physical traits such as weight, height, pulchritude, hair color, and even disposition. Unlike in the US, where large proportions of black people distinguish themselves from the broader population with distinctive clothing, hair styles, music, linguistic flourishes, political attitudes, and other traits, all kinds of Cubans share cultural values, patois, styles of dress, music, etc. Even religious affiliation, which in the United States often makes a visible difference between the races, tends toward a high degree of syncretism, with ancestral roots and beliefs to the fore instead of any racial overtones — a theme that the Castro regime has falsely exploited by preferential treatment of Santeria over other religions, treating it as compensation to a previously “oppressed” race (in Castro’s revisionist ideology). American hypersensitivity to race is unknown in Cuba.

In Cuba, slaves could marry, own personal property, testify in court, and run businesses.

But how did race virtually disappear as a contentious issue in Cuba, while persisting until modern times in the United States — especially considering that the former eliminated slavery 21 years after the latter?

In spite of the awful conditions of the sugarcane fields, slavery under Spanish colonial rule was nothing like what it had become in the United States by the eve of the Civil War. According to historian Jaime Suchlicki in Cuba: From Columbus to Castro and Beyond, “Spanish law, the Catholic religion, the economic condition of the island, and the Spanish attitude toward the blacks all contributed to aid the blacks’ integration into Cuban society.” After all, the Spanish had lived for centuries under the comparatively tolerant rule of Moors.

In the American south, negritude — to any degree, i.e., the notorious “one drop rule” enacted in several states — equated skin color with a deprivation of rights. In Cuba, slaves could marry, own personal property, testify in court, and run businesses. One 18th-century observer noted that many had become skilled craftsmen, “not only in the lowest [trades] such as shoemakers, tailors, masons and carpenters, but also in those which require more ability and genius, such as silversmith’s craft, sculpture, painting and carving.”

Joining the US became a nonstarter during the US Civil War when Cubans realized how badly Negroes were treated in the South.

Additionally, Spain’s liberal manumission policy “resulted in almost 40% of African-Cubans being free in 1792,” reports Andro Linklater in his book on the evolution of private property, Owning the Earth. The diverging legal and social attitudes toward race in Cuba and in the US presaged future developments in each country. The paradoxical contrasts are striking. Whereas Reconstruction in the US institutionalized policies that had grown more nakedly racist since Independence — equating skin color with the presence or absence of rights and talents — the opposite was true in Cuba. Under the influence of the Catholic Church, the fundamental humanity of Africans was uncontroversially established early on; slavery and skin color were philosophically separated. In the time of Cuba’s Wars of Independence, Antonio Maceo, an Afro-Cuban, became second-in-command of the rebel armies.

At about the time of these wars, a notable segment of Cuban intellectuals favored the Texas model: declare independence from the colonial power and petition the US Congress for admission to the Union. The idea was so popular that the proposed Cuban flag was modeled on the Texas flag: a single star on the left, stripes on the right, and the whole rendered in red, white, and blue. However, joining the US became a nonstarter during the US Civil War when Cubans realized how badly Negroes were treated in the South. It wasn’t just the exploitation of slaves (which also happened in Cuba), but rather the contempt for dark skin color that denied a person’s humanity.

Cuba has always had an amorphous racial climate, one mostly misunderstood or puzzling to Americans. Racism, in the sense of hating or fearing a person for his skin color, is unknown. Skin color was never an impediment to respect. But skin tone snobbery (rarely surpassing trivial tut-tutting or even semi-serious priggishness) was not uncommon. Color gradations, like degrees of body mass index ranging from the skeletal to the morbidly obese, extended into categories of people Americans would consider “white,” with the too-pale also looked at askance, as if they were anemic and rickety.

Fulgencio Batista, while president, was denied membership in the Havana Yacht Club: he was considered too swarthy, although his son, Jorge Luis, was admitted. That he didn’t take the rejection personally and, as a dictator, did not take reprisals, is inconceivable to an American. Instead, the president donated a marina to the Havana Biltmore Yacht & Country Club, a venue just as swanky, if not more so, and, voilà! he and his family became members of that club.

Racism, in the sense of hating or fearing a person for his skin color, is unknown in Cuba. Skin color was never an impediment to respect.

This nonchalant — politically-correct Americans might say insensitive — attitude is related to Cubans’ tendency to nickname everyone, even strangers. A person with epicanthic folds will be called Chino, a very black man Negro, a fat person Gordo (my own nickname after immigration), a starkly white-skinned person Bolita de Nieve (Snowball), a skinny woman Flaca, a large-nosed man Ñato, a full-lipped person Bembo (hence, Negro Bembón for a full-lipped black man), a pug-nosed man Chato . . . You get the picture.

But the irreverence also gets manifested post-ironically, in the same vein as Ochoa’s nonchalant whimsy: a very black man might be nicknamed Blanco or Bolita de Nieve, a fat woman Flaca (skinny), and so on.

My favorite example of this is Luis Posada Carriles’ nickname. Posada Carriles, a Cuban exile militant, is considered a terrorist by the FBI. He is generally thought to be responsible for the bombing of Cubana flight 455 in 1976, which killed 73, including 24 members of Cuba’s National Fencing Team. In addition, Posada Carriles is said to have been involved in the planning of six bombings at Havana hotels and restaurants during 1997. His rap sheet is much too long to repeat here. Posada Carriles’ nickname? Bambi.

But I digress. Overtones of Americans’ racial (a term I hesitate to use, as you’ll see below) attitudes are making inroads into the Cuban-American experience. One white Cuban-American informant admitted to being fearful of and avoiding groups of black men after dark in the US, a behavior that had never crossed his mind back in Cuba. Would one call his reaction in the US “racism”? I wouldn’t. I’d call it adaptability based on experience, a phenomenon that black economist Thomas Sowell has explicitly addressed in his writings.

The Color of Culture

Americans, both black and white, are quick to cry racism in any untoward exchange between people of different hues when someone is being a boor or a snob or experiencing a misunderstanding or, more often than not, when mild ethnocentricity is at work. Ethnocentricity . . . a big word that simply means the tendency of most people to exercise a preference for congregating with like-minded, like-speaking, like-dressing and like-looking people — people they can easily “relate to.” Expressed hierarchically, people’s instinctive loyalty is first to their family, then to their clan (extended family), town, state, religion, in-group, political party, culture, nation, etc. One can see this in the popular slogans “buy local” and “buy American.”

Imagine you’re a small business owner looking for a sales rep. You interview two applicants, one black and one white. The white applicant is sloppily dressed, needs a shower, doesn’t speak clearly, and seems distracted. The black applicant, on the other hand, is fully engaged, is dressed smartly, and seems keen to join your operation. It’s a no-brainer — the black applicant has more in common with you; skin color is not a factor.

We all share a tendency to look at other cultures solipsistically: we see through the lens of our own values, evaluating people according to preconceptions originating in our own standards and customs.

Now imagine the opposite scenario: The black applicant displays plumber’s crack, reeks, and is unintelligible, while the white wears a coat and tie, speaks in your local accent, and displays overwhelming enthusiasm. Again, a no-brainer, and again skin color is not a factor; it is shared values that determine your choice.

Ethnocentrism does, however, have its extremes, the ones you’ll most often come across in a dictionary, without the nuances of an Anthropology 101 course. The first — and one that we all share to some degree — is a tendency to look at other people and cultures solipsistically: we see through the lens of our own culture and values, evaluating other cultures according to preconceptions originating in the standards and customs of our own milieu. More extreme is the belief in the inherent superiority of one's own ethnic group or culture — an attitude that, taken to an absurd limit, can breed intolerance, chauvinism, and violence.

The Origin of Races

What is race? One doesn’t need to understand race in order to be a racist or to accuse someone of racism. Contrary to popular opinion, skin color is not a determining factor of race. H. Bentley Glass and Ching Chun Li were able to calculate from blood group data that North American Negroes have about 31% white ancestry (cited in Stanley M. Garn, ed., Readings on Race [Charles C Thomas, 1968]). For practical or political reasons, biologists and physical anthropologists are divided as to the validity of the concept.
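To give a sense of how blood group data can yield such a figure, here is a minimal sketch of the standard single-locus admixture estimator (Glass and Li’s actual analysis was more elaborate, using several blood-group loci and a per-generation model; the symbols below are illustrative, not their data):

$$ m = \frac{q_{\text{admixed}} - q_{A}}{q_{E} - q_{A}} $$

where $q_{\text{admixed}}$, $q_{A}$, and $q_{E}$ are the frequencies of a marker allele — say, a blood group allele common in one parental population and rare in the other — in the admixed, African, and European populations respectively, and $m$ is the estimated fraction of ancestry contributed by the European side.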

First, the more practical biologists. In biology, race is equivalent to variety, breed, or subspecies; in a nutshell, it is incipient speciation. According to the Oxford English Dictionary, race is “a group of living things connected by common descent or origin” — as uncontroversial, and as far from the whole picture, a definition as one can dream up. But to understand race one first has to understand species.

Contrary to popular opinion, skin color is not a determining factor of race.

A species is a group of living organisms consisting of similar individuals capable of exchanging genes or interbreeding. The species is the principal natural taxonomic unit, just below genus — yet even this is by no means a simple or clear-cut concept. Think of horses, donkeys, mules, jennies, zebras and zorses (a horse-zebra hybrid); or dogs, wolves and coyotes. These animals can interbreed, with varying degrees of fertility, but do not normally interbreed in the wild. To account for this, the classic definition of species was amended with a qualifier: the group of organisms in question must not only be able to interbreed but must also do so regularly and not under extraordinary or artificial circumstances.

To further complicate things (or was it to simplify?), Ernst Mayr, one of the 20th century’s leading evolutionary biologists and taxonomists, formulated the theory of ring species (aka formenkreis) in 1942 to explain a natural anomaly in the distribution of closely related populations. According to Wikipedia, “a ring species is a connected series of neighboring populations, each of which can interbreed with closely sited related populations, but for which there exist at least two ‘end’ populations in the series, which are too distantly related to interbreed, though there is a potential gene flow between each ‘linked’ population.”

The term “ring species” is a holdover from some of the first ring species identified, but the populations need not form a ring. Examples include the circumpolar Larus herring gull complex, Ensatina salamanders, the house mouse, trumpet fish, Drosophila flies, deer mice, and many other birds, slugs, butterflies, and other organisms. Most natural populations are bedeviled by such complexities, including our closest relative, Pan troglodytes, among whom the East African subspecies schweinfurthii is separated by the Congo River and half a continent from the West African variant verus.

Gould believed that the concept of "race" had been used to persecute certain human groups to such an extent that it should be eliminated.

So that brings us back to race, or incipient speciation. Charles Darwin, in Origin of Species, identified the speciation process as occurring when a subpopulation of organisms gets separated from the larger group, fails to interbreed with it, and breeds strictly within itself. Because the smaller group carries only a sample of the larger population’s genes, and because chance decides which alleles persist in a small breeding pool, its genetic makeup gradually diverges from that of the parent population, which retains its greater genetic diversity. The eventual result may be that the smaller group becomes distinct enough to form a new species. This random divergence is what is labeled “genetic drift.”

Two other factors usually contribute to speciation: genetic mutation and adaptation to a new environment or way of life. Here “adaptation” does not carry the sense of individuals “getting accustomed to” a new situation; rather, individuals carrying genes that are detrimental in that situation die before they procreate, in time deleting those genes from the smaller group. This is what is meant by “natural selection.” The term “race” properly applies to the interval after a subgroup separates from the main population and before it becomes a new species.
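For readers who like to see the drift mechanism in action, here is a minimal sketch in code — a simple Wright-Fisher-style resampling of gene copies in a small, isolated breeding group. It is my own illustration, not anything drawn from the authors discussed here; the function name and parameters are hypothetical:

```python
import random

def drift_sketch(pop_size=100, p=0.5, generations=200, seed=1):
    """Track one allele's frequency as it wanders by chance alone
    in a small, isolated breeding group (Wright-Fisher resampling)."""
    random.seed(seed)
    history = [p]
    for _ in range(generations):
        # Each of the 2N gene copies in the next generation is drawn
        # at random from the current generation's allele pool.
        copies = sum(1 for _ in range(2 * pop_size) if random.random() < p)
        p = copies / (2 * pop_size)
        history.append(p)
        if p in (0.0, 1.0):  # allele lost or fixed: variation at this locus is gone
            break
    return history

trajectory = drift_sketch()
print(f"frequency after {len(trajectory) - 1} generations: {trajectory[-1]:.2f}")
```

Run it with different seeds and the frequency ends up somewhere different each time, sometimes lost altogether — which is the point: in a small group, chance alone reshuffles and erodes genetic variation.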

But Darwin understood the limitations:

Certainly no clear line of demarcation has as yet been drawn between species and sub-species — that is, the forms which in the opinion of some naturalists come very near to, but do not quite arrive at, the rank of species; or, again, between sub-species and well-marked varieties, or between lesser varieties and individual differences. These differences blend into each other in an insensible series; and a series impresses the mind with the idea of an actual passage.

Of course, a race may never become a new species; it may well, for any number of reasons, reintegrate back into the main population — which brings us back to human races and the more political anthropological concepts.

Some experts, the late Marxist paleontologist Stephen Jay Gould to the fore, believed that race, as applied to humans, was unhelpful, even invalid. He believed that the concept had been used to persecute certain human groups to such an extent that it should be eliminated. And forget “variety” (humans aren’t flowers) and “breed” (they aren’t dogs) and “subspecies” (the Nazis’ use of unter ruined that prefix).

On the other side stand the physical anthropologists (Stanley Garn, Paul T. Baker, Bentley Glass, Joseph S. Weiner, et al.), led by the late Carleton S. Coon, who pioneered the scientific study of human races under the Darwinian paradigm of adaptive and evolutionary processes.

Coon divided Homo sapiens into five races with origins in some distant past, distant enough that genetic and phenotypical differences appeared: the Caucasoid, Congoid, Capoid, Mongoloid and Australoid races. These had diverged not only because of genetic drift, but also as adaptations to their local conditions. The oldest races were the darkest: African Blacks, Australoids and Papuans, while whites, Asians, Pacific Islanders, and American Indians diverged later. Skin color varied according to sun exposure. For example, northern European climates favored fair skin to improve Vitamin D synthesis, while dark skin was a shield against Vitamin D overdose. However, in extremely hot and sunny climes such as the Sahel, too black a skin would tend to heat the body too much, favoring a more swarthy tone. Along the lands of the upper Nile, tall, lanky bodies helped radiate accumulated heat.

When sickle-cell anemia was discovered in white populations, it clinched the notion that racial adaptations were responses to local environments, independent of other adaptations such as skin color.

On the other hand, the Inuit were physically well adapted to extreme cold: compact bodies to conserve heat; little facial hair, so that condensed breath would not freeze on the face; lightly protruding noses, less exposed to freezing; epicanthic eye folds, reducing the eyes’ exposure to the elements; and yellow or yellow-brown skin. The yellow skin likely evolved as an adaptation to the cold of northern Asia: the color results from a thick layer of subcutaneous fat visible through translucent outer layers of skin.

A more recent adaptation was lactose tolerance, which apparently evolved in whites, permitting adult consumption of milk following the domestication of cattle about 6,000 B.C. But one of the most curious adaptations was sickle-cell anemia, a debilitating genetic disease that nonetheless gave carriers of a single allele partial immunity to malaria. First discovered in black African populations, it was initially considered a Negroid feature. However, when it was discovered in white circum-Mediterranean populations, it clinched the notion that racial adaptations were responses to local environments and independent of other adaptations such as skin color — a curious vestigial association from more unenlightened times.

Coon’s classifications — mostly unbeknownst to him, because the later fine points post-dated him — were already a mélange built on a vast diversity of prehistoric Homo: neanderthalensis, sapiens, the Denisovans, floresiensis, erectus, habilis, etc. Some scholars define these as separate species, others as separate races. I would argue that it is impossible to define an extinct species within a genus from bone remains alone. (Conversely, albeit ironically, modern skeletal remains often yield their race.) DNA researcher Svante Pääbo, one of the founders of paleogenetics and a Neanderthal gene expert, has described the ongoing “taxonomic wars” over whether Neanderthals were a separate species or subspecies as the type of debate that cannot be resolved, “since there is no definition of species perfectly describing the case.”

Human evolution, ignoring all the tedious debates, continues to surprise us.

Luckily, some Neanderthal DNA has been sequenced, and it was discovered that Sapiens includes some of those brutes’ genetic material — about 2% — in northern European populations. Studies suggest there may have been three episodes of interbreeding in our history. The first would have occurred soon after modern humans left Africa. The second would have occurred after the ancestral Melanesians had branched off — these people seem thereafter to have bred with Denisovans, whose genetic material still makes up a few percent of modern Melanesian genomes. The third would have involved Neanderthals and the ancestors of East Asians only, who carry a somewhat larger share of Neanderthal material than Europeans do.

One difficulty with Coon was his overly distinct racial categories. To some degree he realized this, recognizing many subraces, racial mixtures, and incipient formenkreis (before the phenomenon had a name). The problem was that these incipient races kept interbreeding at their verges (and even farther afield; consider Vikings, Mongols, and Polynesians), with racial mixture accelerating after 1500, when globalization set human populations interbreeding willy-nilly.

And that, dear reader, is why Gould and others eschew human racial classifications.

Meanwhile, human evolution, ignoring all the tedious debates, continues to surprise us. The April 21 issue of The Economist reports the discovery of a new human racial variant in the Malay Archipelago. The Bajau people spend almost all of their lives at sea. “They survive on a diet composed almost entirely of seafood. And . . . spend 60% of their working day underwater . . . They sometimes descend more than 70 meters (240 feet) and can stay submerged for up to five minutes . . . They have lived like this for at least 1,000 years.” The evidence suggests strongly that these astonishing abilities are genetic, the result of mutations and natural selection.

The Bajau spleen, an organ that acts as an emergency reserve of oxygenated red blood cells, is 50% larger than those of neighboring populations — “a difference unconnected with whether an individual was a prolific diver or one who spent most of his time working above the waves on a boat. This suggests that it is the Bajau lineage rather than the actual activity of diving, which is responsible for a larger spleen,” continues The Economist.

There is nothing in any of this to suggest that race should be used for political purposes by governments and demagogues — Hitler, Castro, and others.

DNA analysis tells a similar story: one set of Bajau genetic mutations directs blood flow preferentially to oxygen-starved vital organs; another slows the build-up of carbon dioxide in the bloodstream; and a third controls muscle contractions around the spleen.

What to make of all this? Human racial differences, both behavioral and phenotypic, exist and are worth studying: for medicine, forensic science, DNA studies and just for basic scientific knowledge. Genes are not destiny; they encode broad parameters for modification, in the uterine environment, through nurturing, and now through technology (for better or worse). There is nothing in any of this to suggest that race should be used for political purposes by governments and demagogues — Hitler, Castro, and others.

Will Americans in general ever achieve Arnaldo Ochoa’s insouciance about race? We can only hope. After a Civil War, the Emancipation Proclamation, Reconstruction, the Ku Klux Klan, Jim Crow, segregation, and Civil Rights, we’re now experiencing a heightened sensitivity in the finer details of race relations — probably a good indication of the tremendous progress that has been made in the fundamentals.






The Movie of the Multipliers

 | 

The multipliers. These are some of the most dangerous elements of political life.

Intelligence, knowledge, persuasiveness, experience in political affairs — all these good things may add much to a politician’s ability to succeed. The lack of such qualities may subtract from it. But you can be possessed of all of them and still be only half as likely to win public office as a person who lacks them completely, but has real money, or one-quarter as likely as a person whose father happened to be a noted politician, or one-tenth as likely as a person who happens to possess the right age, color, or creed. Wealth, unearned prestige, the accidents of demographics — these are multipliers, and there are many others.

The first President Bush, a man of normal abilities, achieved high political office by means of multipliers unrelated to political ideas or performance. He was rich, his father had been socially and politically important, and his contrast with Ronald Reagan endeared him to journalists who, for their own reasons, valued that contrast. The second President Bush, a man of no ability at all, was a nice guy, which added something to his political appeal. But the multiplier was the fact that his father had been president and had been surrounded by a gang of hacks who wanted to get back in power.

Wealth, unearned prestige, the accidents of demographics — these are multipliers, and there are many others.

Political multipliers can be mildly amusing, innocently useful, morally disgusting, or existentially disturbing. In the case of the Kennedy family, they are terrifying. One of the Kennedys — John — had intelligence, courage, and a personality that was attractive in many ways. On its own, this ensemble of good attributes would probably have gotten him nowhere important in the political life of his time. His success depended on multipliers — a large fortune; an ambitious, politically manipulative father, good at surrounding young John with media toadies; a family ethic that sanctioned and demanded constant, conscienceless lying; a support base of fanatical Irish Catholics prepared to vote for anyone who shared their ethnic and religious identification, no matter what that person did; and an unbroken phalanx of media writers and performers for whom “Jack” embodied fantasies of male potency and sophisticated “culture.” His assassination provided another mighty multiplier, so mighty that sane people should thank God every morning that his brother, Edward (“Teddy,” then “Ted”) Kennedy, the inheritor of John’s manufactured charisma, never realized his life’s purpose of attaining the presidency.

Few readers of this journal need to be reminded of the fact that Edward Kennedy had no good qualities whatever, political or otherwise. Yet he might have become president; and after he died, he continued to be celebrated by crazed or cynical followers who would have hounded any person without his multipliers out of politics, if not out of the country.

Finally, a mere five decades after the event, a serious film has been made about the great divider of Kennedy’s political prospects, the incident of July 18, 1969, in which a drunken Kennedy drove a car off a bridge on Chappaquiddick Island, Massachusetts, drowning the young woman, Mary Jo Kopechne, who was with him. Kennedy left her to die, trapped in the car. Then he tried, in various ridiculous ways, to conceal his involvement. This proving impossible, he admitted some vague form of responsibility, retreated to his Irish Catholic base, which, I repeat, would swallow any kind of explanation from a Kennedy, and, with the aid of friendly media and the accustomed throng of social and intellectual gofers, rebuilt his political career.

Political multipliers can be mildly amusing, innocently useful, morally disgusting, or existentially disturbing. In the case of the Kennedy family, they are terrifying.

Jason Clarke, who plays Kennedy in this film, and director John Curran, both apparently modern liberals, seem to think that Kennedy rebuilt not only a career but a self; they seem to believe that he became a genuinely great political figure. The idea is absurd, and the film does nothing to support it. It shows Kennedy deciding to recover from the incident at Chappaquiddick by founding his life on ever more aggressive lies — which is exactly what he did.

The film is, indeed, closer to fact than any historical movie I have ever seen. By the time it’s done, you have encountered all the relevant evidence, evidence that gains power by being introduced slowly, by frequent revisits to the scene of the crime. The scenes, both indoor and outdoor, are impeccably authentic and meaningful as further evidence. To select a small detail: the camera notices that when Kennedy is to make a particularly “authentic” television broadcast, he is seated at a serious-looking desk behind a case full of important-looking books, but the legs of the desk are propped by haphazard piles of the same kind of books — a good indication of the importance of knowledge in the life of Ted Kennedy.

As for acting — at the start of the movie, Clarke doesn’t look or talk much like the Kennedy we saw all too frequently, but as he develops the character’s psychology he actually convinces you that the two are exactly the same, right down to the shape of the face. The actors impersonating the other well-known figures do the same (a sign of great direction). One of them is Bruce Dern, playing Kennedy’s father. Dern is the most recognizable of actors, but I didn’t discover who he was until I read the credits. Kate Mara’s performance as Mary Jo Kopechne is not memorable, but hers was a difficult task, given that Kopechne was not allowed to achieve distinctness in real life. Clancy Brown does a magnificent Robert McNamara; Taylor Nichols presents an interesting view of the psychology of Ted Sorensen (perhaps the most respected of the Kennedy hacks), though without aspiring to the height of Sorensen’s towering arrogance; and Ed Helms does an excellent job in the difficult role of the one good guy, Kennedy sidekick Joe Gargan.

Ted Kennedy left Mary Jo Kopechne to die, trapped in the car. Then he tried, in various ridiculous ways, to conceal his involvement.

Real artists often exceed their conscious ideological programs simply by taking seriously their jobs as artists, so that in their hands a representation of human life takes on a life of its own, which is simultaneously our own real life, seen more deeply and rendered more self-explanatory. Artistic insight becomes analysis, and fact becomes a more suggestive truth. This is what Chappaquiddick does. Particularly revealing are the serious but irresistibly comic scenes in which all the hacks that money can buy are assembled to advise Teddy Kennedy about how to get out of the mess he has made. Here, viewed without overt explanation, analysis, or moralization, are a horde of important men, operating on the assumption that (A) the politician they serve is a destructive fool; (B) this politician must be elected president; and (C) his supporters must create all the lies and corruption necessary to make him so. The childishness is funny; the absolute lack of conscience is, in these true images of the powerful, terrifying. Add to that the movie’s evocation of the stolen prestige of John Kennedy’s presidency, and the Mafia-like adulation of “family” that has always characterized the Kennedys and their followers, and you have all the multipliers you need. The picture is complete.

I consider Chappaquiddick the third-best film about American politics, after Advise & Consent and The Manchurian Candidate. That’s quite an achievement.


Editor's Note: Review of "Chappaquiddick," directed by John Curran. Apex Entertainment, 2017, 101 minutes.





Evidence for Emerson

 | 

In the olden days — say, the 1960s — college professors were still carrying on debates about something called the Influence of Great Men on History. Basically, they denied that there was any.

Emerson had written, “An institution is the lengthened shadow of one man . . . and all history resolves itself very easily into the biography of a few stout and earnest persons.” A century later, not many earnest professors, spending their lives in the lengthened shadows of the American Historical Association, cared to believe that. Even Bismarck and Napoleon were the products of social circumstances, etc.

Trump’s very deficiencies offer good evidence for the historical influence of personality.

Arrayed on the other side were the writers of popular history. Whether they were sincerely attracted to the Emersonian idea, or they knew that social history doesn’t sell, they busied themselves about topics that assumed the crucial influence of a few important people. The multitudes of What Would Have Happened articles exemplify the trend: what would have happened if Hitler had ordered more air attacks at Dunkirk? What would have happened if Lincoln had not been shot? What would have happened if Lee had occupied better terrain at Gettysburg?

Well, what indeed? But the academics just got more and more “social.” Today, if you want to publish historical articles with Emersonian assumptions, you will not, I repeat not, get tenure.

Yet although they don’t seem to realize it, the professors are now faced with a dilemma. Almost all of them hate President Trump, and lots of them spend their idle hours — which appear to be many — campaigning against him, asserting that if he is permitted to prevail, America will become a nationalist, white supremacist, xenophobic state. But this assumes that the political shape of the nation has a good chance of being irrevocably changed by the election of a single powerful personality. And this is contrary to what they think they believe.

Trump — because he is Trump — diverted himself with midnight messages, confused assaults on Obamacare, and puerile entertainment of his core supporters.

I’ll leave people who are so wise about history to discuss that problem among themselves. I simply wish to note that Trump’s very deficiencies offer good evidence for the historical influence of personality. If Trump’s personality were not significant, wouldn’t the social movement that elected him have the professors and the other members of the ruling class on the run by now?

A person of normal discernment could have followed up his victory, which was a triumph over the entrenched leadership of both political parties, by getting at least three or four parts of his agenda immediately enacted. Every victory would have strengthened his position for the next big effort to fulfill his movement’s social demands. But no. Trump — because he is Trump — diverted himself with midnight messages, confused assaults on Obamacare, and puerile entertainment of his core supporters. So far, none of his opponents’ fears, real or purported, have turned out to have been justified. Is this not evidence for the crucial importance of the individual personality?

And as to his opponents . . . Their strategy has depended on the socialist ideal of mass resistance among the populace. And what has this strategy accomplished? Nothing in particular. What has stymied Trump hasn’t been their Marches for Science and Marches for Women and Marches Against the Border and Marches for the Sake of Marches. It has been the lengthened shadow of Trump himself.






Cruise Ship Books

 | 

I’ve discovered some of my favorite authors while perusing the books other passengers have left behind in shipboard libraries. While cruising in Alaska, I discovered Michael Frayn and couldn’t stop until I had devoured all of his novels and then looked around hungrily for more.

Spies is perhaps my favorite. A pungent aroma sparks a memory from a man’s childhood during World War II and compels him to return to his childhood village, where he tries to make grownup sense of things that happened there so many years ago, when he and his boyhood friend hid in the privet bush pretending to be spies. The dual perspective of middle age and childhood, as well as the contrast between the WWII setting and the “present” of 30 years later, makes the book particularly evocative, and the mystery of what actually happened drives the story.

Frayn’s books often present a tone of detachment and loss, as expressed in these opening lines from A Landing on the Sun:

On the desk in front of me lie two human hands. They are alive, but perfectly still. One of them is sitting, poised like a crab about to scuttle, the fingers steadying a fresh Government-issue folder. The other is holding a grey Government-issue ballpoint above the label on the cover, as motionless as a lizard, waiting to strike down into the space next to the word Subject.

These hands, and the crisp white shirtsleeves that lead away from them, are the only signs of me in the room.

The separation of the action performed by his hands from his intentional, sentient will is also reminiscent of the book-burning fireman, Montag, in Ray Bradbury’s Fahrenheit 451: “Montag . . . glanced to his hands to see what new thing they had done.” Both authors use synecdoche effectively to suggest the protagonist’s looming split with authority and his attempt to regain control over his life.

A pungent aroma sparks a memory from a man’s childhood during World War II and compels him to return to his childhood village.

It was on a cruise ship traveling around Australia and New Zealand that I discovered the historical novelist Tracy Chevalier. Her Girl With a Pearl Earring, in which she creates a compelling and poignant backstory for the supposed model of Vermeer’s famous painting, had recently been made into the film that launched Scarlett Johansson to stardom; but the book that hooked me was Falling Angels, a name that refers to the memorial stones in a local graveyard but also to the fallen characters within the story. Beginning at the end of Queen Victoria’s reign, the book’s multiple storylines focus on husbands and wives, friends and lovers, ruling class and servant class, and a gravedigger’s son.

From the moment I entered Chevalier’s world of shifting narrative perspectives set in turn-of-the-century England, I didn’t want to leave. I felt a profound sense of loss as I read the final page and reentered the 21st century. Similarly, while viewing Manhattan from across the river in one of the books I review here, a character observes, “You wanted to approach it for the rest of your life without ever quite arriving.” That’s how I felt while reading many of the books I’ve mentioned in this review — I wanted to approach the end, but never quite arrive there.

This past month I was cruising the western Mediterranean when I discovered Rules of Civility by the talented author Amor Towles, whose fresh metaphors and unexpected developments delighted and surprised me. I had barely finished reading it when, craving more of his elegantly crafted sentences and trusting his storytelling skills, I downloaded his second novel, A Gentleman in Moscow, to my Kindle for the long flight home from Europe.

I felt a profound sense of loss as I read the final page and reentered the 21st century.

The title Rules of Civility refers to George Washington’s Rules of Civility & Decent Behaviour in Company and Conversation, a dog-eared copy of which is discovered by the narrator, Katey Kontent, on the bedside table of the central character, Tinker Grey. Towles’ book is a novel of manners set in 1938 Manhattan and framed by a 1966 photography exhibition. As the book opens, a middle-aged Katey spies two candid photographs taken of her long-lost friend Tinker at the beginning and the end of 1938. This chance sighting becomes the catalyst for her recollection of that year, a year that became a turning point in her life as she navigated between boarding houses and mansions, trust-fund kids and dockworkers, the Upper West Side and the Lower East Side, in her journey to define who she would become.

Katey begins 1938 living in a women’s boardinghouse and working in a steno pool. She and her roommate, Eve Ross, meet the posh and elegant Tinker Grey at a restaurant on New Year’s Eve, and he becomes the direct and indirect catalyst for everything else that happens that year.

Like the original “novels of manners,” set in manor houses and often populated by governesses or impoverished heiresses who make satirical observations about the ruling class, Rules of Civility contains biting, cogent cultural commentary. Katey is paid well as a secretary, but when she decides to move on to a job in the literary world, she must accept a huge cut in pay. She wryly observes, “A secretary exchanges her labor for a living wage. But an assistant comes from a fine home, attends Smith College, and lands her positions when her mother happens to be seated beside the publisher in chief at a dinner party.” I worked under an executive director who landed her position in the same way, and it was just as galling. Katey also notes, after guests at a dinner party heap praises upon the hostess at the end of a fine meal, “This was a social nicety that seemed more prevalent the higher you climbed the social ladder and the less your hostess cooked.”

Sudden changes in tone or circumstance permeate the book and provide elegant twists that would create envy in the heart of a mystery writer.

Katey is a philosopher by nature, and her thoughts begin to resonate with the reader. As she looks back on 1938, she recalls, “To have even one year when you’re presented with choices that can alter your circumstances, your character, your course . . . shouldn’t come without a price. I have no doubt [my choices] were the right choices for me. And at the same time, I know that right choices by definition are the means by which life crystallizes loss.”

Sudden changes in tone or circumstance — in this case, from the bright optimism of making right decisions to the mournful grief of cutting oneself off from other options — permeate the book and provide elegant twists that would create envy in the heart of a mystery writer. Consider the span of emotion in moments like these: “I tore the letter into a thousand pieces and hurled them at the spot on the wall where a fireplace should have been. Then I carefully considered what I should wear.” And: “Something fell from my jawbone to the back of my hand. It was a teardrop of all things. So I slapped him.” And this cautionary reflection: “In moments of high emotion — whether they’re triggered by anger or envy, humiliation or resentment — if the next thing you’re going to say makes you feel better, then it’s probably the wrong thing to say.”

Katey wants everything to be neat and orderly and open. As a secretary, she “suture[s] split infinitives and hoist[s] dangling modifiers,” and she wants life to be as simple as that, with rules that allow no ambiguities and people who are who they say they are. But soon she realizes that “it’s a bit of a cliché to refer to someone as a chameleon, a person who can change his colors from environment to environment. In fact . . . there are tens of thousands of butterflies, men and women like Eve with two dramatically different colorings — one which serves to attract and the other which serves to camouflage — and which can be switched at the instant with a flit of the wings.” Katey herself is a chameleon, adapting to her different environments by adjusting her clothing until she decides which environment will become her natural habitat.

Katey is not well-bred, but she is well-read, and her running references to such books as Walden, Great Expectations, Washington’s Rules of Civility, Agatha Christie novels, and others add depth to the story. I especially like the way she combines insights from Thoreau and Christie to deliver this:

In the pages of Agatha Christie’s books men and women, whatever their ages, whatever their caste, are ultimately brought face-to-face with a destiny that suits them. . . . For the most part, in the course of our daily lives we abide the abundant evidence that no such universal justice exists. Like a cart horse, we plod along the cobblestones dragging our heads down and our blinders in place, waiting patiently for the next cube of sugar. But there are certain times when chance suddenly provides the justice that Agatha Christie promises.

We all have turning point moments in our lives — moments that occur during the years when we’re deciding who we will be and making decisions so profound that they change our course completely and irrevocably. They often seem insignificant at the time, and the people who influence us most profoundly move on from our lives. Although we may never see them again, we think of them frequently. I have one such friend from my childhood who moved on from my life when we were 11, yet much of who I am today comes from the experiences I shared with her during the three profound years we spent together, just as Michael Frayn’s narrator in Spies is forever influenced by the events he experienced with his childhood chum.

Katey herself is a chameleon, adapting to her different environments by adjusting her clothing until she decides which environment will become her natural habitat.

Towles suggests the importance of these so-called minor characters when Katey begins reading a Hemingway novel from the middle rather than the beginning: “Without the early chapters, all the incidents became sketches and all the dialogue innuendo. Bit characters stood on equal footing with the central subjects and positively bludgeoned them with disinterested common sense. The protagonists didn’t fight back. They seemed relieved to be freed from the tyranny of their tale. It made me want to read all of Hemingway’s books this way.”

And Katey learns to read life that way too — paying more heed to the side characters who influence us unexpectedly. Eventually she discovers that “when some incident sheds a favorable light on an old and absent friend, that’s about as good a gift as chance intends to offer.” Reading Rules of Civility gave me cause to reflect on many an old and absent friend, and that’s one of the many good gifts of this book.


Editor's Note: Reviews of "Spies," by Michael Frayn. Picador, 2002, 261 pages; "A Landing on the Sun," by Michael Frayn. Picador, 2003, 272 pages; and "Rules of Civility," by Amor Towles. Penguin Books, 2011, 338 pages.



