You Can Write Whatever You Want

Word Watch has entered its fifteenth year of operation — a good time to take up a subject that deserves to be addressed, especially on behalf of a libertarian audience. I do this from time to time, in various ways, but I haven’t done it for quite a while. So here goes.

There are two ways of discussing grammar and usage. One is descriptive: you try to describe how a language “really is,” right now. The other is prescriptive: you try to say what the language should be. Almost all academic linguists, and some academic grammarians, take the descriptive, supposedly scientific course. This column, however, is strongly prescriptive. It tries to describe what’s going on, but it also tries to say where the language has gone wrong (or, occasionally, right). But rules and prescriptions — even advice — can be hard to sell to libertarians.

“Why,” readers sometimes ask, “do I have to bother with rules? Why can’t I say and write whatever I want, so long as my audience understands me? And who has the right to make these rules, after all?”

Those are good questions, and they deserve good answers. Here they are.

It’s true; you don’t have to bother with rules. By right you are free to express yourself in whatever way you want. If you want to say, “I think everybody should, like, go ahead and, like, you know, vote for the, like, party of freedom?, I mean the, you know, Libertarian Party?” you can do so, and a normal audience will understand what you’re driving at, despite the pointless interjections and the irritating “uptalk.” But that doesn’t mean that no one should have the temerity to suggest that you sound like an idiot.

Every 20 years, there’s a revolution among people who teach composition in American colleges, and everybody has to swear allegiance to a new creed. According to a creed that was temporarily espoused about 40 years ago (and probably 40 years before that, too), teachers should not presume to “correct” their students’ work; they should simply encourage them to abandon their fears and write. If you did enough writing, you would turn out well. Or good. Or something. Old-fashioned teachers were satirized for penning such insolent marginalia as, “I know what you’re trying to say, but it’s not quite coming through.” How, it was demanded, do you presume to think that you know what somebody else is trying to say, yet not really saying? But of course that’s nonsense. If a young American writes, “The judge was so disinterested that she fell asleep,” you’ll know what he meant, and you’ll also know that the word he needs, whether he knows it or not, is “uninterested.” You’ll know it because you’re more competent than he is.

Well, who has the right to say that uninterested is the right word? People use uninterested and disinterested interchangeably, all the time.

Yes, that’s a description of what they do. They also say, “I seen you crossing the street,” all the time. There remains the question of what they ought to do. Anyone has the right to decide that question, and be correct or incorrect about it, just as anyone has the right to decide whether Raphael was a better artist than the four-year-old next door.

In respect to Raphael and the preschool kid, the best qualified judges will be people who have seen lots of art, and lots of kinds of art, people who are familiar with the various effects that art tries to achieve and who are perceptive enough to notice whether those effects actually are achieved by any given work. Such judges may well say that the kid’s paintings project an immediacy that Raphael never achieved (or wanted), but they will also say that when it comes to the art museum, or the narthex of the church, Raphael is better.

These people’s judgments will carry authority, but the issue is never who shall judge but how the judgment is made. If you have a competent understanding of the resources of the English language, you know that uninterested and disinterested are traditionally considered virtual opposites of each other, and you recognize that the distinction between them continues to be valuable. It allows people to say such things as, “The judge was admirably disinterested, but she was woefully uninterested in the case before her.” No one has authority over the English language — not even Webster’s Dictionary and Fowler’s Modern English Usage — but every informed and rational judgment is authoritative.

This is no tyranny. It is a vindication of the individual mind.

Having said all this, I don’t want to sacrifice too much to the idea that, yes, we can understand you, no matter how ugly we consider your self-expression. Sometimes — not infrequently — we cannot understand you, because your ugliness gets in the way.

On August 11, a good example came into my possession. I was on a ship off the eastern coast of Canada, and my internet connection didn’t work. Actually, I was too cheap to make it work. Anyway, I picked up a copy of the news digest that the New York Times provides for the maritime trade, and there I found an article about James K. Galbraith, an economist who gives zany advice to people in Greece. Apparently the advice is to initiate the millennium by inflating the currency enough to repudiate the nation’s debts. If I’m wrong about this, I’m sorry; I’m just trying to interpret the Times account of his notions:

Galbraith . . . argued passionately that a new currency would wash away the country’s debts, solve Greece’s competitiveness problem and ultimately create what he called a “good society.” A step opposed by a vast majority of Greeks, he had drawn up a contingency plan for Greece under [finance minister Yanis] Varoufakis’s direction, in case the country was forced to leave the [euro] currency zone by its creditors.

Yeah, yeah, yeah. But there’s a crucial grammatical error in there. It occurs in the second sentence. It’s a dangling modifier. “A step opposed by a vast majority of Greeks” is supposed to fit with or “modify” something else in the sentence, but what? The normal candidate would be the noun or pronoun immediately following the modifier. In this sentence, that candidate is “he.” Unfortunately, “he” is not a “step.” So the modifier is dangling, apparently unattached to anything else.

Now, the English, and especially the British, language is full of dangling modifiers. They are widespread, but they are wrong. When you, as a reader, pay the kind of attention to sentences that any author would appreciate your paying, you try to visualize the author’s meaning. But when a sentence includes a dangling modifier, the resulting image is misleading or ludicrous. Steps don’t draw up contingency plans.

And here the dangling modifier creates a problem that is worse than aesthetic. It’s a conceptual problem: what is the step that most Greeks oppose? It is not, alas, Mr. Galbraith (“he”); many Greeks probably like his ideas, but most are probably unaware of his personal existence. Well, is it the “contingency plan”? Probably not. Schemes to get rich by welshing on your debts are usually pretty popular. OK. How about “a good society”? No . . . few people are “opposed” to anything like that.

We’re still searching for the “step.” But notice that now we’re trying to figure out what the passage means by using whatever we knew about the subject before we read the passage. We’re going in reverse: authors are supposed to say something that adds to our knowledge, not something that depends on our pre-existing knowledge to understand.

All right, suppose that the “step” to which most Greeks are “opposed” is “a new currency.” Maybe. Probably. But how can we be sure? No matter how free you are with your lingo, “currency” is not a “step.” And we reached our identification of “step” with “currency” only through the process of eliminating every other possibility. It’s conceivable that there is no “step,” that the passage is literally meaningless.

Writing of this quality hardly inspires confidence in the New York Times. It’s a harsh saying, and sometimes wrong, but it’s basically true: if you don’t reflect on the way in which you say things, the things you say are unlikely to be taken seriously.

That’s true about words that appear in professional jargon or regional dialects or purely colloquial language as well as about words embedded in formal written English. If you’re trying to talk like a millennial and you don’t recognize the difference between dude, bro, brother, and mate, you’re not going to be treated as reliable on most of the subjects you want to address. If you’re trying to make some intellectual contribution and you show that you don’t care, or maybe don’t know, about the rules of grammar, your readers will wonder, perhaps justifiably, whether you have anything to contribute. And if you, as an intellectual, turn out writing that’s as stiff as a board, people will begin to ask themselves whether you have the understanding of human beings, their likes and dislikes and ways of interpreting the world, that is necessary to most intellectual disciplines.

I’m talking about what Aristotle called ethos — the perceived character of a speaker or writer. As Aristotle observes, if you don’t have a decent ethos you’ll have lots of trouble getting other people to listen to you and agree with you. But I’m also talking about what we moderns call empathy — human beings’ ability to imagine what others are like, what others are likely to feel, how others are likely to react to what we do or say. If you can’t summon enough empathy to look at your sentences and see whether your readers will receive them as clear communications or as verbal puzzles, you should stop writing until you’re in a better mood. If you insist on writing, “All taxpayers have been now directed to submit his/her forms to the nearest IRS/tax office,” you should have enough empathy to know that while most readers will understand your sentence, sort of, their attention will be fixed not on your meaning but on your annoying slash forms, your unidiomatic placement of words (“now”), and your odd switch from plural (“taxpayers”) to singular (“his/her”). You’re free not to realize that or to care about it, but don’t think you’ll emerge from that sentence with your ethos intact.

In July, the computers at Southwest Airlines failed, and hundreds of flights were canceled. Passengers were understandably unhappy, but the icing on their cake of fury — note: I have enough empathy to realize that you will realize that this is an awful metaphor; I ask you to have enough empathy to understand that the image is supposed to be amusing — the icing on their cake of fury, I say, was Southwest’s explanation of the affair, an explanation that the airline placed on an obscure website, which nobody ever goes to, an explanation headlined by these words:

Information Regarding Operational Impact of Technology Issues

Is there a man or woman on the face of the earth who would guess that this had anything to do with a canceled flight? Empathy? We don’t need no stinkin’ empathy. And is there a person on earth who, after finally getting the point, would retain any confidence in anything that Southwest might deign to say thereafter?

Proceeding to another sample of great corporate writing — an advertisement for the Viking line of cruise ships, which makes this claim:

Designed as an upscale hotel, Viking’s chefs deliver a superb . . . experience.

Here’s another dangling modifier: “designed as an upscale hotel.” It’s presumably the ship that’s designed that way, but ship is nowhere in the sentence. What the sentence literally means is that Viking’s chefs are designed as an upscale hotel.

It’s funny; you laugh. Then you wonder: if Viking’s spokesmen are so careless with words, are they also careless with meanings? Can it be that their messages are just so many phrases thrown at the audience, to see what will stick? Can it be that a Viking ship is not actually anything like an upscale hotel? But one thing is clear: no one at Viking imagines that readers will actually think about its messages.

Ethos and empathy . . . On July 22, President Obama made a statement in which he said that Donald Trump’s speech of the night before “just doesn’t jive with the facts.” The president said that twice. Of course, what he meant was “agree with the facts.” I know that. But the word that means “agree,” in this sense, is jibe, not jive. I understand that there are millions of people who don’t know the difference. Peace to all such. There are also millions of people who don’t realize that the past tense of “see” is not “seen.”

Obama knows the past tense of “see,” but he doesn’t know about a thousand other things he should know if he wants to maintain his ethos of literacy. He has, for instance, never mastered the like-as distinction or the not-so-subtle rules governing pronoun case (“just between you and I”). With him, lack of ethos is related, as it often is with lesser mortals, to a lack of empathy. He has evidently never asked himself whether there may be persons on this planet, and millions of them, who know more about grammar and usage than he does, so he has never felt the need to investigate these subjects.

“Well,” I hear some truly generous libertarians saying, “we all make mistakes.” Yes, we do. Indeed we do. And that’s the most important realization we can have about this subject. We all make mistakes. The question is whether we want to notice our mistakes and do something about them.




Education or Fantasy?

When asked, “Why a third party, when it has no chance of winning?”, supporters of the Libertarian Party and other Thirds usually say, “We’re running an educational campaign.” That makes sense. It would be several steps beyond sense to spend your time figuring out ways in which you might actually win. But that’s what Gary Johnson, presidential nominee of the Libertarian Party, seems to be doing, with some help from the media.

Interviewed on August 28 by Chris Wallace of Fox News, Johnson said that, given the “polarization” of the two major-party candidates, his party “might actually run the table” — this year’s cliché for “winning big.”

This logic defies analysis: how would Democrat vs. Republican polarization induce voters to go for a polarization of Libertarians vs. Democrats and Republicans?

Wallace, who should know better, tried to save the situation by projecting a future in which Johnson could get a majority in enough states to throw the election into the House of Representatives, where he could emerge victorious. Johnson has encouraged that idea in the past. But this time he said, “The object is to win outright.”

I don’t know whether it’s more likely for him to win outright than to win in a House election, since there is no chance of either. If you think it would be more of a feat to gain a majority in the Electoral College than to be elected after throwing the election into the House, consider the fact that voting in the House would take place by state, and the Republicans have a majority in most state delegations.

What these fantasies have to do with an educational campaign, besides discrediting it, I don’t know. Maybe, in some way, they Keep Hope Alive. But that wasn’t Johnson’s concern when he agreed that, if he doesn’t get into the presidential candidates’ debate, “It’s game over.”

I would like to see Johnson in the debate. His presence would make it possible for me to watch the affair without having a physician at my side, ready to administer emergency aid. But if he doesn’t get in, is he just going to sit out the rest of the campaign?




The Unmentionables

I’ve noticed a curious phenomenon. Virtually no one I know is willing to start a conversation about the current election campaign.

As an academic with many academic acquaintances, I grew used to hearing people inject George Bush into every possible conversation, always in a derisive manner. But this year, I can count on the fingers of one hand the number of mentions my colleagues have made of either Donald Trump or Hillary Clinton. The same pattern holds with groups or individuals who I am certain will vote for Trump. No one wants to mention the campaign or the personalities.

This is a welcome relief, but why is it happening? I have several guesses.

  1. People are aware that this election is even more divisive than the past few elections, and they’re unwilling to start a fight.
  2. Many people who are expected by their friends to vote for Clinton will actually vote for Trump, and vice versa, and they don’t want to give themselves away.
  3. Everybody’s just sick of the damned thing.

These ideas may go far toward explaining the matter. But there’s at least one other possibility. Many people are discouraged about the presidency itself. They regard it cynically, as just one more object on a growing pile of political rubbish.

I’m not sure whether it’s good or bad that people feel that way. The imperial presidency lost almost all of its glamor with the abject failure of Obama (whom, by the way, hardly anybody ever mentions either). That’s certainly good, and maybe it’s permanent. I’m not sure, however, that complete political cynicism is a good long-run strategy for the pursuit and capture of individual freedom.




Let Us All Come to Worship at Olympia

As the editor of a journal, I get lots of email advertising other journals. (Yeah — go figure.) I got some on August 19 from Commonweal, the liberal Catholic mag, puffing an article by E.J. Dionne, which starts with the following display of asininity:

Simone Manuel, Katie Ledecky and Simone Biles will not be eligible to run for president until 2032, although Michael Phelps hits 35 years old in 2020. After watching these Olympians display so many traits we admire — persistence, discipline, grace, goal orientation, resilience, and inner strength — perhaps we should consider drafting one of them some day.

It is both a blessing and a curse that the Summer Olympics happen during the election year. The blessings are obvious. Especially in this campaign, it is a relief to watch a display of American talent that truly brings the country together.

My first thought was, “It’s odd to be reading this a day after learning what Ryan Lochte and his Olympian buddies did in Brazil.” They might have shown a kind of persistence and goal orientation (whatever that means), but I have some doubts about the rest of Dionne’s list of admirable qualities. The idea that because somebody is a good athlete he or she must be a wonderful human being, and a political genius to boot, is even more ridiculous than the notion that because somebody is a good artist or musician, he or she must have all the answers about everything else.

But the thing that left a nasty taste in my mouth was the nonsense about bringing the country together. Let’s get this straight. It’s not necessarily a good thing that a country comes together. It’s most likely to come together under the influence of fear or in an episode of mob behavior or at the bottom of a descent to the lowest common denominator. Anyone with a brain must have observed that millions of fans all cheering for the Chicago sports teams have done precisely nothing to bring the city together in any other respect or to solve any problem except the teams’ desire for profits, and the same can be said about any other audience in any other city.

“For God’s sake and for God’s sake,” as the lady says in A Portrait of the Artist, do we have to keep hearing nonsense like this?




How to Succeed by Failing

While Hollywood remains determined to commit financial suicide by focusing on its never-ending stream of superhero flicks (such as the recent Suicide Squad), film lovers can still find satisfying fare by looking beyond the major studios. Florence Foster Jenkins is a case in point. Produced by BBC Films, it is utterly delightful — and it delivers an important message about passion and dreams to boot.

The film is set in 1944 New York, where Madame Florence (Meryl Streep) is a popular socialite and patron of the arts, a woman who has established several music clubs to further the careers of budding composers and musicians. She also loves to produce tableaux and small operatic concerts with herself in the principal roles. The only problem is that she can’t sing. The solution, for her, is that she is blissfully unaware of how painful her voice is, and her husband St. Clair (Hugh Grant) is determined to keep it that way. He sits in on her lessons, hires only the best voice coaches and pianists, and invites “members only” to her performances while keeping the legitimate press away.

This ruse becomes more difficult when Florence books herself at Carnegie Hall, and St. Clair has to work even harder to protect her from learning the truth about how she sounds to others. (Evidently there is more than one way to get to Carnegie Hall; while most musicians have to “practice, practice, practice,” “money, money, money” can be just as effective.)

As Madame Florence prepares her new accompanist, Cosmé McMoon (Simon Helberg), for their “rigorous” training schedule, she warns him, “I practice an hour a day — sometimes two!” She hires the likes of Arturo Toscanini (John Kavanagh) as her voice coach, and St. Clair sits in on the lessons, smiling contentedly as she sings. We in the audience wince painfully, and Cosmé can barely control his embarrassed laughter — until he realizes that St. Clair is as serious as Florence about her singing.

Soon, however, Cosmé is drawn to Florence’s eccentric charm. So are her numerous friends. And so are we. From the flowery froufrou and feathers of the self-designed costumes on her matronly figure to the bathtub full of potato salad for her lunchtime soirées, we can’t help but love her free spirit.

Because Florence has a chronic health condition, she and St. Clair have a platonic marriage. But their love is palpable. He protects her and cares for her with a tenderness that transcends the tear-off-her-clothing kind of love portrayed in most movies. And she returns his affection with the confidence and sincerity of a woman who feels adored. Their love is entirely believable and entirely captivating. We glimpse the richness of a relationship that endures in sickness and health, in good times and bad.

Hugh Grant made his career playing the young and somewhat bumbling British heartthrob with the self-effacing demeanor and dazzling boyish smile. Then, in Music and Lyrics (2007), at the age of 47 he played a washed-up singer almost as an aging parody of himself, as though he couldn’t imagine himself as a believable love interest any longer. Happily for us, he was coaxed from a self-imposed semi-retirement by the prospect of playing opposite Meryl Streep. Actors have said that they love performing with Streep; she is so fully engaged in her character that they can become more completely engaged in their own. In this case Grant seems to provide that same emotional depth for Streep, who as Florence Foster Jenkins gives what I think is her finest performance ever — and this is a woman who has been nominated for 19 Oscars and has won three of them.

Madame Florence hears the voice of an angel when she sings, and that is the subtle message of her story: embrace your passions, whether or not you have the skill or talent to succeed by the world’s standards. It made me think of the many people this month who, fueled by the passion they’ve seen in the Olympic athletes, donned track shoes or swim suits and began “training” for the Tokyo Olympics four years from now. I remember the skating moms I knew when my daughter was a competitive figure skater, and how each of us imagined our daughters would stand on that ultimate medals platform — even though we knew, deep down, that the chance was pretty slim.

I also remember performing in Oklahoma! many years ago and being so surprised when I saw the video of the show — I had danced with the heart of a Rockette, but in reality I had barely left the ground. Still, I love to dance, and I’m perfectly happy never to see what I look like. In my heart and my mind, I’m a pro. As Madame Florence confesses with a smile, “They may say that I can’t sing, but they can never say I didn’t sing.” So sing your heart out. Or run. Or do whatever it is that brings you joy. Don’t let what others think keep you from doing what you love.

Just how bad was the voice of the real Florence Foster Jenkins? When Streep began to sing, I thought she was exaggerating the squawkiness out of fear that modern audiences, raised on pop culture, wouldn’t know a well-sung aria from a flat one. I thought that no one could really sing that badly. But as the credits were rolling at the end of the film, a recording of the real Florence’s voice was played, and I have to hand it to Meryl Streep — she nailed it. It was godawful. But the film is brilliant. Don’t miss it.


Editor's Note: Review of "Florence Foster Jenkins," directed by Stephen Frears. BBC Films, 2016, 110 minutes.



A Libertarian Novel of Ideas

“You couldn’t make this stuff up, yet it all seemed so natural and matter of fact” — so says Free Dakota, William Irwin’s new novel, in a comment about some of its own plotline. The first part of the comment is, of course, as fictional as the novel itself: Irwin made all “this stuff” up. But the second part is real and true: Irwin’s unusual work of fiction does somehow seem natural and matter of fact.

The book demonstrates, for one thing, that you can start to make a fantasy seem real if you surround it with realistic detail. Free Dakota is as loaded with Americana as a Tom Wolfe book, or it would be, if it were as long as a Tom Wolfe book. Actually, Free Dakota is becomingly brief. Irwin selects exactly the right details to make a fictional small town feel like a real small town, to make a fictional diner in North Dakota seem like a real diner. He does the same with his characters. But descriptive details are never enough. Irwin adds to the details a story that appears to be real because the plot develops naturally from his characters’ psychology.

If there’s a protagonist of this novel, it’s Don, a middle-aged novelist with an enormous writer’s block and even more enormous ennui. Don finds a purpose in life in the attempt of a libertarian group to get like-minded people to move to North Dakota and then vote to secede from the Union. Vying with Don for the role of protagonist (the “apron” as the novel calls it, for various good reasons) is Lorna, a high-class madam who unexpectedly (but reasonably, as Irwin makes it appear) follows Don to North Dakota and becomes a libertarian activist. A third interesting person is Mackey, a dropout from a Catholic seminary who in place of formal religion has adopted classic libertarian ideas, and a desire to live up to them.

The plot takes us all the way through the secession campaign, and I won’t spoil the fun by telling you how it turns out. Along the way, there are many arguments for and about libertarian ideas, arguments that most libertarians have encountered before, but probably not in so clever and attractive a guise. And this may be the place to record what Irwin told me about the genesis of his story:

I've always wanted to write a novel, but I wanted to have something to say. I love Plato's dialogues for their ability to raise questions and expose how little we know. I'm an accidental libertarian. I just didn't care much about politics or political philosophy until I was well into my 30s. I just wanted to be left alone, and it was becoming more and more clear to me that the government wasn't leaving me alone. I read [Robert] Nozick and others. Only much later did I read the novels of Ayn Rand, and Atlas Shrugged made a big impression on me. For all its faults, it's a great novel. I think you need to read it as being in the same genre as Brave New World and 1984. It's not meant to be completely realistic, but it conveys important ideas. So, along with Plato's dialogues, Atlas Shrugged is the main source of inspiration for Free Dakota. Readers will find plenty of allusions to Atlas Shrugged in Free Dakota.

That’s true, and charming. More remarkable is the novel’s wealth of libertarian ideas, and how easily Irwin gets us into and out of the arguments about them. Free Dakota is a “novel of ideas” that has freed itself from the melancholy history of its genre. The ordinary “novel of ideas” is all ideas and no novel — witness Henry Hazlitt’s Time Will Run Back (1951, 1966), a complete guide to free-enterprise capitalism, written by a good economic journalist. Time Will Run Back is one of the most intelligent books in the world, and one of the nicest. It’s even nice enough to offer a basic plot — a story about the conversion of a dictatorship into a libertarian society. But the plot doesn’t really matter, and neither do the characters. It’s only the arguments that count, and long before you reach the end, you’re ready for something else, such as a real novel.

Irwin’s book, however, is a real novel. You can read it without knowing anything about libertarian ideas and still get involved with the story and characters. And this is strange, considering the way in which Irwin, a professor of philosophy, says that his book developed: “I started with the exchange of ideas and added layers of detail concerning characters and settings in subsequent drafts. I had a basic idea for the plot but not a detailed outline, so I let it develop naturally.”

I can think of no other literary work that was written in this manner — starting with the intellectual debates, and adding the details later — that still succeeded in recommending itself as a novel, as opposed to a series of essays. The vitality of Irwin’s story can result from only one source — a sustained interest in the varieties of human life and character. That’s why the plot could “develop naturally” and not mechanically.

There’s another unusual thing about Irwin’s narration, and I believe it has something to do with the “naturalness” of the book. In a normal novel, one learns a great deal about the youth, education, career choices, dietary preferences, and other features of the major characters. One often learns these things soon after the characters come on stage. Irwin’s novel, by contrast, functions on a rigorously need-to-know basis. Most characters are hardly introduced at all; they just show up. Later, one finds out more about them, but seldom anything that isn’t absolutely necessary to the movement of events.

This method never advertises itself; it’s simply a thing one begins to notice. It wouldn’t work in a story that tried to accomplish anything like a sociological survey, or depth psychology — but, come to think of it, it manages to preserve more of the alluring mysteriousness of the human mind than many a chapter of Henry James. And what one sees on the page is more realistic, more natural, in one definition of those terms, than the normal novelistic treatment. In real life, new people aren’t introduced to us with full accounts of themselves, extensively cross-indexed. They just show up, and if we’re interested in continuing the relationship, we learn enough about them to do that. The basis of Irwin’s tale might pompously be called the epistemology of normal life. Whatever you call it, it gives the book a good deal of its realism and credibility.

Something that Irwin told me suggests that for him, creating characters was a good deal like the experience of meeting people in normal life. We meet Person X, recognize similarities and differences between him and ourselves; then, perhaps, we establish further relationships, with the people he knows:

I started with the protagonist, Don Jenkins. Like me, he's an accidental libertarian, but there aren't too many similarities between us beyond that. . . . Most of the other characters developed in reaction to Don Jenkins. I would ask myself who Don would meet in a given situation and I would go with the images and ideas that occurred to me.

It’s a part of normal life to connect ourselves not just with real people but also with the people we meet in books. So it’s very natural for Irwin to add, “The diner owner, John Mackey, is inspired by Hugh Akston from Atlas Shrugged.” Readers who already know Hugh Akston will be interested, and I think amused, to see their old friend from a new perspective. But that’s the way the world really is; people change in interesting ways when we see them in new company.

Almost as natural is another comment from the author about the images and ideas in his story: “Some of these changed in subsequent drafts, of course.” Free Dakota is natural in its method but well meditated in its execution. It’s a work of libertarian thought and action that libertarians will warmly welcome.


Editor's Note: Review of "Free Dakota," by William Irwin. Roundfire, 2016, 203 pages.



Type B, Meet Type B

R.W. Bradford, the founder of this journal, was an acute political analyst, thoroughly familiar with American history and American life in all its forms. I’ve read a lot of professional commentators on American politics, but Bill Bradford’s chance observations showed more knowledge and intuition than 90% of the commentators show in a lifetime.

Every four years I recur to something Bill said to me one day, almost by chance. He said that there have been two types of presidential candidates: (A) those who had a perennial constituency — in Bill’s words, those “who always had a lot of people who wanted them to be president” — and (B) those who didn’t, those whom “nobody ever wanted to run.”

It wasn’t a difference between people with good ideas and people with bad ones, although Bill said that he’d always had a weakness for the old maxim that “the job should seek the man,” not the other way around. The difference had to do with the psychology of the candidates and of their willing or unwilling supporters. Because of that difference, there might also be a difference in the candidates’ campaigns and their performance in office, if they managed to get into office.

I think there’s a good deal of truth in Bill’s idea. I think it provides an interesting perspective on how things work. And I think it’s sadly appropriate to what we see this year.

Think about it. Andrew Jackson, Daniel Webster, Henry Clay, Stephen Douglas, Ulysses S. Grant, William Jennings Bryan, Robert La Follette, Robert Taft, Barry Goldwater, Hubert Humphrey, Ronald Reagan . . . Crowds of people loved them, honored them, backed them in every attempt at the highest office. These people cheered their victories, mourned their defeats, and convinced themselves that the defeats were victories. Such followers enhance their favorites’ stature. More importantly, they enhance the candidates’ experience of their country and their countrymen. They give them a connection, if they want to use it, to real knowledge of America. And most of those favorites did use that connection.

Now think of Franklin Pierce, William Howard Taft, Woodrow Wilson, Warren Harding, Lyndon Johnson, Richard Nixon, Jimmy Carter, Bill Clinton, the two George Bushes, Barack Obama . . . No constituency ever spontaneously decided that these men were inspiring figures, and therefore insisted that they run for office. When they ran, it was because of their own insensate and insatiable ambition (Wilson, Nixon, Johnson, Carter, Clinton, Obama), or because they thought it was somehow an appropriate thing to do (Taft), or because a deadlocked party invigorated a lurking idea that yes, maybe they could make it (Pierce, Harding), or because of some reason I cannot fathom (the Bushes).

In each group, A and B, there are people whom I happen to like or admire, and there are people whom I happen to dislike or despise, usually because of their political philosophy. And there are people whose group assignment we can debate. But it would be hard to say that the Group B folk had the personal stature of the Group A folk, or their connection with the American experience. People in the second group have been candidates of themselves and some political coterie; their experience hasn’t needed to be broader, and sometimes it has been remarkably narrow. Those among them who have been motivated merely by their ambition, or the ambition of their friends and family, have tended to be either twisted souls or kids perpetually too late for the party.

The alarming thing about 2016, from this perspective, is the absence of any candidates from Group A.

Who clamored for Ted Cruz to run for president? What irresistible mob of supporters demanded that Marco Rubio take the field? John Kasich — the subject of what adoration? Jeb Bush — the cynosure of what eyes? None, of course, except those of the Chamber of Commerce and the diaspora of former Bush political employees.

I guess it goes without saying that nobody ever wanted Hillary Clinton to be president, and nobody wants it now. What her supporters desire is somebody who will favor their chosen policies, make the appointments they want to the Supreme Court, give them government grants and favors, employ them (or their relatives) and give them wealth and power. If Krazy Kat had figured out a way to collect gigantic bribes without overtly violating a law, and therefore had a ton of money to throw around, those people would be cheering for Krazy Kat. Who, come to think of it, would be a much better choice than Hillary Clinton, who is zanier than any comic strip character, though without the fun.

Ah, but Donald Trump and Bernard Sanders, what of them?

This is not a puzzling question. Think back to a year or two ago. Do you remember anybody ever saying, “There’s just one person I want to be president, and that’s the senator from Vermont”? No, you don’t. Sanders was and is a nonentity. It was the prospect of Mrs. Clinton’s coronation that made him a public hero. Any other plausible receptacle for leftist nonsense would have done as well, or better.

Of Donald Trump, we may ask a similar question, and find much the same answer. He wasn’t a nonentity, but no broad masses (to use the Marxist phrase) ever begged him to run for public office. He just got up one morning and decided to do it. So he has become the plausible receptacle for most of the justifiable or unjustifiable anti-establishment sentiment in the country. The fact that he has certain curious skills, skills that have made him more successful than Sanders in the political arena, doesn’t mean that anyone ever wanted him to be president.

I don’t know what Bill Bradford would say about this, but when I look at the major-party presidential contests of this republic, if we can keep it, I find very few examples of a year in which both candidates were in Group B. One example is the Harding-Cox election of 1920. Another is the melancholy contest of 1976 between Gerald Ford (nice guy, but an accidental president) and Jimmy Carter (distinctly not a nice guy, or a guy with any known constituency or capacity for office — a man elected to the seat of Washington by the fact that he was a Southern Democrat).

There have been other contests of B vs. B. But the current election is spectacular for the prominence of two inmates of Group B who are obnoxiously assertive personalities. To paraphrase the words of an advertising man who helped to elect Richard Nixon, “They wake up in the morning with their suits all rumpled and start running around shouting, ‘I want to be president! I want to be president!’”

One of these Type B people will win. The voter’s job is to decide which one is less weird and dangerous. This isn’t Harding vs. Cox. Both were capable men, and the victor, Harding, turned out to be a good president. (Forget the adverse propaganda; read the great book on the subject, Robert Ferrell’s The Strange Deaths of President Harding.) This time, the chances are much greater of getting a president devoted wholly to his or her self-generated ambitions.

Yes, in a republic, private ambition can sometimes benefit the public. Sometimes.




The New Cable

“As a stranger give it welcome. / There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.” Or offered in your network TV listings. Strange things are afoot in home entertainment, and the made-for-Netflix series Stranger Things is a brilliant case in point.

If you’re tired of the endlessly inane sitcoms, crime dramas, talent shows and trashy pseudo-reality shows offered by CBS, NBC, and ABC, turn off your networks and turn on your Netflix. There you will find well-scripted shows with movie-quality production values streamed to you on your phone, your computer, or your Smart TV. And it won’t cost you the nearly $200 a month many are paying now for cable television throughout their homes. My Netflix account costs $9.99 a month — and I can even carry it with me when I travel and share it with family members in other states at no extra charge. I don’t need a cable box or even a digital video recorder, because Netflix provides all of its listings on demand, whenever I feel like watching — no commercials, no interruptions, and no set schedule. All I need is an Internet connection. And the quality of the programming can be superb.

Stranger Things, an eight-episode sci-fi series made specifically for Netflix, is a great example. Set in 1983, the show begins with a group of 12-year-old boys playing Dungeons & Dragons. They’re a lot like the kids in Steven Spielberg’s Goonies — likeable and outgoing, but slightly off. One has cleidocranial dysplasia, a genetic condition that prevents his permanent teeth from growing in; another has a weak chin that gives his face a beaklike quality. All of them are a little nerdy, but their friendship overcomes any sense of inadequacy.

When the game ends, the boys jump on their banana-seat bicycles and head for their various homes. (My immediate reaction: “Yes! Geeky boys on bicycles! I’m in!”) One of them, Will Byers (Noah Schnapp), encounters a strange beast that seems connected to a secretive government installation on the outskirts of town. No one is home when Will arrives, and he goes out to the shed to investigate. The lights start flickering, an ominous predatory growl is heard outside, and suddenly Will vanishes.

The rest of the series focuses on finding out what happened to Will and uncovering the truths behind that secret government laboratory. The eight hourlong episodes are sharp and suspenseful, and each ends with a cliffhanger reminiscent of Fox’s phenomenally successful, movie-quality 24. I “binge-watched” the entire series in a single day.

What makes Stranger Things so compelling? First is the quality of the scripts and the acting. Great shows are driven by great scripts, and the dialogue in this show feels natural and unforced, reminiscent of my own son and his friends hanging out in the ’80s. My only complaint is Winona Ryder as Joyce Byers, mother of the missing boy, who is cloyingly, ferally crazed throughout the series, until her character suddenly and inexplicably ends up wearing lipstick, eyeshadow and false lashes with her hair combed out of her eyes in the last two episodes. (Winona must have seen the rushes and decided too much was too much.)

Even more impressive than the script is the quality of the production. Netflix gave the show’s creators a budget that allowed them to make a movie-quality program. But budget alone doesn’t lead to success; witness the nine-figure superhero films that have been dropping like flies at the box office this summer. The Duffer Brothers (twins Matt and Ross), who created the show and directed most of the episodes, know what they are doing. Much of their success (at least with semi-nerds like me) is in their skillfully crafted homage to Stephen King and Steven Spielberg, which is as impressive as J.J. Abrams’ Super 8 (2011). It’s a little bit E.T. mixed with Poltergeist, The Goonies, and Rob Reiner’s Stand By Me, with nods to numerous Stephen King books.

Prop master Lynda Reiss managed to recreate the ’80s with a budget of just $220,000 for eight hours of screen time. She reportedly searched eBay, flea markets, rental companies and estate sales to find the vintage boomboxes, telephones, bicycles, cars, movie posters, clothing and home furnishings that give the show its striking ambience. The soundtrack, too, is authentic ’80s, with the Clash’s “Should I Stay or Should I Go” providing a particularly poignant recurring motif.

This is a show that could only be set in the ’80s, when boys could still ride their bikes around town without telling their parents where they were going or when they would be home (and without their parents being investigated by Child Protective Services for letting them do so). It’s a reminder of just how free life was less than a generation ago, when kids learned all by themselves how to solve problems, stand up to bullies, navigate relationships, and manage not to get killed while snooping around empty buildings or abandoned rock quarries. Without cellphones, it was also a harder time for parents in many ways. I love how often the characters have to look for a pay phone in order to contact one another, and the reminder of how a mother had to wait anxiously at home beside the landline phone to hear from a late or missing child. Worry and trust went hand in hand back in the ’80s; it was a magical time, and I hope filmmakers continue to remind us of what it was like when kids roamed free.

So how is Netflix able to produce great programming such as this on a subscription model of ten bucks a month for unlimited viewing with little-to-no commercial advertising? There was a time when “made-for-TV” was code for “don’t expect much” from a movie. I expected that would be even truer of “made-for-Netflix,” with its inexpensive business model. Stranger Things is no anomaly, however. Series like House of Cards starring Kevin Spacey and the wildly popular (and well-made) Orange is the New Black are just as impressive. I wondered how Netflix could afford such quality while the traditional network shows are becoming markedly worse. So I did some checking around.

Netflix started as a home-delivery alternative to Blockbuster. Instead of driving to the local video store, wandering the aisles in search of something to watch on Friday night, and then paying exorbitant late fees when you inevitably forgot to take it back on time (after forgetting even to watch it), customers could create a list of films they wanted to see and have them delivered to their mailbox with a pre-stamped return mailer. When they finished watching the movie, whether it was the next day or two weeks later, they just set it out in the prepaid mailer for the mail carrier to take, and Netflix automatically sent them the next film on the list (or films, depending on their subscription plan). Eventually streaming replaced the need for physical DVDs, and instant gratification was possible.

Netflix had signed a deal with Starz that provided them access to a huge library of current movies, but when that contract ended about five years ago, their selection shrank significantly overnight. They were still making a ton of money off subscriptions, but they needed to give those subscribers a reason to keep paying each month. Just as they had recognized that the video store model had to change, they realized that their new business model needed to change again. People weren't going to leave Netflix right away, but if their library stayed small and uninviting, Netflix would eventually lose their cash cow, the monthly streaming subscriptions.

Meanwhile, the television entertainment model was changing too. For decades the three major networks had dominated entertainment television, with cable as the poor stepsister largely providing cheap local-access programming, infomercials and news shows. Then HBO realized they could make inroads into in-home entertainment by providing original programming. Shows like The Sopranos, The Wire, and Curb Your Enthusiasm provided top-quality scripts, actors, and production values, while also pushing against the FCC rules regarding language and nudity that controlled network programming.

When their contract with Starz ended, Netflix programmers realized that they could continue to spend a ton of money buying existing content, or they could create their own exclusive content. They’ve done both, providing their customers with favorite old network series from the ’70s, ’80s and ’90s, but also commissioning great new programs.

Chief content officer Ted Sarandos was dead set on creating shows on par with those being made for HBO, and he had enough surplus cash to do it. Their first major production was House of Cards, an American remake of a British series of the same name, and they managed to land David Fincher (Fight Club, Gone Girl, Se7en) as director and Kevin Spacey as the star. (Netflix also greenlighted Fuller House, so the quality of their programming runs the gamut.)

The biggest difference between Netflix programming and HBO programming is that Netflix is straight-to-consumer and subscription based, while HBO continues to go through the local cable TV model and requires a premium upcharge. With cable companies now requiring that customers rent a separate cable box (at $10 or more) for every television in the house, in-home entertainment easily costs more than $100 a month. Add the Internet service and landline telephone that are usually bundled into the cable service, plus premium charges for movie channels, sports and HBO, and before long you’re paying closer to $200 a month. Is it any wonder that so many households are ripping out their landlines and cable and opting just for Internet-based entertainment? Netflix, Hulu, and Amazon Prime are all subscription-based, on-demand Internet streaming options that offer good quality programming at a more affordable price.

In response to the cable-free movement in many homes, most networks are now offering some form of streaming, including HBO. Of course, if you’re paying $9.99 for Netflix, $14.99 for HBO Now, $99 a year for Amazon Prime Video (a bonus with shipping perks) and a few specialty stations, pretty soon you’re back over $100 for in-home entertainment. Perhaps in the future some of these services will start rebundling, and customers will be able to choose the services they want at a reasonable price. Choice — what a novel idea!

What’s next? It’s hard to say. The Hollywood studios have made a conscious decision to focus entirely on blockbuster franchise films and ignore the smaller, script-driven movies, creating a vacuum that wants to be filled. Netflix has been responding to that vacuum, committing to a six-film contract with comedian Adam Sandler and winning a bidding war on a script called Bright from Max Landis, son of John Landis of Thriller fame and a super-hot screenwriter in LA right now. Bright has a reported $90 million budget, comparable to almost any Hollywood studio product.

Netflix is also experimenting with a new distribution model of purchasing independent films and releasing them in theaters and on Netflix almost simultaneously. Beasts of No Nation, directed by Cary Fukunaga of True Detective fame, is one example. It earned almost nothing at the box office — not surprisingly, since a single ticket, small popcorn, and small drink currently costs close to $25. I love the atmosphere of a movie theater, but at those prices I’m starting to think it’s time to convert the basement into a home entertainment center.

I’m also concerned about how this new model will affect the careers of fledgling directors, since my understanding is that they earn very little, if anything, from individual views on Netflix. How will new filmmakers be able to continue their craft in the future if “success” means a distribution deal with one week of ticket sales in the theaters and an eternity of streaming on Netflix? I have hope that the market will solve this problem, just as it is solving the problem of outrageously expensive monopoly cable service. In the meantime, if Orange is the New Black, then Netflix is the new Cable. And I think that’s a good thing.


Editor's Note: Review of "Stranger Things," directed by Matt and Ross Duffer. Netflix, 2016, eight 50-minute episodes.



Pretty Good Cause, Pretty Bad Argument

Rightwing commentators have a ridiculous thing going on right now. It appears to emanate from the otherwise intelligent and upright Dinesh D’Souza, who is puffing a new propaganda movie — Hillary’s America: The Secret History of the Democratic Party.

This ridiculous thing is the idea, now constantly confided as formerly hidden knowledge, that the Democratic Party has always been bad, and the Republican Party has always been good. After all, the Democratic Party supported slavery, and the Republican Party opposed it.

I submit that this notion is just as silly as Michelle Obama’s maunderings (much criticized by the Right) about the significance of the White House having been built by slaves. She might have added that the Louvre was built by a tyrannical monarchy. Or that the pyramids were built by a tyrannical monarchy, in the service of a false religion. We’ve come a long way from then. And so . . . ?

Now to the rightwing’s use of historical facts, or non-facts, about American political parties. This is the truth: the Democratic Party is almost 200 years old, the Republican Party more than 150. During the long, strange history of those parties, each of them has been colored by almost every conceivable political, social, and religious tendency. In terms of attitudes, ethnicities, gender roles, social classes, political beliefs, religious or anti-religious preoccupations — in short, in terms of everything — their present membership bears no similarity whatever to their original membership, except that almost all of their adherents have two eyes, a large mouth, and a peculiar nose (useful for detecting food, useless for detecting falsehood). Even in 1860, many Democrats opposed slavery, and many — perhaps most — Republicans were disgusting racists. And if you’re toting up ideological goods and evils, the Democratic Party was, for many long years, the party of hard money and low taxes, and the Republicans were the party of high taxes, crony capitalism, and big government projects.

I very much dislike the current Democratic Party of the United States. I consider it the source of much more than half of the political evil of this country. But something that antagonizes me even more than the DP of the USA is the willingness of good people, intelligent people, people who I feel are on my side, to engage in false arguments and misrepresentations of history. They ought to know better, for God’s sake. Can’t they read?

If you don’t know anything about history, don’t insist on informing other people about it. And if your idea of “history” is nothing more than your idea of good and evil, and therefore of what should and should not have occurred, whether it occurred or not, you shouldn’t even use the word. You’re no better, intellectually, than any of your conceivable opponents. Drunk with your moral fervor, you’re denying yourself, and whatever slack-minded followers take you seriously, the real history by which moral judgments ought to be informed.



