A “Very Important” Message for Mr. David Brooks

Back in March David Brooks titled one of his New York Times columns “The Modesty Manifesto.” In it, he argued that over the course of a few generations American culture has shifted from an emphasis on self-effacement to one on self-enlargement — in short, that Americans now hold themselves, as individuals, in much higher regard than they once did.

Some of the supporting data that Brooks cites, both in the column itself and in various speeches and interviews he has given recently, bolster his case. Who would argue, for example, with statistics that show a growing disjuncture between the declining achievement of American students on global math tests and their persistent confidence in the excellence of their math ability? Other points that Brooks makes, such as one about how athletes are much less humble than they used to be, aren't especially persuasive and merely elicit a knowing chuckle from readers.

One item from his column that Mr. Brooks keeps repeating on the lecture and interview circuit is more sinister. He cites polling data showing that in the 1950s 12% of American high school seniors said they were "a very important person" and that by the 1990s a whopping 80% believed that they were. Leaving aside the fact that Brooks keeps changing the date for that 80% figure (sometimes he says it's from polling done in the 1990s, sometimes from 2005), he refuses to look beneath the surface of this seemingly alarming number.

What Mr. Brooks would like us to assume — and, judging from the reactions he gets from live crowds, what we do assume — is that there has been a roughly sevenfold increase in the percentage of high school seniors who think they’re more important than other people. If, however, you reflect for more than the ten seconds that Brooks takes to manufacture dismay among his audience, you see that the question “Are you a very important person?” doesn’t mean the same thing today that it did in the 1950s.

Back then, students likely viewed this question as one about their worth relative to that of other people. Nowadays most students would interpret it to mean "Do you have intrinsic value as an individual?" Sure, the very fact that young people would now translate the question differently reflects a cultural shift. But it's a shift in their awareness of the value that all people have, themselves included — not in how many of them see themselves as better than other folks. The answer "Yes, I am a very important person" can now exist side-by-side with the belief "And so is everyone else." It is simply not the same type of statement as "I believe that my math skills are better than average." One is about intrinsic worth; the other is about relative worth. The polling question "Are you a very important person?" may not have changed in half a century, but its cultural meaning has. Comparing responses to that question across generations is an apples-and-oranges proposition — indeed, it amounts to a distortion of the facts.

Brooks, of all people, should understand this. After all, his latest book, The Social Animal, is a study of how unconscious factors shape human behavior and biases. Brooks has, in fact, begun to build much of his reputation as an intellectual on his ability to tease out how people's beliefs and values are strongly influenced by factors that lie just under the surface. In that role, as in his role as a political commentator, he likes to cast himself as a sober, straightforward thinker who rejects the hyperbole that so many of his fellow idea-mongers routinely use. I can't help but conclude, then, that Brooks' assertion in his self-proclaimed "manifesto" — that the statistic about "very important" high school seniors is one of the smoking guns in the crime of American immodesty — is totally disingenuous. Smart, clear-headed people know when they're twisting data to their advantage.

Am I outraged by this petty fraud? Not really. This kind of chicanery is now a given in American political and media discourse, as Mr. Brooks himself has often rightly pointed out. It’s just especially disappointing from someone, even someone I frequently don’t agree with, who likes to cloak himself in the mantle of an honest, modest alternative to the lying, carping chatterboxes who really do think they’re more important than everyone else.

Life After Quitting a Full-Time Job

Exactly two years ago today I quit my job. My work life since then has not been a roller coaster, an adventure, a disaster, a triumph, a barrel of laughs, or a bucket of tears. It is what it is: Change — necessary, gradual, the fabric of existence.

I quit, in part, to rediscover the joy and the mission of teaching, and I continue to work in publishing as a freelancer. It’s great to be back in the classroom, and I value the opportunity to keep my hand — and my brain — in the various editing and writing domains where I manage to find (usually stimulating) work for pay.

Since I quit, I’ve been blogging about that decision, about freelance life, and about various other topics that relate to working independently. With just a few exceptions, the most widely read of my posts continue to be those specifically about quitting. Around some of the early ones, a still-ongoing online conversation sprang up. Perhaps you’ve been a part of it and have had a chance, like me, to learn from the stories of people who decided to deliberately change course, sometimes for the worse but usually for the better.

Indeed, there’s a lot of psychic energy out there around issues of job frustration, “starting over” professionally, and remaking oneself. The search terms that people use to find my blog provide anecdotal evidence. Phrases like these pop up frequently on my WordPress dashboard: “quit and reinvent myself”; “can’t take this job anymore”; “talented and want to quit”; “must quit to be free”; “can I quit and become famous”; “how to quit and be successful”.

Then there are the rarer, even more revealing phrases: “want to quit wife won’t let me”; “if I quit will my kids eat”; “leave this job and conquer the world”; “too talented to have a boss”; and (my favorite) “my job sucks a fat gorilla”.

Perhaps these two sets of search terms don’t fairly represent the sensibilities of folks who desire to quit. Google can, for some who are alone in a cubicle at work or on a laptop in bed late at night, serve as a therapist’s office or a confessional booth, where emotions are expressed raw and in rare form. But even in contexts that are not anonymous and impulsive, I have met recent and would-be quitters who express sentiments similar to those shared with the mighty search engine. To many, quitting seems like a ticket to liberation — an American dream of bursting forth by tearing down the fence that cages you in, in ironic contrast with the type you build with white pickets. In the spooky world of the American dream lurk the strangest contradictions.

There are, of course, a few folks for whom quitting is the ticket to great, previously unimagined material and spiritual success. But nearly always, life just doesn't work that way. Quitting may be the right thing to do in a given set of circumstances, as it was for me, so my tepidness is not meant to sound like an endorsement of inaction. But if, when you think about quitting, you find yourself intoxicated by the fantasy of your own uniqueness or by the delusion of your manifest destiny, do yourself a favor: Take a deep breath (and at least a few weeks, maybe months) before you do something rash.

After all, change born of anger or euphoria is likely to deliver a sting — and to leave you no better off than you were before. The best kind of transition is the mundane sort that isn't fully palpable while it's happening, but only in retrospect. Change that feels heady is more likely to end with a letdown, perhaps a sobering realization that the opportunity you thought you'd grabbed by the throat was never even real.

Gee, what a wet rag to throw on this sexy topic of quitting your job! Maybe so. I don’t mean to say that life-altering career decisions don’t have moments of inspiration. They undoubtedly do, and I have written about my own. But what I read and hear so often in discussions about whether someone should quit a job are silly promises about uncharted waters on the one hand and dire warnings not to rock the boat on the other. There’s nothing more likely to cause seasickness than a ship of fools. Don’t listen to the yammering crew or, worse, become one of them.

With that strained metaphor, I take my leave of this topic of quitting a job. I’ve said all I have to say about it, at least in blog-post form. I’ll continue to write about the other topics that have been featured here and will probably add new ones that change the direction of this self-indulgent little enterprise. I’m not sure precisely where I may digress, but ideas are brewing. “Working for Yourself” is ready to boil off its excess.

For those interested in the history of the now two-year-old “Quitting a Job” subseries in this blog (which was born on HBR.org), here is a list of all the posts on that topic, in chronological order. Have a happy troll through the archive, if you’re so inclined. Regardless, I hope to hear from you on other topics in the near future. And, of course, feel free to offer some final thoughts and stories about quitting.

I Just Quit My Job. Am I Crazy?

Leaving Your Job in Tough Times: Swim, Sink, Swim

When Not to Quit Your Job

Quiz: Does Your Work Matter to You?

How Are You Coping with Uncertainty?

How to Quit Your Job with Style

Don’t Quit the Way Sarah Palin Did

Was Quitting My Job the Right Decision?

The Quitter’s Playlist

You’ve Quit Your Job. Now What?

Going Solo: One Year Later

A Career — and Now a Blog — in Transition

So You Want to Quit Your Job and “Start Over”?

Why Talented People Quit

Does Quitting Your Job Seem Sexy?

Quitting a Job: An Act of . . . Poetry?

When Focus Becomes Monotony

Where Freelancing Meets Independence

People who work for themselves often cite independence as the most appealing element of their work lives. They praise the flexible schedule, the lack of a boss, and the ability to select the work they do. I certainly value those concrete benefits, but what matters to me more is the freedom to assess quality as I see it, without the burden of internal politics or the sometimes senseless rules, both written and unwritten, of organizational culture. Despite not being bound by those strictures, some freelancers censor themselves, fearing to tread into territory that might displease a client. But that usually diminishes the value of their own work and denies them the sense of satisfaction that only calling things on the merits can provide.

Merits are relative, to be sure, especially in the worlds of editing and writing, where I spend about half my work life. But people with keen, analytical minds who try to honestly assess everything they encounter usually end up agreeing quite a lot with one another about what the merits are — even if it takes a lot of debate and deliberation, not all of it pleasant, to get there. Some organizations have managed to make room for this kind of honesty internally, but they are relatively rare.

Much more often, I have found, work environments function in one of two ways: a moaning and groaning culture, in which people routinely make things more burdensome than they need to be, or an “everything’s great” culture, in which people are pathologically positive and reflexively ignore flaws in the interest of preserving equanimity. Of course, most workplaces have a mix of those characteristics, in part because of the diversity of personalities, work styles, and subcultures in any one institution. But I must say that I have very rarely encountered a workplace climate that simultaneously (1) challenged chronic complainers directly on the substance of their exaggerations and (2) unmasked the type of self-censoring, Stepford Wives–style optimism that, by tacit agreement, keeps everyone creepily content and uncritical. Fear, self-interest, and willful ignorance are usually what entrench these mind-sets, but brute force is not the way to break their strangleholds. Dispassionate leadership-by-example does a much better job, though that can be hard to execute when you’re on the outside.

Even if you happen to be pretty good at influencing insiders, the outside perspective that freelancing enables still does not entitle you to preside like a robed judge over poor petitioners who seek your counsel, even when you're explicitly being paid as a consultant. Loftiness is not what this freedom is about. Quite the contrary, it's about allowing yourself to explore and question with the enthusiasm of a curious scientist, then negotiate the practical value of what you find with the deftness of a skilled diplomat. It is, in short, the thrill of discovery and the craft of persuasion all wrapped into one. But plainspoken critique is sometimes required, and, yes, that could cost you a client.

In my work life, an independent point of view is what I’ve always valued more than anything else, whether that perspective is mine or that of the people with whom I collaborate. Freelancing has allowed me a bit more breathing room as I try to do the best job of this that I can. And, frankly, it has given me the wherewithal to do some of my own writing, this blog included. Finding time for such independent expression still remains a huge challenge, especially for someone like me who struggles with saying no to people. But independence of mind is, after all, more about space than it is about time. And there’s more of that on the outside than there is within one institution’s four walls.

The “Ideas Guy” Is a Gas Guzzler

If you work in a field that produces intellectual content, you’ve probably rubbed elbows with at least one “ideas guy.” That’s the person in your organization who, like a constantly humming machine, generates countless ideas for others to implement. He’s usually an extrovert—glib, quick on his feet, intoxicated by his own “genius” (if politically savvy, he tempers that with just the right amount of self-deprecation). Sometimes this guy’s the boss, sometimes not. He may be widely liked, or disliked, or something in between. And he’s not always a guy, although more men than women play the role.

If you’re really lucky, you’ve got a whole bunch of these types running around your institution, belching out one idea after another while the rank-and-file scramble to figure out how to turn clouds of smoke into concrete realities. A few of the ideas turn out to be wonderful; most don’t. But that’s supposedly okay. You see, the ideas guy—and those who enable him—believe that the way to find the perfect specimen is to let a thousand flowers bloom. In fact, at “ideas meetings” that very metaphor is often invoked to add fragrance to a room where freshly conceived possibilities waft liberally through the air.

For a long time, that way of operating was highly productive. Finding the best idea was the most essential step on the path to success. Efficiency was reserved for downstream efforts, undertaken well after the “big idea” had been found. Brainstorms were not only messy, as they must be, but also enormous. Large-scale cleanup was a small price to pay for the gem that emerged from the wreckage. The process was organic, or at least it seemed to be.

But perceptive front-line employees have always known that most of what you find on the landscape after a storm of ideas has blown through is hard-to-recycle rubble and debris, not expanses of sweet-smelling flowers. Working through bad ideas in order to find—and then to implement—the good ones takes huge amounts of time, energy, and resources. Still, if the alternative is to never generate a great idea, there’s no real choice, is there?

Not so. In an era when we're rethinking the very notion of waste—in materials, time, and space—the American ideal of the "ideas guy" is a yesteryear behemoth in need of an overhaul. Yet organizations have become so top-heavy with these characters that instead of a field of a thousand flowering ideas, we're often confronted with piles of trash so daunting that finding anything worth a darn becomes almost impossible. Like a big car that guzzles gas, the old-style ideas guy consumes too much energy and hogs most of the space that others need to actually keep things moving. In short, he takes so many vital resources away from implementation in the name of invention and innovation that the end product is, de facto, a rush job at best and a hazard at worst, even though it may come in a handsome package.

It’s a rarer, humbler breed of intellectual that we now need to cultivate and elevate. Let’s, for a change of personage, call her a woman—one who

  • acts as her own filter, not because she is inhibited, but because she has the judgment, restraint, and good sense to recognize that early, judicious winnowing allows her to execute good ideas effectively and, thereby, to make them great.
  • is not afraid of messy beginnings but abhors unnecessary waste, and knows the perils of squandered time—both hers and that of others.
  • envisions the outlines (and even some of the details) of implementation almost from the moment she conceives an idea, because the art of execution is in her bones.
  • has foresight and knows that the most important aspect of revision is having the luxury of time to revise.
  • is deft with words, both oral and written, but is not intoxicated by her own or others’ awareness of that mastery.
  • appreciates the dangers of getting into the weeds—but knows that not all weeds are bad and, when needed, can kneel in the dirt amid those details without feeling sullied.
  • values intrinsic more than extrinsic rewards and fosters an appreciation of the former in everyone she leads—and everyone she follows, too.
  • aims for the best but recognizes that there are times when the best-case scenario is to prevent the worst—and isn’t afraid to be forthright about that cold, hard reality.
  • uses resources, both human and material, in ways that reenergize the people and the things she draws from, so that everyone can see the long-term benefits ahead and look forward to them.

In short, the alternative to the ideas guy produces more than she consumes. She doesn’t suck up all the air in a room, only to start spewing smoke that others try desperately—and inefficiently—to capture. She is not a machine. Guzzling gas isn’t her thing.

How Adult Literacy Programs Stimulate the Economy


A new school year is upon us, and as usual nearly all of the focus is on K through 12. As a former high school teacher, I value the attention on young learners that September brings. After all, investing in the future by investing in the education of children and young adults is a no-brainer. But in times of economic distress, investing in the present is also essential. One of the most overlooked opportunities for stimulating the economy is in adult education. As current (rather than future) members of the labor force, adult learners immediately use the skills they acquire in the classroom on the job and, thereby, directly and quickly improve business productivity. And, in the U.S., the skill that is the gateway to almost all other skills is, of course, literacy in English.

Many of the more than 2.5 million adult literacy students in the U.S. are immigrants, and the vast majority are highly motivated to learn English and use it every day. I know that because I now teach at two adult education centers in Massachusetts. In fact, shortly after the economic crisis beset us in late 2008, I quit my full-time job in publishing to return part-time to teaching. My decision to focus specifically on adult education was grounded in a firm conviction that this is where I would have the greatest and most immediate impact. And the reality I found in the classroom has exceeded my expectations. My students are champing at the bit to learn everything they can and to explore all the ways they can apply their classroom experiences to the real world. Together, they and I are effecting change.

When people think about adult education, even those who believe in the cause of funding literacy programs, they often see the issue in charitable terms — helping disadvantaged people who deserve a chance. Well-intended as that impulse is, this endeavor isn’t merely about stemming the flow from hearts that bleed for the needy. Funding and publicizing adult literacy programs is practical, plain and simple. It’s in the acute economic interest of local communities, states, and the nation as a whole. Making that point in clear, convincing terms will help to expand the pool of people who are interested in investing in adult literacy.

There’s no doubt that teaching students to speak, read, and write effectively in English takes time. But we don’t need to wait until adults finish a program, or earn a certificate or a diploma, before we see the benefits. These folks walk out of classrooms every day and put their newly acquired skills right to work. And many of them bring their education home to their children (my students frequently ask for extra handouts so that they can use them with their kids). That twofold, mutually reinforcing investment — in the parent now and in her child for the future — makes the concept of “trickle down” a concrete reality, not an economist’s fantasy. Let’s face that hard fact, and put our money where our mouth is. Then we all can reap the rewards together.

For more information about how you can help adult literacy programs fulfill their mission, visit the websites of the Cambridge Community Learning Center and the Somerville Center for Adult Learning Experiences.

Is That Professor a Plagiarist?


Plagiarism is alive and well among America’s tenured faculty. Non-tenured too, for that matter. I’ve worked as an editor for academic authors in a variety of disciplines for 15 years, and from my perspective, the situation is getting worse. Not because the profs are all turning crooked, but because many of them are allowing laziness to trump rigor and because some, strangely, don’t seem to be schooled in what plagiarism is.

As a writing teacher, I have an ear for detecting "borrowed" words. Clues include a suspicious shift in voice, syntax that doesn't fit the writer's usual forms, and outright non sequiturs of the copy-and-paste variety. Indeed, it's the ease of copying and pasting in the age of the internet that helps to explain why plagiarism is cropping up more than ever. I still can't help but be surprised, though, at how prevalent it is among the people who are supposed to be the standard-bearers of academic integrity and protocol: university professors.

To be fair, my encounters with plagiarism of the most nefarious sort have been rare. Only once have I worked with an academic who knowingly tried to pass someone else's entire argument off as his own. And that situation was handled by folks above my pay grade at the time. What's become rampant of late is, rather, smaller-scale, shrug-the-shoulders sloppiness and flat-out ignorance about what counts as using someone else's ideas, language, or both without proper acknowledgment. Here's what I've seen, and how I've handled it.

Careless Omissions

These come in two varieties, both of which some people prefer to call “misuse of sources” rather than plagiarism. I don’t.

One involves chunks of text hastily regurgitated with the intention of adding attribution later. The problem arises when later never comes. In some cases, the language is nearly identical to the original, but some key data point has also been misreported, so that the lack of acknowledgment is compounded by an inaccuracy (see my previous post "Has Anyone Checked the Numbers?"). Authors who make these kinds of errors sometimes apologize; others simply tell their respected editor, "I knew you'd clean up after me." In the latter scenario, I remind the author that if I can't find the offense, I also can't rectify it — so remember to give me enough of a tip that I'm looking for a needle in a hairball, not a haystack.

The other kind of omission amounts to incomplete source attribution. I’m talking about quotations correctly ascribed to the person who uttered the words but without acknowledgment of where they appeared. For instance, I have encountered statements such as “As my colleague Jane Expert said, . . .” without any source mentioned, only to find that Ms. Expert made this comment in, say, a New York Times interview. I remind the author that crediting the speaker is not enough, as it leaves unclear how the quotation was obtained. I’ve even heard other editors say things like, “Oh, I assumed it came from an interview that the author had done with Ms. Expert.” Assume that, and you may be publishing a correction later.

Clueless Commissions

Much more disturbing than finding an academic to be careless about reporting his source material is facing the reality that he doesn't understand the basics of attribution. One author who had lifted material from a somewhat older text actually said to me, "But that's in the public domain now." Um, Professor, just because a text is in the public domain doesn't mean you may claim it as your own. (I'm paraphrasing myself, of course. My actual response was much more diplomatic.)

Far more common are instances in which the author thinks that if he changes a phrase or two, plagiarism has been avoided. “Is that what you teach your students?” I wonder to myself. Again, my actual approach is more practical. I simply propose an alternative that either includes appropriate attribution or avoids the need for the passage altogether. That works almost every time.

The Editor as Teacher

Diplomacy is, indeed, at the heart of all these efforts to help an author avoid embarrassment (or even a lawsuit) for having plagiarized, whether due to sloppiness or ignorance. What you say obviously depends on your relationship with the author. If you have an ongoing and strong professional bond, a mini-lesson on best practices in academic writing can actually be a welcome offering; if you don't have that kind of trusting tie, suggesting intelligent alternative language usually does the trick. Occasionally, you'll work with someone who bristles at the very thought that she plagiarized (even if you didn't say that outright). Again, a deferential "Here's what you could write instead to make your excellent point" often defuses the defensiveness.

And, of course, there are cultural differences in what constitutes plagiarism. Don’t be afraid to take an information-sharing stance as you politely explain how an American audience might perceive a particular use of another author’s material as inappropriate. With academics from abroad who may be unfamiliar with U.S. standards, focus on the perceptions of prospective readers rather than the rectitude of your position. After all, the definition of plagiarism is, like that of any intellectual practice, culturally bound.

The Editor as Policeman?

As it becomes easier for authors to commit acts of plagiarism, it's also becoming easier to identify — and to prevent — instances of it. Even without a nifty plagiarism-spotting application, a keen ear and simple online searching will turn up much more than you might expect. Will you find it all? Certainly not. Should you even attempt to look down every alley for evidence of crime? No way. Editors are not police officers, for good reason. You certainly don't have the time to walk that beat, given how many other important duties are on your roster. Besides, editing with a crime-fighting mind-set comes through, none too subtly, in your communication with the author.

As with any editing task, your radar must be on at all times, but you mustn’t spend all your time listening to it hum. That kind of self-consciousness gets in the way, as good editors well know. Mindfulness is an asset; compulsiveness is a liability. Vigilance about plagiarism is one line item on the balance sheet. Give it its increasingly important due, but don’t let it overwhelm the bottom line.

When the Best-Case Scenario Is to Prevent the Worst


A worker stays late on a Tuesday to avoid a hellishly busy Wednesday. A family forgoes a vacation to dodge delinquency on debts. A business owner cancels a scheduled wage increase to avert layoffs. Facing down a dilemma in which all feasible outcomes are bad is no epic saga. It happens every day. The healthy response is to take sober, deliberate steps to mitigate the damage. There's no triumph in the end result, but there's a sense of relief and, occasionally, even satisfaction in having avoided outright disaster. In the lives of individual Americans, such decisions are reality, plain and simple.

At a national level, however, choices like these tend to be deferred, sidetracked, or abandoned altogether. Collective aspirations demand vision, vigor, and ultimately victory. Resignation and realism are killjoys. But increasingly, the U.S. is confronting large problems that have no chance of being conquered completely and ending in jubilation. In a variety of domains — the national debt, energy, the environment, public health, among others — recurrent, if not permanent, crisis is the new reality. The only sensible response is for citizens to take the difficult measures that are necessary to avert catastrophe, knowing full well that the best-case scenario is still a grim one. But how do you sell that kind of cruel gruel to a nation bloated on Cheetos, chips, and light sweet crude?

The gulf between what we expect and what we can realistically have is wider than it's ever been in America. A course correction of unprecedented magnitude is in order. Casting the net wider and thinking bigger and bolder are not, however, the ways to take on that daunting challenge. When the aims are, by definition, not bloated, only thinking small can achieve anything substantive.

The solution is to be found in the spirit of mundane, individual acts of sacrifice that people understand because they’ve been there. It’s familiar psychological territory, and most folks have found ways to make it not so grim. What they’ve never explicitly been asked to do is to downsize the most unrealistic of their national expectations — in short, to find relief, satisfaction, and maybe even pride in coming together not to achieve the impossible but to temper the threat of the worst-case scenario, much as they do in their own lives.

Consider the person who weatherproofs and fortifies the structure of her house to prevent the worst of the havoc that a storm could wreak. If the storm is powerful, some degree of damage is inevitable, and the homeowner knows that. The envisioned best-case scenario is no picnic, yet in putting in the time, energy, and money necessary to avert the worst, she can find satisfaction, peace of mind, and even pleasure in the preparations. “I’ve done what I can” fills the belly like a hearty oatmeal breakfast, even if it’s not tantalizing to the tongue.

Imagine this kind of sensible resolve applied to threats on a larger scale. We know, for instance, that widespread coastal flooding is coming as polar ice melts, even if carbon emissions are halted immediately. Instead of continuing to build recklessly on waterfronts, why not cast it as both a national and a local priority to rethink and redraw the boundaries between land and sea, thereby reducing the chances of a future worst-case scenario? There’s a sense of mission to be found in such planning. Yes, the message must be delivered with a calm voice and the plans executed with steady hands so that resolve does not balloon into alarm. But by drawing parallels with the individual choices that we all routinely make to avert disaster, we can stiffen our collective resolve into a strong national backbone.

Not all of these efforts need to be universal, of course. Even within narrower interests, such as those of political parties, best-case scenarios that nonetheless remain grim can be motivators. Take, for example, midterm Congressional elections, in which the party that holds the presidency almost inevitably loses seats. That knowledge, once the sole province of experts, has trickled down to the masses, and the awareness itself now inhibits turnout among voters who affiliate themselves with the president. Many assume, "We can't win anyway, so why bother voting?" Indeed, the best-case scenario probably is a modest loss, but failing to act in the face of that loss ensures a worst-case blowout by the other side.

“Vote late on Tuesday to avoid a hellish Wednesday” is a message most people can grasp, even if it’s not as thrilling as the promise of a heavenly win. Seeing collective aims through the lens of a common-sense need to “do what you can” in the face of an inevitably negative outcome can achieve more than pundits and political operatives suspect. But, true, it won’t achieve victory. No horns will blare. The aims of this vision are not lofty.

It is Tuesday. I plan to work late. Wednesday will not be paradise, but if I have any say in the matter, it won’t be hell either.
