what part of no don’t you understand?
(1990’s | journalese (politics)? | “no means no,” “don’t you get it?,” “stop acting like an idiot”)
Nearly anything can substitute for “no” in this rhetorical question — “this” and “that” are often used — otherwise it is invariable, except that occasionally you will see a pronoun other than “you” (“they,” I should say, a majority of the time). Normally used as a rejoinder or expostulation suggesting that you fail to grasp something that has been made abundantly clear; therefore, whether actively disingenuous or not, you are being obtuse. Whether directed to a child by a parent or to public officials by voters, it bears an outraged, sarcastic, or at least exasperated edge. Linguist Arnold Zwicky has provided a very thorough exposition and history of the phrase and how it may be amended. The Phrase Finder’s entry is also worth a look.
Neither Zwicky nor anyone else has uncovered a primal connection with a film or television show, which surprises me; this question has always struck me as very likely to have fallen originally from the lips of an actor. (I have noted previously that this sort of genesis isn’t as common as one might suppose.) Lorrie Morgan’s 1992 country hit “What Part of No” featured it prominently. LexisNexis suggests that this expression, and its numerous variants, are less common now than in the 1990’s, when it became generally known. A celebrity or public official uses it every so often; presidential candidate Herman Cain, disgraced general David Petraeus, and the president of Venezuela were all quoted using it in recent years. Mostly, it remains the mainstay of those who write cranky letters to the editor.
What gives “what part of no . . . ?” its kick is the fact that “no” is about the least dissectible utterance in the language. It doesn’t have any constituent parts. It can be used in different parts of speech, so it can be analyzed, but it is everywhere the ur-negation (except in a particular usage which is discussed in detail here). The only utterance more indivisible is an animal’s cry: a dog barking or a cat meowing (“What part of woof/meow don’t you understand?” are popular memes nowadays, so the kids tell me). This thrust is lost when nearly any other expression replaces “no.” Here’s a simple example: “What part of ‘Thou shalt not kill’ don’t you understand?” Well, the hearer might not understand “thou” or “shalt,” or might want clarification of the precise meaning of “kill.” Alternatively, one may comprehend an expression perfectly well but fail to see why it’s relevant.
When wielded, “what part of no . . . ?” is a challenge. But when you look at the issue from the other side — that is, from the point of view of the one whose actions are provoking the questioner — it is quite often a red herring. For prohibitions to be effective, the hearer must recognize the authority of the issuer. Just because someone tells you “no” doesn’t mean they have the right to boss you around or imply that you’re stupid. The way to meet this question is to insist on its irrelevance.
Thanks to Dad, who unwittingly nudged this week’s expression my way. It always reminds me of a memorable episode I experienced with my friend Charles years ago. We were sitting in his yard minding our own business when a nearby homeowner barged out of his house and said to someone he considered a trespasser (not us), “What part of ‘get off my property’ don’t you understand?” The offender’s reply was a fine example of the response described in the previous paragraph: I’m not on your land, so you can’t tell me what to do. Bloodshed was averted even if hard feelings were not.
(1980’s | doctorese | “bad diagnosis”)
An example of an older expression that has grown common and become less specialized (other examples: “blowback,” “grounded,” “politically correct,” “template”). In medicine, “false positive” goes back at least to the forties, probably earlier; for some reason, the only results in Google Books from those days have to do with the Wassermann test for syphilis. In the seventies, the phrase got a boost from the popularity of home pregnancy tests. In the eighties, it was employee drug testing. Both developments got plenty of press, so use of the phrase grew sharply, and as it spread it began to turn up outside of strictly medical contexts. Now it can apply to virus or spam detection, security systems, internet search results, or even economic forecasting or earthquake warnings. The last two are notable because they involve not results but predictions, which adds a new twist. You said there would be a recession and it didn’t materialize — rather than you said there was cancer and there was no cancer there. Another example from the scientific community: “A false positive is a claim that an effect exists when in actuality it doesn’t,” that is, detecting a correlation that exists only because of your misinterpretation of the data. All these meanings rely on presumably preventable misreadings of an empirical result, incorrectly assigning too broad a significance to a single symptom, or maybe just running the test wrong.
False positives are a big problem; they can creep into the work of the most careful scientists. Medical tests that show a disease that isn’t really present can result in unnecessary or dangerous treatment, and all the expense that goes with it. The effect is subtler in empirical science, but pressure to obtain statistically significant results can skew the perspectives even of conscientious experimenters. (This article explains how it happens.) Such errors are dangerous because it’s worse to be sure of something that isn’t true than to fail to know something that is. As a great American philosopher, possibly Josh Billings or maybe Will Rogers, said, “It ain’t what people don’t know that’s the problem; it’s what they know that ain’t so.”
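The statistical trap described above can be made concrete with a toy simulation (mine, not the author’s; the numbers and names are illustrative only). Test enough hypotheses that are all false, and chance alone guarantees that some come back “significant”:

```python
import random

# Illustrative sketch: false positives in hypothesis testing.
# We "test" 1,000 coins that are all perfectly fair. A coin is flagged as
# biased when its head count strays unusually far from 50 out of 100 flips
# (more than two standard deviations), a rough two-sided significance test.
# Since every coin really is fair, every flag raised here is a false positive.

random.seed(1)

def flagged_as_biased(flips=100):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    return abs(heads - flips // 2) > 10  # > 2 standard deviations from 50

false_positives = sum(flagged_as_biased() for _ in range(1000))
print(f"{false_positives} of 1000 fair coins flagged as biased")
```

A few percent of the coins get flagged every time the experiment is run, which is exactly the kind of spurious “discovery” the conscientious experimenter has to guard against.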
The expression was well settled by 1980, but only in medical contexts. (“False negative” is just as old.) When it turned up in general-interest articles, it often came packaged in quotation marks. It had not become a regulation noun; in those days it was still normally a compound adjective, applied to readings, results, reactions, responses, rates. Now it is more common as a noun than as an adjective.
I’m sure I wasn’t the first or last kid to stumble over the counterintuitive meaning of “positive” in medicine. I thought “the test came back positive” was good news, whereupon my hard-working parents (I kept ’em hopping) had to explain that the word you wanted to hear was “negative.” Doctors test for the presence of a disease or condition, and a positive result means they’ve found it, and you’re stuck with an undesirable disorder. It’s the only zone in everyday language in which “positive” means “negative,” I do believe. (It reminds me of middle-aged parents in the seventies cheerily reminding each other that “bad” meant “good.”) We must ever observe the instructions in the song and accentuate the positive, but not in the lab, please!
(1990’s | journalese | “duck walk,” “gauntlet”)
The act itself is objectionable to any American who takes the Bill of Rights seriously. Police march a suspect in shackles through a posse of reporters and photographers, who try to get him to say something incriminating or at least look guilty. Part of the purpose is planting the arrestee’s guilt in the public mind. Or it’s simply a way for the cops to get back at someone they don’t like. Parading criminals in front of crowds is a very old custom indeed, but the perp walk differs in two crucial respects. First, it is staged solely for the benefit of journalists, who act as stand-ins for the mobs of old. Second, it takes place before guilt has been legally established. Most of us don’t get upset when a child pornographer or wealthy asshole is subjected to a perp walk, but as always, caution about giving the police too much rein is indicated. Presumption of guilt is insidious, and this expression perpetuates it shamelessly. There’s no snappy abbreviation of “accused” or “alleged,” I guess. “Susp” does not roll off the tongue.
“Perp walk” entered the mainstream somewhere around the mid-1990’s. There was no precise older equivalent, to my knowledge; if there was, it was pretty specialized. It might have been called, with grim irony, a “photo opportunity,” but I never heard anyone use that phrase that way. (Then I learned that the New York Post did call it a “photo op” a few months ago, so now I have. Who says writing a blog isn’t educational?) Several sources agree that the practice itself dates back decades, but the term “perp walk” does not appear in LexisNexis before 1986. My guess is that like “road rage,” the signifier grew more common because the signified grew more common. I’m not sure how far the rise of “reality” cop shows in the early nineties pushed “perp walk” into prominence. (Shouldn’t we have a reality show made up of nothing but footage of perp walks?) The phrase can’t be much older than that, because “perp” isn’t much older. It was almost certainly in use among police officers before 1980, but it was primarily a New York term throughout that decade; “perp” doesn’t seem to have become widespread until 1990. Its main advantage is that it’s short and memorable, but its lack of associations and baggage is also useful. Not a neutral term, exactly, but not as fraught as “accused,” “criminal,” or “con.”
Even after twenty years, “perp walk” has little if any figurative use. Here’s one instance, but it’s little removed from the literal: “chaplains and psychologists are housed together with the troops, so that a guy seeking mental health counseling doesn’t have to make the long ‘perp walk’ up the street past his buddies to the therapist’s office” (Huffington Post, April 29, 2016). It could stand in for any process that casts suspicion on someone, or even an invasion of privacy by the government or the press, but nothing like that has taken hold. “Raid,” “witch hunt,” “hounding” — all metaphors once — have become more or less standard terms. “Perp walk” shows little sign of going the other way.
Here’s a phrase that should be but isn’t: “perp school.” It’s when a juvenile is placed in an adult prison. No, wait, “Perp Walk” is a street name on Fire Island. Or how about “Under the perp walk, Down by the jail, Marched down the street in handcuffs, Gonna ride a rail.” I’m beginning to see why “perp walk” remains solidly literal — it doesn’t lend itself to plays on words, or any kind of play.
(1980’s | businese (finance) | “on request,” “when you want it”)
When did “by request” become “on demand”? The expression in financial circles is quite old; a note or loan might be payable “on demand” (all at once when the lender calls for it) rather than on a fixed schedule over time. But somewhere in there, it took on a much wider range of use. The campaign for abortion rights certainly played a role; by 1970 it was not unusual to hear talk of abortion on demand, which became a rallying cry as laws banning abortion came under attack. That trend has been going the other way for the last two decades, too late to stop the expansion of “on demand,” which now applies to nearly everything that can be ordered over the internet, from groceries to streamed movies to academic courses. All you have to do is snap your fingers, or tap your phone. (Doesn’t sound right, does it? even though it’s a literal description. But the old meaning of tapping a phone continues to get in the way.) You may have to wait longer than you did when you left the house to supply this need or that, but we are beguiled by the ease of letting a credit card and a delivery service do all the work, making the new “order” seem all the more attractive.
So a staid and venerable financial term has sprawled all over the place like a lava flow from an angry volcano, aided first by medical and cultural trends (not just abortion — drug treatment and medical care more generally glommed onto the phrase in the seventies and eighties) and then by the rise of the personal computer, which even before the internet infiltrated our lives occasioned much talk of providing computational or word-processing services on demand. The phrase has become a hyphenated adjective as well. “On-demand economy,” based on people spending money from their smartphones, is a phrase you will hear more and more.
There seems to be an implicit democratization at work, too. If you have enough money, just about anything is available on demand, and that’s been true for centuries, making allowances for the fact that the number of things we want, or think we need, has grown over time. Now you don’t need much money to acquire goods or entertainment on demand. If money can’t buy it, it’s not so easy. We may forget that not everything desirable can be had at the click of a mouse.
I’ve suspected for a long time that the internet has completed our transformation into a nation of three-year-olds, a trend initiated by the Sears Roebuck catalogue and the rise of advertising in the late nineteenth century. The consumer economy requires people to keep finding new stuff to want, and it must continually devise quicker and more reliable ways to get it to them. eBay, for example, consummates a huge number of “buy it now” transactions every day. Is that much different from “Want it NOW” or “gimme NOW”? When it comes to tangible items, it’s not even instant gratification — that CD or toaster won’t fall into your lap the minute you click “confirm and pay” on Paypal. But we’ve learned to treat it as instant gratification; making the purchase is as good as holding the object of desire in our hands. Amazon wants to use drones to deliver packages faster than ever; next year it will be something else. We have created an economic monster that requires our appetites, and the means to sate them, to continue growing indefinitely. How long can we keep it up?
(2010’s | therapese? academese? | “little thing,” “insult,” “slight,” “dig”)
Now that Jim Crow is no longer legal (not that it has disappeared), we are left with microaggressions: words or actions directed at members of a minority group that appeal to negative stereotypes, intentionally or not. They do not violate any law, sometimes not even social convention, and in some cases the oppressed person can’t even explain why he is offended. But they can have a powerful cumulative effect, causing people to feel as degraded as their forebears felt under more immediately threatening conditions. To such victims, the microaggression is only a more subtle means of keeping women, African-Americans, Latinos, gays and lesbians, Jews, the homeless, trans people, et al. in their places. It’s not just white men who commit microaggressions, though we do it more than anyone else, partly because we have the biggest pool of people to commit them against. But pecking orders are observed here as elsewhere, and each group looks for another group to feel superior to. In U.S. culture, everybody gets to pick on African-Americans, but African-Americans get to pick on LGBTQ people. Men lord it over women; the sharp mulct the dull. There must always be a way to define yourself such that there exists a class lower than you. As long as we seek such imbalances of power, we will have fertile fields for microaggressions, among other things.
Many sources attribute the coinage, ca. 1970, to Chester Pierce, an African-American professor of psychology at Harvard. The New York Times also pointed to a 2007 article by Professor Derald Sue that pushed the term out of the academic ghetto into wider use. (I certainly don’t recall hearing it before then.) To this day, the word is used far more often at universities than anywhere else. We have a lot of “micro” words now: microfiber, microloan, microblogging. “Microcephaly” has reared its ugly head recently thanks to the Zika virus. Two more examples sometimes seen near “microaggression” are “microinequality” and “microinequity.” I can’t help but hear an echo of the medical term “microabrasion,” which has little semantic connection but a strong phonological one. The word “aggression” does get people riled up, but what gives “microaggression,” despite its technical, academic sound, its punch and poignancy is that such acts occur only when aggressor and aggressee are in direct contact, normally in a public place; they cannot be committed remotely, except by telephone, and even there two people are engaging each other. Personal interaction is required.
Microaggressions have emerged as the latest fodder in an old debate: Are the oppressed overreacting to unexceptionable behavior, or are the oppressors using any means available to remind everyone else who the boss group really is? The more fundamental question — who gets to decide? — may be shunted aside. Straight, well-off white people are quick to suggest that microaggressions are symptoms of hypersensitivity or political correctness, a means to make us feel guilty even after we’ve made the reforms we were asked to make (well, most of us). But SWOW’s likewise dismissed much more brutal and intimidating means of subjection, from segregation of public amenities to lynching. You know, “They don’t have it so bad. Look at all the nice things we do for those people.” Not much comfort when you’re hauled off to jail for sitting in the wrong place or killed for an imagined offense against some white man’s code of honor. That old feeling of domination, whether or not backed up by formal legal sanction, counted for a lot. Treating as equals those you have been discriminating against for generations is a hard pill to swallow, and lots of people are tired of trying. It’s easier to say, “Wait a minute. I’m a victim, too!”
The rise of the microaggression may be taken optimistically: Except in a few extreme cases, physical and economic violence have gone out of the practice of racism, etc., leaving only petty snubs and well-meant gaucheries, which do much less real damage and will in turn become unacceptable in another generation or two. Or pessimistically: There’s no end to it. We get rid of one layer of abuses, and there’s another below that, and another below that. Microaggressions definitely damage some individuals, and that will ultimately hurt the larger society. My two cents: I haven’t thought this through, and it may be untrue, but it seems to me that if a half-concealed sneer can cause significant harm, then small kindnesses may also have an effect greater than their magnitude. It would be awfully nice to think so.
(1990’s | teenagese? therapese? | “siesta,” “catnap,” “forty winks”)
I grew familiar with this term as businese, through articles about frazzled employees needing a way to get back on track during the workday. That’s probably where you learned it, too, but the phrase more likely saw the light of day elsewhere. It was in use among college students in the late eighties, and still is, but it became much more familiar to the rest of us in the nineties when psychologists started pushing the benefits of resting and recharging at the office. The businese definition has largely won out, yet students even today may assign the phrase a slightly different meaning. Businesspeople use the term to mean a short period of sleep intended to increase alertness, vigor, and therefore productivity. Students use it that way, too, but it can also mean a period of deep sleep without any indication of duration. In 1988, New York Times columnist Richard Bernstein defined it as “deep sleep induced by extreme exhaustion,” and cited it as an example of college slang. That sense has not disappeared completely, though it has been largely eclipsed.
The reason it sounds like businese is that it goes with “power lunch” and “power tie,” which became clichés in the eighties, when the cult of the world-bestriding businessman, brought low for a couple of generations by the Great Depression, ramped up again. Flaunting was in, and executives took pride in asserting their prerogatives. In the early nineties, when psychologists like Dennis Shea, James Maas, and Bill Anthony began writing about the benefits of brief rest periods for white-collar workers, “power nap” made our vocabulary more productive and efficient. (I can’t resist: “Feeling logy at work? There’s a nap for that!”) But powerful people don’t generally sleep on the job if they want to stay that way, and a power nap wasn’t a way to project one’s own muscle (like a power tie) or extend one’s dominion (like a power lunch). The fit isn’t as neat as it sounds, more evidence that “power nap” was not native to businese.
In 1992, the Guardian, reporting on the U.S. military’s methods of keeping soldiers minding sensitive or complex equipment as sharp as possible, noted that those charged with such duties were instructed to rest regularly: “to avoid implications of sissiness, such rests are called ‘power naps.’” Another possible origin story for “power nap,” one I don’t find very convincing. There’s no doubt that our armed forces are a great source of euphemisms (collateral damage, anyone?), and it’s also true that there is a lot of stubborn machismo in the ranks. But even the Army must put aside long-cherished prejudices when science and experience team up to demand it. “Soldier, I order you to take a power nap before your next eighteen-hour shift!” “Yes, sir!”
No matter how many studies demonstrate that short rests during the workday improve employee performance, most bosses still view power naps as proof that workers aren’t serious about their jobs. I’m as prone as anyone to get sleepy after lunch, but I shudder to think of how my boss would react if he caught me in an actual doze. Your average boss just can’t get past that rock-bottom-line calculation: time spent sleeping is time spent not working, and you’re here to work, so sleeping on the job is dereliction, dress it up as you will. American bosses are not, on the whole, a very imaginative or innovative lot. The experts can talk till they’re blue in the face, but the boss knows what he knows. Power naps are for weaklings.
(2010’s | hipsterese? teenagese?)
Primarily a verb, I would say, but available as a noun (short for “vaporizer” or for the practice of “vaping”), or as a modifier in fanciful store names (there’s one on 14th Street called Beyond Vape). One who vapes is a vaper, which may remind antiquarians of “viper,” a very old word for marijuana smoker. “Vape” was not entirely new when we first encountered it between 2005 and 2010 — 2009 is the first time it shows up in mainstream press sources, says LexisNexis — it had seen limited use before that as short for “vaporizer,” but that was before anyone thought of a vaporizer as a way to ingest nicotine or anything else. For that we had to wait until the early 2000’s, when a Chinese pharmacist invented the battery-powered nicotine delivery device, which heats liquid to form vapor rather than leaf to form smoke. It took a few years, but by 2010 electronic cigarettes had become noticeable. They looked suspiciously like cigarettes — and plenty of people were and remain suspicious — but they produced fumes far less dangerous than cigarette smoke, though probably not perfectly safe. A few short years later, vaping need have nothing to do with nicotine, and dispensers need not look like cigarettes, though the ever-popular vape pen retains the slim, cylindrical shape. It’s become an art and science and commerce all its own. Shops have sprung up everywhere, and vaporizers have supplanted hookahs as the hip smoking device. I see people vaping all the time now on the streets of New York. Professional worriers have stopped worrying about hookah lounges and started worrying about kids taking up vaping.
There are a number of associated terms, of course (and a legion of brands to match); if you want a chuckle, check out the alphabetical list of headwords on the right of Urban Dictionary’s “vape” page. I won’t try to go into all of them, but here’s one glossary (here’s another). The medium for the nicotine, flavoring, or whatever you put in your vaporizer is generally called “e-juice” or “e-liquid.” Another term for the device is “PV,” for “personal vaporizer.” Basic tools of the trade have been shortened to “atty” (atomizer), “cart” (cartridge) and “bat” (battery). A souped-up PV is called a “mod” (short for “modified”), which should not conjure up visions of the Mod Squad. A “clone” is a fake, basically, a knock-off or counterfeit. The sensation of a puff of vapor going down is called a “throat hit.” Regular old tobacco cigarettes are known as “analog cigarettes,” though there’s nothing digital about an e-cigarette; the association with e-mail and other computer-spawned e’s is fortuitous.
We are entitled to wonder why vaping became so popular so fast. Much is made of its role as an aid to giving up smoking, with accompanying debates over how safe it really is — debates that continue to rage, though most observers agree that e-cigarettes are less toxic than old-fashioned cigarettes. It seems likely that many vapers took it up for that reason. Vaping is cool rather in the way that smoking used to be: not rebellious exactly, but a bit transgressive, a little dangerous, developing a subculture recognized by the general population. But there’s also the technological factor. Vaping is in because it has produced new gadgets and lots of opportunities to mess around with them. The engineer types like having things to play with, and the techno-buffs revel in the latest improvements. There’s also the rage for anything new that occupies a surprising number of our fellow citizens, which I have cited before as a powerful force behind new concepts and expressions in our discourse.
(1980’s | athletese | “unwind,” “relax,” “take it easy”)
This word first came to our attention primarily as a result of the Iran hostage crisis, or rather its end in January 1981. The hostages flew first to a U.S. base in Germany and stayed there for several days. The State Department discouraged family members from visiting them, because they needed time to “decompress.” The word had appeared before with a similar meaning, but it showed up in all the major news outlets and was treated as a novelty. The word was also used on occasion to talk about Vietnam veterans returning too quickly to civilian life.
Much older in the contexts of medicine, engineering, and particularly diving, “decompression” is extremely important to deep-sea divers, who must avoid the bends by returning to the surface very gradually, resting at certain depths along the way so their bodies can get accustomed to lower pressure. This use seems to be the direct ancestor, and it is definitely echoed in both the cases of ex-hostages and ex-soldiers. Moving from a high-pressure environment to less intense surroundings requires time to adjust; the more time taken, the more likely the transition will be smooth. In engineering and medicine, “decompress” meant simply “relieve pressure,” obviously a related usage, though normally transitive. (Why didn’t Jimi Hendrix do a song called “Manic Decompression”?) In computerese, “compress” was in use by the mid-eighties to denote making computer files more compact, or combining them, without deleting data, and “decompress” was its usual antonym; it can still be used that way, though my ear says that “extract” has become the most common term for restoring the files to their original size and configuration.
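The computerese sense mentioned above is easy to demonstrate. Here is a minimal sketch (my example, not the author’s) using Python’s standard zlib module: compression makes the data more compact without deleting anything, and decompression restores it exactly.

```python
import zlib

# "Compress" in the computer sense: make data more compact without losing
# any of it. "Decompress" (or, more commonly now, "extract") restores the
# original byte-for-byte.
original = b"no means no, " * 20          # repetitive data compresses well
packed = zlib.compress(original)
restored = zlib.decompress(packed)

assert restored == original               # nothing deleted, only repacked
print(len(original), "bytes compressed to", len(packed), "bytes")
```

The round trip is lossless, which is what distinguishes this usage from the diver’s or the hostage’s: the file, unlike the person, comes back exactly as it went in.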
Soldiers in Vietnam and the hostages in Iran both went through terrible ordeals, and “decompress” was often used in such contexts in the eighties. Now we are more likely to talk about a vacation from work or a little r&r rather than recovering from prolonged physical and emotional strain. One can find instances of “decompress” even in the seventies referring to respite from much less arduous circumstances. Even so, my own feeling is that the word still bears some weight. If you need to decompress, you’ve been under significant stress — “stress” itself has evolved into the verb “de-stress,” which is a competitor — and probably for some time. Or perhaps the average daily stress level (I propose a new statistic to the Labor Department: ADSL) has gone up in forty years to the point that a garden-variety vacation from the office seems tantamount to a break from captivity or jungle warfare. “Decompress” has been helped into prominence by its association with “stress,” not only by virtue of rhyme but by contiguity of sense as well.
(2010’s | teenagese | “engaging,” “relevant,” “familiar,” “accessible,” “personable”)
“Relatable” is one of those expressions thrown up by our younger contingent. (Other examples: “take a chill pill,” “peace out,” “sketchy,” “stoked,” and possibly “love handles” and “no pressure.” “Based off of,” “I know, right?,” and the “because + noun” construction have swept the under-18’s decisively in recent years.) Teachers periodically report new words or phrases bubbling up in the classroom, and “relatable” had its moment somewhere around 2010 and has become widespread since. I certainly did not know the word in 2010, and probably not for three or four years after that. It’s tempting to blame such eruptions on social media, but consumable popular culture for teens has been omnipresent for decades and did not always require Instagram or Tumblr. Once the kids adopt an expression, it has a strong chance of entering the language, because the rest of us spend so much time talking about what they’re up to and what it bodes for the rest of us (ill, generally). Also because some day those kids are going to take over the world, or at least this corner of it.
The teenagers didn’t invent this one, mind you. “Relatable” was available in the early 1980’s, especially in writing on film and television; it meant roughly “agreeable” or “comfortable” — more accurately, “characteristic of something most Americans can identify with” — doubtless descended from “relate to” as used in the sixties. The new sense of the word has hung around ever since, so the teenagers of 2010 had had many opportunities to learn it. The old meaning, “capable of being told,” has grown rare, and we are left with the inescapable fact that “relatable story” means something much different from what it did fifty years ago.
Every teenage addition to our vocabulary calls forth a phalanx of teachers and professors to bewail it, and “relatable” has been written up in The New Yorker, Slate, and the Chronicle of Higher Education, among other places. (Ben Zimmer provided a non-judgmental history in the New York Times.) The good professors have a number of reasons for objecting to the term, all of them cogent and stoutly defended. Use of the word proves students self-centered, closed-minded, unwilling to try new things or broaden their horizons. But let’s not forget that the older generation always says as much about the younger, often with justice. It is true that most kids don’t want to do a lot of work to absorb their lessons, and therefore they prefer everyday language, stories, and characters they can understand without effort. But plenty of these same kids will grow up and open out, and it’s no use pretending that this is some unprecedented defect never encountered before millennials stuck a trembling toe into adulthood. Grousing about the rising generation is as old as civilization, at least.
“Relatable” doesn’t always mean likable. When used to talk about everyday situations, it is more likely to connote awkwardness or embarrassment than triumph. You can find collections of mottoes, truisms, and slice-of-life stories all over the web that advertise themselves as relatable. Maybe my sample size isn’t large enough, but I came away with the distinct impression that most of them have to do with unpleasant contretemps that we try to get past without humiliation. We are all supposed to sympathize and see ourselves in others’ tales of woe, or the nuggets of wisdom acquired from them. Any pleasure we take in such misfortunes is rueful. But we are also to take away the unstated conclusion that those who encounter the same predicaments or feel the same way about etiquette as we do make up the only world that matters — our experience is universal, and everyone else’s? — well, we’ll make room for that around the edges, if we feel like it. “Relatable” is seductive to the extent that it assures us that our group is the center of the universe.
Thanks to that inspirational teacher and observer of the language, Lovely Liz from Queens, for pointing out that this expression needed an airing. I hope I pass.
my work here is done
(2000’s | businese? | “I’ve done what I set out to do,” “I’ve done what I can”)
Little wonder this phrase has become a popular meme in recent years; it can convey exactly the sort of self-satisfaction and superiority that express themselves so often and so malignantly on the internet. Of late it has become popular in right-wing circles as a means of bashing Obama — they never tire of it — as in this cartoon. It doesn’t have to be this way. The phrase can have a benevolent sound, the sort of thing Gandalf or Obi-Wan Kenobi might say, though as far as I know neither of them ever did. But juvenile irony is winning the day; now the expression goes readily with scenes of catastrophe and chaos.
Most on-line sources cite three possible origins of the catchphrase: the Lone Ranger, Mary Poppins, and Blazing Saddles. I’ve been able to verify only the last, not the other two. Some cite Errol Flynn in The Mark of Zorro (1940). “My work is done here” is a variant (used by Leonard Nimoy in the monorail episode of The Simpsons, for example); it means the same thing and has the same weight. This formulation is technically ambiguous, but the alternate meaning (I do my job in this place) does not obtrude. “My work here is done” began to appear sporadically in LexisNexis right after 1990. Then as now, it was a favorite of departing CEO’s who wish to convey the impression that they have completed their stint with honor and can safely hand responsibility to their successor, provided they get their severance package. Perhaps that’s when the phrase picked up its odor of smugness. Despite, or because of, the ironic turn, it still bears a hint of hipness and remains the property of college kids, middle-aged columnists, and corporate consultants alike.
But the question is not “When did this expression originate?,” because the phrase is not fixed, and any normally equipped English speaker could utter it in the course of conversation. It’s an ordinary English sentence, after all, and doesn’t require a mythical origin. True, it is a bit more elaborate than what you might call the ground-level expression, “My work is done,” which has an almost Biblical simplicity. The question is when did it become the sort of thing people ask about in chat rooms and forums? According to LexisNexis, it started turning up regularly in the press not long after 2000. It might be used to end an article, or, conversely, as a blogger’s headline. As early as 2004, I found an example of the now familiar meme: “Chaos, panic, disorder — my work here is done.” (Google it and despair.) The phrase has always had a bias toward the smug, but now it has a healthy dose of snark as well, as we use it to crow about the mess we (or someone else) have made rather than acknowledge an edifying experience. The expression’s grandiloquence is real but easily subverted. The trend toward using it sarcastically continues and may win entirely in another ten or twenty years.