done deal
(1980’s | businese | “sure thing,” “fait accompli”)
“Done deal” always makes me think of the mob expression “made man.” The alliterative spondee lends both expressions the necessary sense of finality and irrevocability. I don’t know of any connection between “done deal” and organized crime; the earliest uses of the term I was able to find come out of the financial industry, soon absorbed into political discourse. As you might expect given its business origins, “deal” clearly refers to transactions, not cards, although I can imagine a casino employee responding to a poker player’s complaints with “Shut up — it’s a done deal.” Newsweek noted in 1985 that the phrase was a favorite of Treasury Secretary James Baker, and such early patronage by politicians favored its fortunes; there’s no doubt “done deal” is as useful in politics as in banking (or the Mafia, for that matter). Even today, the phrase turns up most often in financial and political news — not that they’re different. “Done deal” has now come to be used more often, if not predominantly, in the negative, to caution us that there’s no guarantee the contract will be completed as advertised (e.g., “this is not a done deal”).
“Done deal” originally referred to business maneuvers, but as politicians picked it up it came to mean any sort of dead certainty (a little like “slam dunk,” but used in different situations). A way of saying “we’re not going back” or “you can count on it.” A done deal need not actually be done, but the point is that even if the papers aren’t signed, they will be soon. It does seem to me that “done deal” is often used to refer to a transaction or agreement that is not yet formal or final; once the deal is truly executed, it is no longer necessary to call it “done.”
“Done deal” represents a form of grammatical displacement not uncommon among new expressions. The concept is an old one, so how did we express it in the old days? “Settled,” or more poetically “chiseled in stone.” In a simpler key, “all over.” These are all adjective phrases that cannot serve as subject or object. Commonplace ideas look for new parts of speech to inhabit, and nouns may slip into power where once ruled only adjectives. To some extent I am speaking fancifully in attributing will to words, which are but bits of breath and ink, but if you spend enough time observing the language, it’s easy to slip into the belief that words have life and motive independent of us, their creators but not their controllers.
agita
(1980’s | New Yorkese | “heartburn,” “acid indigestion,” “anxiety,” “stress”)
I didn’t know this, but the rest of the internet did: “Agita” originally referred to gastric distress. No one seems to have demonstrated conclusively where the word comes from, but many people blame the Italians, possibly by way of a colloquial pronunciation of “acido” (which means, surprise, “acid”). It may have started life as a euphemism allowing the speaker to avoid unsavory details, just as telling your boss you were out sick with an upset stomach usually ends the conversation. “Agita” seems to have come into use in New York before it did anywhere else — some on-line sources claim it has been around for decades, but the earliest uses I’ve found date from around 1980. Woody Allen’s 1984 film, Broadway Danny Rose, contains a song of that title, and Ed Koch was quoted using the word in the eighties as well. One may doubt Allen’s and Koch’s Italian credentials, but that’s the great thing about New York. Ethnic vocabularies slosh around and get all mixed up in the urban mind just as a big plate of pasta and tomato sauce gets all jumbled in your stomach. For those of us who dislike certain words, there’s even a kind of linguistic heartburn caused by hearing one of them, which provokes an analogous sour, burning sensation in the ear and mind.
Today, the expression is much more likely to refer to a general emotional state than mere stomach trouble. The internet doesn’t answer conclusively the question of how chronic indigestion or aggravation has to be to qualify as “agita.” I tend to think of it as an ongoing condition, but it can also be a temporary tempest that subsides as soon as the irritant is removed. In everyday use, it implies a persistent quality. The substratum of “agita” has changed since the early days: from discomfort caused by indigestion to nail-biting nervousness caused by worrisome circumstances or developments.
Allen and Koch might have had in mind two Yiddish words that are related, though not very closely: tsuris (troubles) and shpilkes (restlessness). At least one commentator equates “agita” with “tsuris,” but they are not an exact match. Tsuris normally come from the outside rather than welling up inside you, but the main point is that they (“tsuris” is plural) are the troubles you are actually having, to which you may react with philosophical tranquility or by waking up in a cold sweat. “Shpilkes” is even farther away, but when “agita” is used as a straight synonym for “agitation,” they have something in common, even though the mood is quite different. “Agita” covers only personal feelings, not “agitation” in the sense of public protest or stirring up unrest. That doesn’t mean entire classes or types of people cannot experience agita, but it’s more a shared experience than a collective one.
“Agita” is a trade name for an insecticide (chemical name thiamethoxam), and in prescription-speak, it means “stir,” which is much easier to understand than the usual cropped Latin found on your medicine bottle. It also appears occasionally as a woman’s name, but I hope in that context it would be pronounced ah-GHEE-tuh, or possibly “ah-JEE-tuh,” not accented on the first syllable as “agita” (AH-ji-tuh). At least I hope so.
Thanks to Anna from the Bronx, friend and colleague of lovely Liz from Queens, who nominated, all unwittingly, this week’s expression.
doesn’t pass the smell test
(1980’s | legalese? bureaucratese? | “is fishy,” “ain’t right,” “doesn’t smell right,” “stinks to high heaven”)
The primary characteristic of the “smell test,” I suppose, is that it does not measure or evaluate anything that can be rigorously defined. It’s a more or less instantaneous reaction to circumstances that tells you whether to go further or not. And it usually is an ethical test. Passing the smell test means you are on the up-and-up; to do otherwise means your motives or methods are questionable, or worse. An important characteristic of the smell test is that it is anterior to other kinds of questions that might need to be asked. Deciding that an explanation or proposal doesn’t pass the smell test means you need not go any further to evaluate other aspects of the set-up; you simply turn up your nose and go home. It has failed to meet a minimum standard of credibility or honesty.
Even though it is possible to pass a smell test, the phrase is used far more often in the negative — so much so that while it is very easy to think of idiomatic equivalents to “doesn’t pass the smell test,” it’s much harder to come by conventional expressions that mean the opposite. It would be more economical to have a new expression that fills in a gap instead of lapsing into a well-worn groove, but many recent additions to our vocabulary are unnecessary, if not unwanted.
“Smell test” has a more literal meaning: evaluation of olfactory acuity (not very common) or any examination conducted by means of smelling (less uncommon). An example would be sniffing a sample of the grain harvest to see if it’s moldy or otherwise contaminated. The inspector’s nose must decide if the wheat has something wrong with it that makes it unfit to eat. This appears to be a direct ancestor of our more figurative use, since the same operation and results are at work. The odor is off, so we declare the batch unusable and cast it aside.
I found exactly one example of “doesn’t pass the smell test” before 1980 in LexisNexis. The phrase seems not to have taken hold for several years after that. It was used now and then during the eighties, nearly always in legal or political contexts. In a New York Times article (1988), an attorney named Edward Costikyan credited the expression to a colleague, but it seems likely that the phrase had been around for a decade or more by then. There doesn’t seem to have been a definite moment that catapulted “smell test” into everyday language. A few people liked it and used it as a handy way to refer to a kind of quick and final gut reaction that (usually) warned you away from a crime, scam, or cover-up. Maybe it’s obvious to everyone; maybe you have to know something about a particular field or business to detect the problem, but either way, you know it when you see it. Maybe you can’t give a well-defined reason, but you don’t need one with a smell test — a feature that makes it easy to abuse. For the most part, though, “smell test” does not seem to have become a synonym for arbitrary or prejudicial conduct (e.g., your job application didn’t pass the smell test because your name is Takisha). It still has a faintly commendable ring, evidence of an active sense of right and wrong and a pure heart rather than a means of getting rid of people or projects you didn’t like anyway.
show daylight between
(1980’s | athletese | “distance oneself from,” “move away from,” “disagree with”)
Now primarily a political term and has been for at least twenty years. It comes out of equestrian sports: space between rider and saddle or between two horses on a track. It goes back a long way among other athletes as well; Lighter found citations as far back as 1903 in sports talk. When a running back sees daylight, he’d better gain some yardage. Before that, “daylights” could mean “eyes” or “guts” (as in beating the daylights out of someone), an odd pairing. (“Lights” is a very archaic term for lungs, as in “liver and lights.”) Daylights plural and daylight singular don’t seem to have a lot to do with each other.
In sports lingo, “daylight” just means there is a gap between two things: a baseball and a foul pole, say, or two defenders. When you see light you know the objects aren’t touching. That particular meaning was next adopted into politics; by 1980, political figures felt free to use “daylight” in the athlete’s sense. By 1990, executives had it in their arsenals, too. Today, it is still primarily the property of athletes and politicians; according to LexisNexis, it turns up infrequently in any other context. Among athletes, “daylight” might be good or bad, according to the circumstances. But in politics, “daylight” always indicates antagonism of some kind. If it’s someone you want it known that you’re in conflict with, you may “put” or “create” daylight between yourself and the other. When an official wants to affirm unity with another official, she says there is “no daylight” between them. It can exist (or fail to exist) between organizations or countries, too.
Politicians, magpies that they are, love to steal the characteristic expressions of athletes, just as they love to hijack military jargon. I have covered at least half a dozen examples: payback, you’re history, raise the bar, slam dunk, punt, game changer, man up, and there are a few more that are less clear-cut. Politicians, especially male ones, may feel a toughness deficiency and look for ways to cover it up. Taking expressions from athletes and soldiers exploits their generally acknowledged masculine superiority and delivers to the audience an (often unmerited) impression of strength, vigor, and determination. I’ve noted before that politicians like to draw on military vocabulary, but their yen for athletese may also be worth exploring. There are other factors at work: “daylight” sounds like a pleasant, uplifting word, and the way it veils animosity also makes it attractive to the politically inclined.
“Daylight” should not be confused with “sunlight” or “sunshine,” words that in political discourse are used to talk about openness or transparency in government proceedings. The use of “daylight” in such a context would suggest a slip of the tongue or confusion on the part of the speaker.
what part of no don’t you understand?
(1990’s | journalese (politics)? | “no means no,” “don’t you get it?,” “stop acting like an idiot”)
Nearly anything can substitute for “no” in this rhetorical question — “this” and “that” are often used — otherwise it is invariable, except that occasionally you will see a pronoun other than “you” (“they,” I should say, a majority of the time). Normally used as a rejoinder or expostulation suggesting that you fail to grasp something that has been made abundantly clear; therefore, whether actively disingenuous or not, you are being obtuse. Whether directed to a child by a parent or to public officials by voters, it bears an outraged, sarcastic, or at least exasperated edge. Linguist Arnold Zwicky has provided a very thorough exposition and history of the phrase and how it may be amended. The Phrase Finder’s entry is also worth a look.
Neither Zwicky nor anyone else has uncovered a primal connection with a film or television show, which surprises me; this question has always struck me as very likely to have fallen originally from the lips of an actor. (I have noted previously that this sort of genesis isn’t as common as one might suppose.) Lorrie Morgan’s 1992 country hit featured it prominently. LexisNexis suggests that this expression, and its numerous variants, are less common now than in the 1990’s, when it became generally known. A celebrity or public official uses it every so often; presidential candidate Herman Cain, disgraced general David Petraeus, and the president of Venezuela were all quoted using it in recent years. Mostly, it remains the mainstay of those who write cranky letters to the editor.
What gives “what part of no . . . ?” its kick is the fact that “no” is about the least dissectible utterance in the language. It doesn’t have any constituent parts. It can be used in different parts of speech, so it can be analyzed, but it is everywhere the ur-negation (except in a particular usage which is discussed in detail here). The only utterance more indivisible is an animal’s cry: a dog barking or a cat meowing (“What part of woof/meow don’t you understand?” are popular memes nowadays, so the kids tell me). This thrust is lost when nearly any other expression replaces “no.” Here’s a simple example: “What part of ‘Thou shalt not kill’ don’t you understand?” Well, the hearer might not understand “thou” or “shalt,” or might want clarification of the precise meaning of “kill.” Alternatively, one may comprehend an expression perfectly well but fail to see why it’s relevant.
When wielded, “what part of no . . . ?” is a challenge. But when you look at the issue from the other side — that is, from the point of view of the one whose actions are provoking the questioner — it is quite often a red herring. For prohibitions to be effective, the hearer must recognize the authority of the issuer. Just because someone tells you “no” doesn’t mean they have the right to boss you around or imply that you’re stupid. The way to meet this question is to insist on its irrelevance.
Thanks to Dad, who unwittingly nudged this week’s expression my way. It always reminds me of a memorable episode I experienced with my friend Charles years ago. We were sitting in his yard minding our own business when a nearby homeowner barged out of his house and said to someone he considered a trespasser (not us), “What part of ‘get off my property’ don’t you understand?” The offender’s reply was a fine example of the response described in the previous paragraph: I’m not on your land, so you can’t tell me what to do. Bloodshed was averted even if hard feelings were not.
false positive
(1980’s | doctorese | “bad diagnosis”)
An example of an older expression that has grown common and become less specialized (other examples: “blowback,” “grounded,” “politically correct,” “template“). In medicine, “false positive” goes back at least to the forties, probably earlier; for some reason, the only results in Google Books from those days have to do with the Wassermann test for syphilis. In the seventies, the phrase got a boost from the popularity of home pregnancy tests. In the eighties, it was employee drug testing. Both developments got plenty of press, so use of the phrase grew sharply, and as it spread it began to turn up outside of strictly medical contexts. Now it can apply to virus or spam detection, security systems, internet search results, or even economic forecasting or earthquake warnings. The last two are notable because they involve not results but predictions, which adds a new twist. You said there would be a recession and it didn’t materialize — instead of: you said there was cancer and there was no cancer there. Another example from the scientific community: “A false positive is a claim that an effect exists when in actuality it doesn’t,” that is, detecting a correlation that exists only because of your misinterpretation of the data. All these meanings rely on presumably preventable misreadings of an empirical result, incorrectly assigning too broad a significance to a single symptom, or maybe just running the test wrong.
False positives are a big problem; they can creep into the work of the most careful scientists. Medical tests that show a disease that isn’t really present can result in unnecessary or dangerous treatment, and all the expense that goes with it. The effect is subtler in empirical science, but pressure to obtain statistically significant results can skew the perspectives even of conscientious experimenters. (This article explains how it happens.) Such errors are dangerous because it’s worse to be sure of something that isn’t true than to fail to know something that is. As a great American philosopher, possibly Josh Billings or maybe Will Rogers, said, “It ain’t what people don’t know that’s the problem; it’s what they know that ain’t so.”
The expression was well settled by 1980, but only in medical contexts. (“False negative” is just as old.) When it turned up in general-interest articles, it often came packaged in quotation marks. It had not become a regulation noun; in those days it was still normally a compound adjective, applied to readings, results, reactions, responses, rates. Now it is more common as a noun than as an adjective.
I’m sure I wasn’t the first or last kid to stumble over the counterintuitive meaning of “positive” in medicine. I thought “the test came back positive” was good news, whereupon my hard-working parents (I kept ’em hopping) had to explain that the word you wanted to hear was “negative.” Doctors test for the presence of a disease or condition, and a positive result means they’ve found it, and you’re stuck with an undesirable disorder. It’s the only zone in everyday language in which “positive” means “negative,” I do believe. (It reminds me of middle-aged parents in the seventies cheerily reminding each other that “bad” meant “good.”) We must ever observe the instructions in the song and accentuate the positive, but not in the lab, please!
perp walk
(1990’s | journalese | “duck walk,” “gauntlet”)
The act itself is objectionable to any American who takes the Bill of Rights seriously. Police march a suspect in shackles through a posse of reporters and photographers, who try to get him to say something incriminating or at least look guilty. Part of the purpose is planting the arrestee’s guilt in the public mind. Or it’s simply a way for the cops to get back at someone they don’t like. Parading criminals in front of crowds is a very old custom indeed, but the perp walk differs in two crucial respects. First, it is staged solely for the benefit of journalists, who act as stand-ins for the mobs of old. Second, it takes place before guilt has been legally established. Most of us don’t get upset when a child pornographer or wealthy asshole is subjected to a perp walk, but as always, caution about giving the police too much rein is indicated. Presumption of guilt is insidious, and this expression perpetuates it shamelessly. There’s no snappy abbreviation of “accused” or “alleged,” I guess. “Susp” does not roll off the tongue.
“Perp walk” entered the mainstream somewhere around the mid-1990’s. There was no precise older equivalent, to my knowledge; if there was, it was pretty specialized. It might have been called, with grim irony, a “photo opportunity,” but I never heard anyone use that phrase that way. (Then I learned that the New York Post did call it a “photo op” a few months ago, so now I have. Who says writing a blog isn’t educational?) Several sources agree that the practice itself dates back decades, but the term “perp walk” does not appear in LexisNexis before 1986. My guess is that like “road rage,” the signifier grew more common because the signified grew more common. I’m not sure how far the rise of “reality” cop shows in the early nineties pushed “perp walk” into prominence. (Shouldn’t we have a reality show made up of nothing but footage of perp walks?) The phrase can’t be much older than that, because “perp” isn’t much older. It was almost certainly in use among police officers before 1980, but it was primarily a New York term throughout that decade; “perp” doesn’t seem to have become widespread until 1990. Its main advantage is that it’s short and memorable, but its lack of associations and baggage is also useful. Not a neutral term, exactly, but not as fraught as “accused,” “criminal,” or “con.”
Even after twenty years, “perp walk” has little if any figurative use. Here’s one instance, but it’s little removed from the literal: “chaplains and psychologists are housed together with the troops, so that a guy seeking mental health counseling doesn’t have to make the long ‘perp walk’ up the street past his buddies to the therapist’s office” (Huffington Post, April 29, 2016). It could stand in for any process that casts suspicion on someone, or even an invasion of privacy by the government or the press, but nothing like that has taken hold. “Raid,” “witch hunt,” “hounding” — all metaphors once — have become more or less standard terms. “Perp walk” shows little sign of going the other way.
Here’s a phrase that should be but isn’t: “perp school.” It’s when a juvenile is placed in an adult prison. No, wait, “Perp Walk” is a street name on Fire Island. Or how about “Under the perp walk, Down by the jail, Marched down the street in handcuffs, Gonna ride a rail.” I’m beginning to see why “perp walk” remains solidly literal — it doesn’t lend itself to plays on words, or any kind of play.
on demand
(1980’s | businese (finance) | “on request,” “when you want it”)
When did “by request” become “on demand”? The expression in financial circles is quite old; a note or loan might be payable “on demand” (all at once when the lender calls for it) rather than on a fixed schedule over time. But somewhere in there, it took on a much wider range of use. The campaign for abortion rights certainly played a role; by 1970 it was not unusual to hear talk of abortion on demand, which became a rallying cry as laws banning abortion came under attack. That trend has been going the other way for the last two decades, too late to stop the expansion of “on demand,” which now applies to nearly everything that can be ordered over the internet, from groceries to streamed movies to academic courses. All you have to do is snap your fingers, or tap your phone. (Doesn’t sound right, does it? even though it’s a literal description. But the old meaning of tapping a phone continues to get in the way.) You may have to wait longer than you did when you left the house to supply this need or that, but we are beguiled by the ease of letting a credit card and a delivery service do all the work, making the new “order” seem all the more attractive.
So a staid and venerable financial term has sprawled all over the place like a lava flow from an angry volcano, aided first by medical and cultural trends (not just abortion — drug treatment and medical care more generally glommed onto the phrase in the seventies and eighties) and then by the rise of the personal computer, which even before the internet infiltrated our lives occasioned much talk of providing computational or word-processing services on demand. The phrase has become a hyphenated adjective as well. “On-demand economy,” based on people spending money from their smartphones, is a phrase you will hear more and more.
There seems to be an implicit democratization at work, too. If you have enough money, just about anything is available on demand, and that’s been true for centuries, making allowances for the fact the number of things we want, or think we need, has grown over time. Now you don’t need much money to acquire goods or entertainment on demand. If money can’t buy it, it’s not so easy. We may forget that not everything desirable can be had at the click of a mouse.
I’ve suspected for a long time that the internet has completed our transformation into a nation of three-year-olds, a trend initiated by the Sears Roebuck catalogue and the rise of advertising in the late nineteenth century. The consumer economy requires people to keep coming up with new stuff to want, and must continually devise quicker and more reliable ways to get it to them. eBay, for example, consummates a huge number of “buy it now” transactions every day. Is that much different from “Want it NOW” or “gimme NOW”? When it comes to tangible items, it’s not even instant gratification — that CD or toaster won’t fall into your lap the minute you click “confirm and pay” on PayPal. But we’ve learned to treat it as instant gratification; making the purchase is as good as holding the object of desire in our hands. Amazon wants to use drones to deliver packages faster than ever; next year it will be something else. We have created an economic monster that requires our appetites, and the means to sate them, to continue growing indefinitely. How long can we keep it up?
microaggression
(2010’s | therapese? academese? | “little thing,” “insult,” “slight,” “dig”)
Now that Jim Crow is no longer legal (not that it has disappeared), we are left with microaggressions: words or actions directed at members of a minority group that appeal to negative stereotypes, intentionally or not. They do not violate any law, sometimes not even social convention, and in some cases the oppressed person can’t even explain why he is offended. But they can have a powerful cumulative effect, causing people to feel as degraded as their forebears felt under more immediately threatening conditions. To such victims, the microaggression is only a more subtle means of keeping women, African-Americans, Latinos, gays and lesbians, Jews, the homeless, trans people, et al. in their places. It’s not just white men who commit microaggressions, though we do it more than anyone else, partly because we have the biggest pool of people to commit them against. But pecking orders are observed here as elsewhere, and each group looks for another group to feel superior to. In U.S. culture, everybody gets to pick on African-Americans, but African-Americans get to pick on LGBTQ people. Men lord it over women; the sharp mulct the dull. There must always be a way to define yourself such that there exists a class lower than you. As long as we seek such imbalances of power, we will have fertile fields for microaggressions, among other things.
Many sources attribute the coinage to Professor Chester Pierce, an African-American professor of psychiatry at Harvard, ca. 1970. The New York Times also pointed to a 2007 article by Professor Derald Sue that pushed the term out of the academic ghetto into wider use. (I certainly don’t recall hearing it before then.) To this day, the word is used far more often at universities than anywhere else. We have a lot of “micro” words now: microfiber, microloan, microblogging. “Microcephaly” has reared its ugly head recently thanks to the Zika virus. Two more examples sometimes seen near “microaggression” are “microinequality” and “microinequity.” I can’t help but hear an echo of the medical term “microabrasion,” which has little semantic connection but a strong phonological one. The word “aggression” does get people riled up, but the reason “microaggression,” despite its technical, academic sound, has some punch and poignancy is that such acts occur only when the aggressor and the aggressee are in direct contact, normally in a public place; they cannot be committed remotely, except by telephone, and even there you have two people engaging each other. Personal interaction is required.
Microaggressions have emerged as the latest fodder in an old debate: Are the oppressed overreacting to unexceptionable behavior, or are the oppressors using any means available to remind everyone else who the boss group really is? The more fundamental question — who gets to decide? — may be shunted aside. Straight, well-off white people are quick to suggest that microaggressions are symptoms of hypersensitivity or political correctness, a means to make us feel guilty even after we’ve made the reforms we were asked to make (well, most of us). But SWOWs likewise dismissed much more brutal and intimidating means of subjection, from segregation of public amenities to lynching. You know, “They don’t have it so bad. Look at all the nice things we do for those people.” Not much comfort when you’re hauled off to jail for sitting in the wrong place or killed for an imagined offense against some white man’s code of honor. That old feeling of domination, whether backed up or not by formal legal sanction, counted for a lot. Treating as equals those you have been discriminating against for generations is a hard pill to swallow, and lots of people are tired of trying. It’s easier to say, “Wait a minute. I’m a victim, too!”
The rise of the microaggression may be taken optimistically: Except in a few extreme cases, physical and economic violence have gone out of the practice of racism, etc., leaving only petty snubs and well-meant gaucheries, which do much less real damage and will in turn become unacceptable in another generation or two. Or pessimistically: There’s no end to it. We get rid of one layer of abuses, and there’s another below that, and another below that. Microaggressions definitely damage some individuals, and that will ultimately hurt the larger society. My two cents: I haven’t thought this through, and it may be untrue, but it seems to me that if a half-concealed sneer can cause significant harm, then small kindnesses may also have an effect greater than their magnitude. It would be awfully nice to think so.