(1980’s | therapese | “stepfamily”)
Contested terrain semantically, as in other, more obvious, ways. Start with the definition. Nowadays, most people would probably endorse a relatively loose definition of “blended family”: any family formed when an adult with one or more children takes up with a different adult, who may or may not have children. If you’re a purist, you might require that both adults have at least one child. In 1983, a writer defined it thus: “pop-psychology euphemism for members of two broken families living under the same roof, a mixture of step-parents, step-children and step-siblings.” Ten years before that, a psychology textbook defined it as a “family consisting of a husband and a wife, the children of either or both from a previous marriage, and children of the present marriage.” The new spouses had to have kids together, not just with former partners. The extra distinctions may have been made possible by a wider panoply of related terms than we can remember now. A surprisingly large amount of vocabulary sprang up around such filial configurations; in 1980, the New York Times propounded the following list: “conjugal continuation, second-marriage family, stepfamily, blended family, reconstituted family and metafamily.” (It missed “merged family,” also in use by 1980. “Mixed family” means that the parents are of different race, ethnicity, or religion.) Of these, only “stepfamily” would be familiar to most people in 2017, but Wikipedia distinguishes between stepfamilies (only one adult has a pre-existing kid) and blended families (both adults). According to the OED, “stepfamily” goes back to the 19th century; the earliest citation I found for “blended family” dated from 1964.
Why did “blended family” win out? Probably the usual mixture of euphony and accuracy, or intuitiveness. Most of us understood pretty quickly what it meant the first time we heard it in context, and it sounds good — not too long, not too short, scans nicely. “Second-marriage family” is clunky; “metafamily” is jargony and doesn’t make a whole lot of sense anyway. “Blended family” sounds a lot better than “reconstituted family” (just add water!), you have to admit. The only mystery: why didn’t “merged family” catch on?
We like to think that the quirks and foibles of our own generation are unprecedented, but blended families are hardly new. My father’s father grew up in one after his mother divorced his father and married her second husband. My mother’s mother was the daughter of a second marriage, an old widower and a young wife. Life expectancy was lower then, so remarriages were more often occasioned by death than divorce. Was there a decline in the number of blended families for a generation or two, long enough to forget how common such arrangements used to be? If so, the phenomenon has come roaring back. Somehow, before 1970 or so, we got along without a general term for it. Now we’ll never get rid of this one.
There may have been earlier examples on television, but “The Brady Bunch” was the first show to feature a blended family week after week, thus perhaps making the whole idea seem more wholesome. It is doubtful that the sitcom had much effect in its time, given its poor ratings and reviews, but pop-culture observers agree that it had a long and powerful afterlife among those of a certain age (mine), for whom the Brady Bunch is part of a comforting nostalgic penumbra (accent on “numb”). Several shows about different varieties of blended family have succeeded Mike and Carol and Sam* and Alice: “Full House,” “Step by Step,” “Modern Family.” The Bradys anticipated a trend; their descendants follow along behind, trying to catch up to everyday life. The Stepfamily Foundation started life in 1977; support groups and talks at the local library aimed at blended families seem to have arisen in the eighties, when the requisite self-help books also began to appear. New terms must surely arise to reflect new conditions, but the rule is that only one or two out of a larger number will make it to the next generation and a shot at immortality.
* The butcher. Remember?
(1980’s | athletese | “bow,” “youthful exuberance,” “rubbing it in”)
I’m not quite sure when the custom of taking a victory lap after a race originated. The first instances of the phrase I’ve found turn up in the context of auto racing in the fifties and sixties, but runners have probably been taking them at least that long. (Victory laps are reserved for runners and drivers; horses are spared.) For all I know the Lacedaemonians or Athenians made an extra circuit of the stadium after one trounced the other, and being Greek, they must have had a word for it. The purpose of the act seems simple enough: it gives the adrenaline a little time to subside and the athlete a little time to soak up adulation. As late as 1980, the expression was restricted to track, racing, or occasionally non-athletic contests, like the Miss America pageant, already a small deviation from the literal.
In its figurative sense, the term is used most often by political journalists, though not exclusively; musicians and fashion designers may take one, for example. (In student slang, a “victory lap” refers to a year of high school or college beyond the usual four.) The first specifically political use I found appeared in the Washington Post, describing Reagan’s meetings with assorted government officials after he won his first term in 1980 (he pledged “cooperation with all,” as new presidents customarily do). Non-racing athletes also rated the term around the same time; I was somewhat startled to discover that as early as 1983 Reds’ catcher Johnny Bench’s last season was referred to as a “victory lap.” When a well-known athlete announces retirement far enough in advance, he may reap respectful send-offs at opponents’ stadiums as well as his own. Sometimes it’s called a “victory tour” to give the whole exercise a grander sound; either way, it’s all about adoring crowds, which is what politicians are after, too. Even today, “victory lap” denotes the acts of elected officials more often than not. As far as I know, neither man used the term, but the post-election travels of both Obama and Trump were widely described as “victory laps”: Trump’s thank-you tour and Obama’s last round of visits to European capitals. In the latter case, the phrase didn’t evoke any particular triumph so much as a sense that it was Obama’s last chance to talk up his achievements.
The rise of this expression in political journalism has given it an unsavory connotation. Victory laps used to be joyful celebrations, perhaps not always spontaneous, but at least a moment of innocent exultation shared by athlete and audience. A certain amount of self-congratulation was involved, to be sure. But the politician’s victory lap generally has more to do with exaggerating an achievement or rubbing salt in the wounds of the defeated. It is a thoroughly calculated gesture, at worst malicious and at best indulged in purely for its own sake. Politicians are forever being taken to task for taking crass advantage of such opportunities for self-promotion, either because the victory is illusory or because the victor is crude and ungracious. That tendency hasn’t changed and seems unlikely to.
(1980’s | therapese | “hiding one’s head in the sand”)
My guess is we owe today’s prominence of “denial” in psychological lingo to Elisabeth Kübler-Ross’s stages of grief. I doubt we would have “in denial” without the predecessor; the phrase as we use it now didn’t turn up before 1970 anywhere I looked. The term and associated concept — refusing to believe that which is clear to others, as by failing to acknowledge an emotional or psychological state, or even sheer physical reality — were already in existence, but Kübler-Ross’s “On Death and Dying” (1969) was very influential; one of its effects was to make the experience of denial common to nearly everyone. Not long after, the term became popular among counselors of alcoholics and other drug addicts who refused to admit they had a problem. “In denial” may be merely a compressed version of “in a state of denial.” It appears to be the most common phrase descended from “denial,” but not the only one; Pam Tillis hit the country charts in 1993 with a song about Cleopatra, Queen of Denial (though I’m pretty sure the redoubtable Rev. Billy C. Wirtz had used the joke before then).
“In denial” has been in use for a long time in other contexts, but the grammar is new. Now the phrase is most common as a predicate complement (e.g., “You’re in denial.”), possibly followed by “about,” but not “of.” In the old days, when it followed a verb it had to be active (e.g., “result in denial” or “engage in denial”). Of course, it appeared everywhere in legal prose (e.g., “in denial of the motion”), and it started to bob up in political contexts in the eighties, particularly around the time the Iran-Contra revelations were unraveling Reagan’s second term. It was kinder to say Reagan was in denial than to contend that he really didn’t know what was going on. Maybe this is one of the many terms Reagan helped into the language directly or indirectly, or maybe it would have happened anyway. By 1990 it had made its mark, though ace sportswriter Thomas Boswell put it in quotation marks as late as that spring. No surprise that it became popular — it’s compact and it packs a punch. The expression conjures a state of passive malignity or dangerous indifference, willful or not; like “passive-aggressive,” it’s always an insult.
Now “in denial” is entirely standard, eligible to be adapted to all sorts of uses, including humor, irony, and wordplay. (Here’s a bouquet of suggestions for compilers of rhyming dictionaries: “infantile,” “spin the dial,” “undefiled,” “linden aisle.”) I haven’t heard “SO in denial” or “in deep denial,” but I don’t get around much; both certainly lie within the universe of possible utterances. Or “Live in denial,” which may also be heard as “living denial” (as in “Girl, you are just living denial 24/7”). “Oh, he’s such an old in-denial crocodile” could be the next catch phrase. “Hit denial on the head” might be a self-help slogan, meaning something like overcoming obliviousness and seeing the world without illusions. Why not “The In Denial 500,” which pits the nation’s most noxiously clueless bachelors against each other to see who can act the most idiotic? For you tongue-twister fans out there, it’s not much, but it’s the best I can do: Say “undeniably in denial” five times fast.
unclear on the concept
(1990’s | journalese (comics) | “missing the point,” “failing to grasp the situation,” “obtuse”)
Such a mild way to call someone an idiot, this expression imputes a genuine doltishness, a stark inability to understand the simplest processes or cause-and-effect relationships, which is strange, because on the surface it doesn’t sound so condemnatory. “Unclear” suggests nothing worse than temporary befuddlement, and “concept” lends a faintly professorial air to the whole affair. The phrase is not intuitive, but instantly understandable; one does not ask, “what concept?” Its rhythm and alliteration give it a rough poetry that has helped perpetuate it in our ear.
Not many new expressions have an easily identified source, but this one does: cartoonist Joe Martin, creator of “Mr. Boffo,” a comic strip that started showing up in the late eighties and had a running gag titled “people unclear on the concept,” illustrated each time by some species of magnificent cluelessness (examples here). Neither LexisNexis nor Google Books shows any instances of the phrase before Mr. Boffo came on the scene, but afterwards it popped up all over the place, so I think Martin gets credit for it. Only a select few expressions — factoid, glass ceiling, hot-button, irrational exuberance, tiger mother, trophy wife are all that come to mind — can be safely credited to a specific person. This one hasn’t become as widespread as it might have, but it turns up in numerous contexts: politics, computers, sports, arts, you name it. A notable feature: It’s most unusual to call oneself “unclear on the concept”; we use it about other people. The need for such an expression is obvious, as our fellow human beings continue to plumb new depths of stupidity. Our habit of believing the worst of those we disagree with, which I noted recently, has created fertile ground for any locution that helps call attention to the treacherous folly of others, sedulously contrasted with our own forbearance and rectitude.
There seems to be a sub-category of new expressions that deal, like this one, with varieties of denseness. “D’oh” starts at home — it’s what you say when you’re hoist by your own petard — but most of these expressions rely on the “dumbth” (a word coined by the, alas, largely forgotten Steve Allen) of others. “Hello?!” and “didn’t get the memo” tend to cover small-scale, interpersonal situations; “dumb down” and “special needs” have a larger, more civic role. It is an oddly heterogeneous group, ranging from personal to political, from desirable to pejorative, from jocular to confrontational. We forget sometimes how complex a concept stupidity can be.
(1990’s | journalese (arts? politics?) | “forget (temporarily),” “(have it and) lose it”)
I’m not very rigorous about it, but in everyday conversation I try to avoid using the kind of new expressions I write about here, just as I try to avoid using them in posts except to refer to them directly. But this one is an exception, and I catch myself using it fairly often. It has a host of predecessors. Probably descended directly from “draw a blank on” (be unable to remember whatever it is), it also recalls “blank look” and “let your mind go blank” (or the more involuntary “my mind is a blank”). The word implies a temporary but vertiginous mnemonic malfunction, a moment of vacuity that may lead to a deer-in-the-headlights look. A related verb is “blank out” in its intransitive sense, though that may cover a longer time span. “Blank on” means forget something and then recover it, a short-term lapse, more like a senior moment. It may also mean, on occasion, “fail to respond.” (“Shooting blanks” means something entirely different. “To blank” in sports lingo normally refers to holding the opposing team scoreless. Then there’s that charming if now unnecessary euphemism, “blankety-blank.” It is one of those linguistic oddities that “blank,” descended from the French word meaning “white,” looks and sounds much more like “black.”) “Blank on” has so many ancestors that some don’t even involve the word “blank”; doesn’t the phrase “(totally) blanked on it” remind you of “bank on it”? I continue to maintain, without proof, that such phonological resemblances influence new entries into the language.
One does hear occasional variations in meaning when this expression is used, but they never seem to catch on or persist. I saw this sentence recently in a food column in the Dayton Daily News: “Pasta is always a conundrum as a side dish. I want to pencil it into my weekly meal plan, but then I blank on how to sauce it: Cream? Tomato? Lots of cheese?” Here the emphasis falls on inability to choose among alternatives rather than failing to remember them. This usage may prove a solitary exception to the rule, but the contretemps is one we find ourselves in often enough that another word for it may be welcome.
The verb really did not exist before 1980, as far as I can tell. It started to turn up occasionally afterwards; in one of the first uses I found Reagan was the subject of the verb, and this may be yet another expression to which his presidency gave a boost, on the strength of his well-known absent-mindedness rather than policy initiatives. It had entered the language pretty definitively by 1995, often used by politicians and press secretaries, but actors also use it a lot. During the latest presidential campaign, it quickly became the standard verb to denote Libertarian candidate Gary Johnson’s inability to address the significance of Aleppo. As is often the case when a new phrase resembles an old one, or several old ones, the trail into everyday language is not well-blazed and it may be impossible to determine, even in retrospect, how it wormed its way in.
(1990’s | legalese | “discrimination”)
The verb “to profile” has a relatively complicated recent history, even if you set aside the usual literal or technical meanings from geology, engineering, esthetics, etc. For most of the twentieth century, the most common usage had to do with interview-based journalism — describing a worthy individual or organization in detail. Usually an actor or comparable cultural phenomenon, hence the phrase “celebrity profile.” The word was available as both noun and verb, but from either angle it seems an odd choice. The classical meaning of “profile” — a face seen from the side — would seem, on the face of it (sorry), to have little to do with a revealing biographical portrait. To carry the metaphor to its logical conclusion implies that the reporter has left out half the relevant information, as a profile leaves out half the visage — although an art critic might argue that sometimes the profile is more revealing than a full-face view, and there’s no denying some faces are far more interesting in profile. Sometimes “profile” means little more than “categorize,” as in a corporate profile that provides statistics grouped under various measures of performance. In African-American slang, “profiling” was another word for “showing off.” But when we use the term in African-American contexts today, it has an entirely different slant.
Our use of “racial profiling” today is descended from the more sinister practice of psychological profiling; the OED lists its first example of this usage in 1951. The goal is to see beneath the surface presented by the soldier, teacher, or employee, the psychologist’s trained eye constructing an account of each personality that understands the subject better than she understands herself, or at least better than the boss understands her. Inevitably, it occurred to the criminal justice system that such a thing might be useful in dealing with malefactors, and the idea of profiling this depraved criminal or that deranged terrorist entered the mainstream in the seventies and eighties. By 1990, the concept had undergone further refinement in the form of DNA profiling, by which the expert found a unique way to identify any individual through a bit of hair or saliva, again finding a distinctive marker that was not apparent to the unaided eye or brain. A DNA profile is a hyper-detailed diagram constructing a definitive portrait that cannot be confused with that of anyone else. Though the technology is often used in the context of medical research, it turns up much more often in news accounts of criminals, which has paved the way for “racial profiling,” now the dominant locution in which “profiling” appears. (I append the ACLU’s definition along with a reasonably non-partisan discussion of various kinds of profiling.)
The extraordinary thing about the new expression is that it has turned the old idea on its head. Racial profiling dispenses entirely with a painstaking account of the individual, teasing out a detailed map of characteristics, and replaces it with a simple question: Do you belong to this or that dangerous group? (Profiling based on religion or nationality is also possible, of course.) On one view, the change in usage is a complete reversal, but from another it is more or less seamless — profiling is merely one more weapon in the eternal war against the bad guys — and therefore it may be entitled to a certain poetic license.
The illogic of widespread, systematic profiling has been proven so often that the practice has few defenders but many adherents. When Americans feel threatened — some of us don’t even have to feel threatened — we disregard the studies and the logic and reach for the easy, satisfying answer. If a few people from a certain group mean us harm, make all of them suspect. For that to have any chance to work, the group must be very small, but preferred objects of unequal treatment in our society number in the millions, most of whom are law-abiding and just trying to do their jobs and pay their taxes. Having been mistreated by the justice system, such members of minority groups have no incentive to work with police and a quite reasonable desire to avoid them. Police departments around the country have learned this the hard way. (An exchange between Sam Harris and Bruce Schneier may flesh out the argument sketched above.) But their experiences have not dissuaded an uncomfortably large percentage of us, who demand that the law be simple and punitive. In America, foolish and failed policies can be enacted over and over again, if they benefit — or harm — the right people.
(1980’s | journalese (politics) | “lambaste,” “lash out at,” “rap”)
(2000’s | computerese? | “snowed under,” “overburdened”)
If you persist in associating slamming with doors, home runs, telephones, or fists on the table, you are behind the times. If you think of poetry or dancing, you’re in better shape, but the verb has taken on two intriguing and non-obvious definitions since 1980, both of which are technically transitive, but one of which is generally used in a way that disguises its transitivity, to the point that it may not be transitive at all any more — more like an adjective. To be fair, only doors and home runs got purely transitive treatment in the old days; telephones and fists required an adverb, usually “down,” although Erle Stanley Gardner, creator of Perry Mason, sometimes wrote of a telephone receiver being “slammed up,” which may reflect an older usage or may have been idiosyncratic. Sometimes a preposition is required, usually “into,” as in slamming a car into an abutment. (But you might also say the car slammed into an abutment.) That’s if you neglected to slam on the brakes.
“Slam” today means “attack” or “harshly criticize,” while “slammed” means overwhelmed by work or life in general, when it isn’t merely serving as a past participle. The former emerged first, before 1990 — I found very few examples before 1980 — primarily in political contexts, though it could also be used to talk about entertainment, as in slamming an actor, or his performance. It appears to have been more common in England and Australia; I doubt it originated there but our anglophone cousins may have taken it up faster than we did. “Slammed” came along later; I found only a few examples before 2000, mainly among computer jockeys.
How are the two meanings related? They both rest on deliberate infliction of metaphorical violence, obvious when one politician slams another, less so when one feels slammed “at” or “with” (not “by”) work. When I first encountered that usage, I understood it to mean the boss had assigned a whole bunch of work without recognizing that the employee already had too much to do. That doesn’t seem particularly true any more. “Slammed” no longer automatically imputes malice — if it ever did — and need not suggest anything other than adverse but impersonal circumstances. Gradually it has spread so that it need not refer strictly to having too much to do; in recent years it has developed into a synonym for “exhausted.” It has somewhat more potential for expansion than “slam,” which has not strayed from the basic idea of heated verbal assault.
Is there a direct link between the two? We might expect to discern a path from the older meaning to the newer, but how would it work? The boss can excoriate your performance, or he can dump too many tasks on you, but they would seem to be separate operations. If you’re no good to begin with, why would the boss ask you to louse up still more projects? It’s a compliment if the boss piles work on you, not an insult. The linguistic pathways that led to these two recent additions to the dictionary may remain mysterious, but there should be no confusion about why they have become so popular in the last thirty years. Our pleasure in believing the worst of each other has led inescapably to uglier discourse, offering numerous opportunities to use the older verb. On the job front, whatever productivity increases we’ve wrung out of the workforce since 1970 have come from longer hours and fewer people; so those who still have a job must work harder. Conditions favored harsher language, and there was versatile “slam(med)” to fill the gap.
(1980’s | therapese | “the house feels so empty”)
This is one of those effortless phrases. The first example I found in Google Books dates from 1968; by the late 1970’s it was turning up in the mainstream press now and then, and everyone seemed to get it right away. At that early date, it still required quotation marks and a brief gloss, but little time elapsed before the expression made itself at home. It had arrived for good by the time a sitcom of that title debuted in 1988, spun off from “The Golden Girls.” “Empty nest syndrome,” an early elaboration, is the most common use of “empty nest” in adjective form; “period,” “phase,” and “blues” are other possibilities. As noun or adjective, it retains an innocent, “literal” quality — of course, the phrase is not literal at all, but its evocation of pure-hearted little birdies seems to shield it from irreverent wordplay. Even after thirty years, the phrase has not developed much of an ironic life, and it is not often used to refer to anything other than a home (or family) from which the last resident child has departed. “Empty nest” does have unlooked-for complexity when you take it apart. The first half is literally false — the nest isn’t empty because the parents are still there. The phrase as a whole requires knowledge of how birds bring up their young, sheltering them until they reach maturity, then sending them on their way.
The semantics of “empty nest” may tickle the analytical brain, but the concept appeals to the emotions, and it soon found a home in the long-running debate between parents and grown children over whether it’s really a good idea for the kids to move back in rent-free after college. The kids are all for it; parents are much more divided on the question. In my own case, the model was the great economist or perhaps sociologist Thorstein Veblen, who returned to his parents’ farm after taking a Ph.D. because he couldn’t find work, and filled the time with reading and long conversations about society and politics with his father. That sounded pretty good to me, but Dad saw disadvantages to the scheme and suggested graduate school instead, which ultimately got me out the door for good.
Not all parents are unhappy at the thought of their children moving back in. Some parents get all broken up when the last child leaves the house, and they are the most vulnerable to later irredentism on the part of their down-and-out offspring. Other parents can’t wait to see the back of their kids and have looked forward to the empty nest for years. I haven’t done a study, but I doubt such empty nesters (is it my imagination, or does that term imply a certain affluence?) relish the prospect of having their uncouth twenty-something kids cluttering the living room. This antidote to the empty nest is now known as “boomerang kid,” a term which arose within the last thirty years. By the way, that news article we’ve all read about how unprecedented numbers of college graduates are moving back in with Mom and Dad has been a staple at least since 1980. It’s a wonder anyone under forty lives on their own.
It is less true now, but in the olden days empty nest syndrome was primarily associated with women, a rough complement to the midlife crisis for men. True, mothers nostalgic for having surly kids in the house didn’t usually buy sports cars or cheat on their husbands, but both middle-age traumas mark a troubled transition to a later phase of adulthood. How can you tell “empty nest syndrome” was a well-established concept by 1985? By that time a whole new branch of the advice-for-the-lovelorn industry had already sprung up, especially in women’s magazines, soothing unhappy mothers with an endless stream of counsel and reassurance.
(2000’s | journalese (politics) | “coded (signals),” “speaking to people in their own language,” “telling people what they want to hear”)
In its figurative political sense, “dog whistle” first began to turn up in quantity around 2000, primarily in the Australian press. When used down there, it was generally identified as an American expression. I’m not saying they were wrong — though in 2005, William Safire quoted an Australian reporter suggesting that the phrase may have originated in Australia after all — but LexisNexis coughs up precious few examples in America, or anywhere else, before 1995. None, really; the Washington Post defined the “dog whistle effect,” a pollster’s term, in 1988: “Respondents hear something in the question that researchers do not.” (I’m not sure if this remains a technical term in polling.) In 1995, a House Republican spake thus of Newt Gingrich: “When Newt and the others would talk about what was possible, it was like a dog whistle. Some people heard it and some people didn’t. If you were tuned into that frequency it made a lot of sense.” And that, of course, is the essence of the dog whistle. One group gets it full force and the others are blissfully unaware of the hidden message, giving the in group the added pleasure of putting one over on the uninitiated. If you want to express covert solidarity, use words and phrases that have special meaning for the target, but not for others. In fact, the phrase didn’t blossom in the U.S. until shortly after George W. Bush took office; he wooed evangelicals with snatches from hymns or Bible verses intended to elude listeners not versed in Christian vocabulary. Bush used religious rhetoric in much more open ways, but he also found subtler means to reassure that reliable chunk of his base. Dog whistles, in Australia as in the U.S., get more of a workout from politicians on the right — wonder why that is — so it’s telling that the concept was associated early on with Gingrich and his merry men.
The New York Times, not normally an outlier when it comes to contemporary usage, nevertheless defined “dog-whistle politics” thus in 2005: “handful of emotive issues that will hit voters like a high-pitched whistle,” ignoring the point of the metaphor but not exactly inaccurate, either. In the more regulation sense outlined above, the phrase has always been more common as an adjective, most often modifying “politics,” but it is also available as a noun. While it is possible to use it in other contexts, it is a political term par excellence. It captures one of the many kinds of duplicity required of politicians, though it’s more roundabout than the usual “I know an easy way to make everything better, and it won’t cost you a penny.”
In case anyone is wondering if the presidential election may have prompted this week’s musings, maybe you’re right. But for the most part, Trump didn’t bother with dog whistles, and that was one of the most extraordinary features of his campaign. Hillary did use dog whistles to talk to investment bankers, but she certainly made no effort to convince coal miners that she had a deep connection with them. It’s not entirely true that Trump dispensed with dog whistles, but coyness is not one of his attributes — which doesn’t mean he’s honest — and he did best when he trumpeted the yearnings and grievances of the white right. If the dog whistle loses its raison d’être in our politics, Trump will get the credit — or blame. Dog whistles are dishonorable, but they are also an acknowledgment of an accepted range of political discourse that does not permit slander, baseless accusations, or entirely fabricated “facts” to become the stuff of campaigns. When Trump wanted to fire up his base, he didn’t bother with the subliminal. And it worked better than anything else Republican presidential candidates have tried lately. Maybe all it proves is that many Republican voters don’t like indirect messages because they’re too dumb to interpret them. Give ’em a little rhetorical red meat, and they’ll follow you to the ends of the earth.
Thanks as always to lovely Liz from Queens, who has contributed countless expressions to the blog and continues to do so. My cup runneth over.
(1980’s | businese | “sure thing,” “fait accompli”)
“Done deal” always makes me think of the mob expression “made man.” The alliterative spondee lends both expressions the necessary sense of finality and irrevocability. I don’t know of any connection between “done deal” and organized crime; the earliest uses of the term I was able to find come out of the financial industry, soon absorbed into political discourse. As you might expect given its business origins, “deal” clearly refers to transactions, not cards, although I can imagine a casino employee responding to a poker player’s complaints with “Shut up — it’s a done deal.” Newsweek noted in 1985 that the phrase was a favorite of Treasury Secretary James Baker, and such early patronage by politicians favored its fortunes; there’s no doubt “done deal” is as useful in politics as in banking (or the Mafia, for that matter). Even today, the phrase turns up most often in financial and political news — not that they’re different. “Done deal” has now come to be used more often, if not predominantly, in the negative, to caution us that there’s no guarantee the contract will be completed as advertised (e.g., “this is not a done deal”).
“Done deal” originally referred to business maneuvers, but as politicians picked it up it came to mean any sort of dead certainty (a little like “slam dunk,” but used in different situations). A way of saying “we’re not going back” or “you can count on it.” A done deal need not actually be done, but the point is that even if the papers aren’t signed, they will be soon. It does seem to me that “done deal” is often used to refer to a transaction or agreement that is not yet formal or final; once the deal is truly executed, it is no longer necessary to call it “done.”
“Done deal” represents a form of grammatical displacement not uncommon among new expressions. The concept is an old one, so how did we express it in the old days? “Settled,” or more poetically “chiseled in stone.” In a simpler key, “all over.” These are all adjective phrases that cannot serve as subject or object. Commonplace ideas look for new parts of speech to inhabit, and nouns may slip into power where once ruled only adjectives. To some extent I am speaking fancifully in attributing will to words, which are but bits of breath and ink, but if you spend enough time observing the language, it’s easy to slip into the belief that words have life and motive independent of us, their creators but not their controllers.