
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

clawback

(1980’s | bureaucratese? legalese? financese? | “recoup,” “recover”)

No longer the sole property of sportswriters, this noun-verb complex has invaded the financial pages and legal journals in force. When I was young, you clawed your way back into a contest through determination and effort, not quitting until the game was on the line and you had a chance to win. It didn’t have to be a single game; it could happen over the course of a season, as in a baseball team clawing its way back into the pennant race. It might be used in the context of an individual sport like tennis or golf, but I think it more often went with team sports. In the business world, you might claw your way to the top, but you don’t claw back your way to the top — though you might claw your way back to the top. There’s something ruthless about clawing when people do it; it requires unreasoning vigor, like a jungle cat, blindly fighting its way forward as long as it can move.

In the late seventies, the U.S. began imposing treble (i.e., threefold) damages on defendants who lost certain kinds of civil suits. The U.K. responded by passing a law of their own that gave a British person or corporation the right to recover the portion of the total damages that was not actually compensatory (in other words, the punitive portion piled on after actual damages were awarded; if actual damages came to $1 million, trebling brought the total to $3 million, and the clawback recovered the extra $2 million). In both the British and American press, this was widely referred to as a “clawback provision.” The expression was much more common in the British, Canadian, and Australian press for at least a decade thereafter, and it is indubitably a Briticism.

My impression is that the expression refers mainly to something governments do, as in the Bernie Madoff case, but a corporation can do it, too; take Wells Fargo’s repossession of stock from disgraced executives in the wake of a banking scandal. I suppose that a business partner could claw back money that another partner had misused, but for the most part it seems to be something an institution does. Clawbacks normally occur when assets have been stolen or used illegitimately; when you hear the word, you can be pretty sure that some funny business has been found out, and a governing body, private or public, is doing something about it. (That isn’t always true; for example, when the British government was privatizing public industries in the eighties, they decreed that a certain number of shares had to be available to British investors. In some cases, that meant “clawing back” shares bought by foreigners to make sure enough shares were available.) The government generally needs some kind of judicial ruling, but a corporation needs no more than the approval of the directors.

In truth, the new expression here is “clawback” (n.), since “claw back” (v.) has been a permissible construction for a long time. (As we saw above, “clawback” also serves as an adjective. I hope I am cold in my grave before “clawbackly” becomes standard English.) But the verb’s present sense seems to have arisen around the same time as the noun’s, and I wouldn’t want to state with certainty that one preceded the other, though I would guess the verb came first. The expression has never left legal and political contexts or spread outward from them. Law and justice must have their own language.


man cave

(2000’s | advertese | “den”)

The evidence strongly suggests that man-caves are the creation of marketers, despite visible traces of the expression before the mid-aughts, which is when it starts turning up in bulk in LexisNexis. The phrasing likely owes a debt to John Gray, author of “Men are from Mars, Women are from Venus” (1992). While he did not, as far as I can tell, ever use “man cave” himself, he used the two words in close proximity, notably in the apothegms “Never go into a man’s cave or you will be burned by the dragon!” and “Much unnecessary conflict has resulted from a woman following a man into his cave.” In other words, let the old grouch suck his thumb and fiddle with his TV or his train set for a while. He’ll come out and make nice eventually. And if he doesn’t, it’ll be your fault. Gray’s biases aside, he was influential, and today’s more compact phrasing may claim him as an ancestor. Actually, the first use I found in LexisNexis is due not to Gray but to a Canadian columnist writing about house floor plans; she proposed that the basement be renamed “man cave,” because that is where men go to get away from their women. (She had in mind a damp, cobwebbed basement, not a home entertainment center. “Cave” is the French word for basement, so the use of “cave” is more intuitive in Canada than here.) Was author Joanne Lovering an early adopter or ahead of the curve? (Or ahead of the cave!)

But when “man cave” started showing up in quantity, it was purveyed by Maytag, of all corporations, which marketed a product called SkyBox, a vending machine for soda or beer that you could install right in your very own home. Fred Lowery, the director of Maytag’s “strategic initiatives group,” noted that “every guy would like to carve out his own little place in his home. Internally, we call it the man cave. And lots of guys, at some point, would like a vending machine in their man cave” (January 29, 2004). There you have it. Very soon, real estate agents began touting the things, sports promoters jumped on board, and it became a proper fad. No man cave was complete without a big-screen television and a sofa — video game consoles and sports-related items were also popular — and if not your very own vending machine, at least a dorm refrigerator, maybe even a full bar. What you won’t find is a workbench. The man’s retreat in my youth was likely to involve tools and at least the possibility of repair or construction. A few men still favor that, but these days it’s more about swilling beer while endless hours of sports unroll before your glazed eyes. Well, not really; what it’s really about is male bonding or just having a place to get away from your woman. The corresponding “woman cave” has not made much headway, a few sightings in the press notwithstanding, but all the ladies have to do is wait; sooner or later some savvy marketer will reap huge sums by convincing women they need their own gender-specific refuges.

“Cave” is an interesting word to use here; to my mind it calls up two different associations. First, of course, the caveman: brutal and self-reliant (actually, cavemen were much less self-reliant than we are). Primitive, crude, and therefore manly, the caveman lords it over his woman and slays giant beasts. Just what we all want to be, right? The second association with “cave” is a dangerous, unpleasant place where no sensible woman would set foot to begin with. They’re dark and treacherous, lairs of wild animals, drifters, or lunatics. Of course, that’s what he wants you to think, ladies. He has a giant-screen TV in there — how dangerous can it be? Just don’t get burned.

Why has “man” become such a common prefix in compound nouns since the dawn of the new millennium? Nobody says “man about town” or “man alive!” any more, but you can’t get away from “man-hug,” “man-bun,” “man-boobs.” “Man cave” predates some of these, though “man-boobs” dates back to 2003, according to Urban Dictionary. Is it a simple matter of dumbing down, the word “male” having become too complicated for us cavemen? Is it a wistful attempt to recover a lost sense of masculinity by reverting to the simpler (and therefore more primitive) term? Is it an attempt to express solidarity? “Man-splaining” and “man-spreading” go the other way, of course, used by women in solidarity, not men.


lanyard

(1990’s)

You’d be surprised at how many meanings this word has — anyway, I was. They’re all related to the notion of a strap (going back to Middle French and Old High German), and that association holds true even in these latter days. A hundred years ago, its primary meanings were a strong rope used to help secure a ship’s rigging, a cord used to fire a cannon, a strip of leather used to hold a snowshoe (for example) together, and one more nautical sense, which evolved directly into today’s use of the word: the cord a sailor hung his knife on so he could carry it around his neck. “Lanyard” has a few other miscellaneous meanings as well: chinstrap (on a hat), summer camp favorite (the lanyard as craft project seems to date back at least to the middle of the twentieth century), what the referee wears a whistle on (back to the sailors again), military decoration (worn on the shoulder), part of a safety harness. When used to help construction workers secure their tools, lanyards aren’t just for necks any more; they can attach to shoulder or wrist as well. By and large, the old meanings are still active, and probably no less common than they ever were. Around them has sprung up a new field that calls on lucrative forces of the Zeitgeist like security, fashion, and commerce, commerce, commerce. Now we have Lanyards USA, Lanyards Tomorrow, and CustomLanyard.net.

If the old meanings are still around and there aren’t any new ones, why an entry? Just my sense that the word has become vastly more popular. It may have meant a lot of things in its storied past, but it always had a specialized air about it. Nowadays, though, you hear it everywhere — everybody from lowly janitors to Super Bowl spectators wears one. And it’s being applied in ways it never was before. Now “lanyard” is what you call the cord or chain you hang your glasses on around your neck; in my youth, plenty of people wore their glasses around their necks, but not on lanyards. It was always true that the lanyard, whatever it denoted, had a strongly utilitarian cast. But not any more; lanyards still serve everyday functions, but they also should match your clothes or sparkle or advertise or say something interesting. An accessory at the very least, potentially more.

The lanyard revolution is more than anything a consequence of our efforts to keep ourselves safe, which have made us ID-happy. Why do you need a lanyard at the Super Bowl, or to get into your office? So you can display your credentials and prove that you belong there. Sure, people carry keys and other household objects around their necks, too, but your standard lanyard nowadays comes with the clear plastic ID-holder, so you never have to dig out your card to show the guard. In the seventies, members of a few professions were using lanyards for that purpose, but now almost any public employee and lots of private ones wear them as a matter of course. It’s a tacit acknowledgment that we have accepted increasing restrictions on our movements in hopes of preventing, or at least limiting, mayhem and bloodshed. Lanyards are an emblem of that loss of freedom, another component of the uniform that the wealthy, ever security-conscious (and for good reason), force on the masses. Adding insult to injury, the bosses and big money want us to regard the badge of servitude as just one more consumer good. If they succeed, we lose again.


latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the 19th century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “’emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail these kids today goes back a long time, too long and steady to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids generally do no worse than average; those studies share the fate of all research that calls easy explanations into question. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not want to relate such feelings to their parents, knowing that expressing doubt or anxiety will disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


hive mind

(1990’s | science fiction | “zeitgeist,” “will of the people,” “conventional wisdom,” “groupthink”)

It all started with the bees. The British apiarist H.J. Wadey probably did not invent the term, but he used it in the 1940’s to describe the process by which lots and lots of bees, each of which has next to no mental capacity on its own, work together to create an intelligence that cannot be accounted for simply by adding up the microcapacities of each bee in the colony. There was something a bit mystical about it, and that transcendent quality was picked up by other authorities on bees. From there it became the property of science fiction writers, for whom the concept was tailor-made. In their hands, it could retain the sense of a purer intelligence emerging from the collective, or it could be a means of imposing zombie conformity and obedience on the rest of us. Science fiction runs to utopia or dystopia anyway, and the hive mind can be used to exemplify both, even in the same book. The phrase had not become common outside of science-fiction circles; I doubt most Americans were familiar with it when I was young.

There the matter rested until the mid-1990’s, when the expression received the benefit of two cultural megaphones: first Kevin Kelly, founding executive editor of Wired magazine, then the film Star Trek: First Contact. Kelly saw the hive mind as the result of amplifying human capability with computers (preferably implanted) to enhance our collective intelligence and create a larger force, human yet superhuman, that would change everything for the better — although individual drones might not fare so well. A year or two later, Star Trek: First Contact came out, featuring the Borg as the villain. The Borg had appeared on Star Trek: The Next Generation (the Patrick Stewart cast, which also populated the film), but this seems to have been the first time the phrase “hive mind” ever appeared in a Star Trek script. The Wired geeks and the Star Trek geeks between them formed a critical mass, and “hive mind” emerged from the sci-fi shadows and began to be encountered much more often. The onset of social media certainly didn’t slow the spread of the phrase; here again, the concept may be beneficent or noxious.

Kelly was an optimist, positing that the computer-aided hive mind would lead to a much greater capacity to solve human problems, whereas the Borg represents the dark side, gobbling up plucky individualists and producing numbing conformity while enriching its own hive mind with the contributions of other civilizations (sounds like imperialism, or the one percent). My sense is that today the pessimists are winning; “hive mind” has become a favored grenade to toss across the political divide, as stalwarts of the right and left accuse their opponents of stupidly parroting the sentiments put forth by their respective opinion makers. On this view, the hive mind is simply an overlord to which the bad guys pledge dumb fealty. (Of course, both left and right have their share of unreasoning myrmidons, but I wonder if they may be more characteristic of the right wing. “Dittohead” is no longer fashionable, but it’s worth noting that only right-wingers called themselves “dittoheads,” often with pride.) Even if the insulting use predominates right now, the more hopeful meaning may rise again. Take UNU, for example, which promises to help us “think together” by setting up a “swarm intelligence.”

Once you get away from the notion of a literal superbrain, the metaphorical uses of the expression come quickly into view. A single brain can itself be seen as a teeming hive mind, with neurons equivalent to drones, each doing its tiny duty but producing prodigious results by subordinating itself. (A more recent issue of Wired showcases an example of this sort of analogy, which has no counterpart for the queen bee.) More generally, the hive mind may serve as a symbol of our politics, in which millions combine to create and support a unified national government. (If that idealized picture makes you snicker, you’re not alone.) Our national motto, E pluribus unum, means “out of many, one,” and that’s not a bad summary of how a hive mind works. No single individual knows everything or can do it all by herself; the nation must muddle along making the most of whatever contributions it can get from hard-working citizens, who create the polity by banding together, at least partly unconsciously, to assert a collective will.

This post was inspired by the one and only lovely Liz from Queens, who nominated “hive mind” only last week, thereby sparing me the trouble of coming up with a new expression to write about. Thanks, baby!


coffee date

(1990’s | teenagese? | “first date,” “brief encounter,” “not even a date, really”)

It’s tempting to see the rise of the phrase “coffee date” as concomitant with the rise of the gourmet coffee craze (which hasn’t abated), and the expression did become a lot more common around the time Starbucks did, from the late eighties to the mid-nineties. On-line dating services made their mark only a few years later and produced many more coffee dates, but the term existed well before that. Google Books fishes up a solid reference in Mademoiselle magazine from 1966 (not that Google Books’s dating is all that reliable). That article explained that the coffee date was the college equivalent of the Coke date. There’s no obvious origin for either phrase that I can find in my limited corpora; maybe it bubbled up from below.

College students being so mature and all, naturally they prefer coffee. But the point of the coffee date is not what you consume; it’s a probationary first meeting, which the parties use to size each other up. So it must be short, inexpensive, casual, easy to escape, and in a neutral, public place. Nothing much can happen, and that’s the point. If you hit it off, maybe a lunch date next. “Lunch date,” “dinner date,” and “movie date” are older terms — or at least they became common earlier — that imply a progression whose first step now is the coffee date.

Coffee dates have become so firmly part of the romantic how-to manual that a reaction has developed. While conventional wisdom still recommends them as sensible first meetings, certain apostates, such as this eHarmony blogger, dismiss them as old hat and unlikely to lead to serious relationships; others question whether they should be called “dates” at all. There are always doubters, but even they can’t deny that the dating landscape has changed, tilting the playing field decisively toward Starbucks.

speed dating

(late 1990’s)

An expression, and concept, with a verifiable origin. The on-line consensus — unanimous as far as I can tell — says that Rabbi Yaakov Deyo and his wife invented speed dating in 1998 as a way to encourage Jewish singles to meet each other and form relationships. It goes like this: between five and ten women sit at individual tables. The same number of men wait nearby. At a signal, each man sits down at a table and talks with the woman for eight minutes, then moves to the next table and does it again. In slightly over an hour, you meet several candidates, at least one or two of whom might be worth a follow-up. Now several national organizations sponsor speed-dating events, which may or may not have any religious, ethnic, or gender restrictions. The practice is sometimes known as “turbo-dating.”

I was struck by the ritual character of speed dating, which was after all created by a rabbi. The basics of the process don’t vary much regardless of who’s in charge: several conversations in succession, each a fixed period of time; then participants notify the organizer which live one(s) are of further interest, whereupon the organizer puts two people in touch if they appear on each other’s lists. Perhaps the level of rigor does not measure up to the detailed ritualistic instructions of the Torah, but there’s a rule-bound quality all the same. One site notes the roots of speed dating in the traditional Jewish concept of the shidduch (match), basically an arranged marriage made with the help of a middleman or -woman. At any rate, like many concepts invented by Jews, from monotheism to relativity, speed dating has spread quickly and exercised tremendous influence.
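Since we are talking mechanics anyway: the organizer’s bookkeeping amounts to a simple mutual-match rule, and a few lines of code make the ritual’s logic plain. What follows is only a toy sketch in Python; the function name and the sample data are invented for illustration, not anything an actual speed-dating outfit publishes.

```python
# Toy model of the organizer's role in speed dating: each participant
# hands in the set of people they would like to hear from again, and
# the organizer passes along contact information only when the
# interest is mutual. Names and data structures are invented here.

def mutual_matches(interest):
    """Return all pairs who appear on each other's lists."""
    matches = []
    for person, picks in interest.items():
        for pick in sorted(picks):
            # Require reciprocity, and order the pair alphabetically
            # so each match is reported only once.
            if person < pick and person in interest.get(pick, set()):
                matches.append((person, pick))
    return matches

cards = {
    "Avi":  {"Beth", "Dina"},
    "Beth": {"Avi"},
    "Carl": {"Dina"},
    "Dina": {"Avi", "Carl"},
}

print(mutual_matches(cards))
# [('Avi', 'Beth'), ('Avi', 'Dina'), ('Carl', 'Dina')]
```

Reciprocity is the whole trick: one-way interest never leaves the organizer’s hands, which is what spares everyone the embarrassment of an unreturned phone number.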

It’s another kind of prescribed first date and so is related to the coffee date, but it’s even more circumscribed. Like a coffee date, your chances of success are low but the investment of time and energy is small, and like a coffee date, it can only arguably be called a date at all. Speed dating is distinctive because of the sheer number of people involved; if you buy the theory that most of the time we decide in a matter of seconds whether we’re attracted to someone or not, the approach makes sense. Just get a bunch of generally like-minded, well-disposed people in the same room and let nature take its course. The irony is that while speed dating looks like it was designed to deal with a glut of possibly eligible partners, it was actually invented to keep members of a relatively small, insular group from finding mates elsewhere. (Of course, in a large city like Los Angeles, where Rabbi Deyo first tried out speed dating, there are thousands of unattached Jewish adults, still an impossible number to navigate on one’s own.)

When a reader asked advice columnist Carolyn Hax her opinion of speed dating, she replied, “I liked it a whole lot better when it was called a ‘cocktail party.’” The point is well taken; speed dating is a highly regulated version of what was once known as “mingling.” You went to a party with people you didn’t know, and you went around and talked to them, allowing you to determine who might be a possible romantic interest. No timekeepers or chaperones required, and if you wanted someone else to have your number, you gave it to them. I’ve never tried speed dating, but I was never much good at mingling, so something tells me I wouldn’t be much of a speed dater, either. Both of my long-term relationships began with dates that lasted several hours, so maybe that’s just how I roll.


all in

(2010’s | militarese? | “giving one’s all,” “bound and determined”)

“All in all.” “All-in-one.” “All in the wrist.” “All in your head.” “All in the same boat.” “All in good time.” Or you could just settle for “all in,” shorn of superfluous objects and uttered with quiet conviction. It means we won’t turn back; we won’t give in. But that’s not what it meant in my childhood. Back then “all in” meant “worn out,” “exhausted.” That definition was on its way out then, and the usage we see today represents a revival, doubtless an unnecessary one. In poker, it meant “having put all one’s chips in the pot” (which makes more sense). “All in” was a bit anomalous among the many vigorous expressions for states of lassitude. Most of them are straight predicate adjectives: “beat,” “pooped,” “spent,” “wrecked.” It reminds me a little of “done in,” but literally that means “murdered,” something much stronger. The old usage (citations date back to the nineteenth century in Lighter) is mostly gone, but I believe the term is still current in poker. (Ian Crouch gives a good account of the evolution of “all in” in the New Yorker.) In the modern sense, popularized by David Petraeus’s biography (2012), it also seems related to poker somehow, but in a more positive way — a confidence in the supremacy of your hand that causes you to bet your entire stack of chips without hesitation. But “all in” doesn’t connote arrogance or unseemly displays of power so much as steely resolve or unswerving attention to the task at hand. “All in” is what you are at the beginning of the day; it used to be what you are at the end of the day.

Theoretically it ought to be possible to be “all in” squared — bent on reaching the goal AND too tired to go on. But the effort required to maintain such commitment precludes helplessness born of weariness. Being all in implies that you have enough energy to figure out and make the next move, or enough force of will to overcome the newest obstacle. The other verb that precedes the expression is “go,” which reminds us of how closely it resembles “go all out,” a phrase much beloved of sports announcers in my youth. I don’t listen to play-by-play as much as I used to, but I have the impression we don’t hear “go all out” much any more.

“All” in itself implies a group, so “all in” should suggest effort toward a common goal, as in “we’re all in this together.” It may, but it doesn’t have to. It is possible to go all in on your own private project, but it might sound a little odd. When politicians and military people use it, there’s at least a hint of pulling together. That assumption of camaraderie is made explicit in what may prove to be yet another new meaning for the expression. Penn State University’s “All In” initiative provides an example, the motto being “A Commitment to Diversity and Inclusion.” Here the term is used very self-consciously to express the ideal of a tolerant, easy-going community. Donald Trump’s ascendance has given this sort of communitarianism a boost, and so I suspect we may see the expression used this way more and more. Keep your eyes peeled; “all in” may shed its skin yet again.


blended family

(1980’s | therapese | “stepfamily”)

Contested terrain semantically, as in other, more obvious, ways. Start with the definition. Nowadays, most people would probably endorse a relatively loose definition of “blended family”: any family formed when an adult with one or more children takes up with a different adult, who may or may not have children. If you’re a purist, you might require that both adults have at least one child. In 1983, a writer defined it thus: “pop-psychology euphemism for members of two broken families living under the same roof, a mixture of step-parents, step-children and step-siblings.” Ten years before that, a psychology textbook defined it as a “family consisting of a husband and a wife, the children of either or both from a previous marriage, and children of the present marriage.” The new spouses had to have kids together, not just with former partners. The extra distinctions may have been made possible by a wider panoply of related terms than we can remember now. A surprisingly large amount of vocabulary sprang up around such filial configurations; in 1980, the New York Times propounded the following list: “conjugal continuation, second-marriage family, stepfamily, blended family, reconstituted family and metafamily.” (It missed “merged family,” also in use by 1980. “Mixed family” means that the parents are of different race, ethnicity, or religion.) Of these, only “stepfamily” would be familiar to most people in 2017, but Wikipedia distinguishes between stepfamilies (only one adult has a pre-existing kid) and blended families (both adults). According to the OED, “stepfamily” goes back to the 19th century; the earliest citation I found for “blended family” dated from 1964.

Why did “blended family” win out? Probably the usual mixture of euphony and accuracy, or intuitiveness. Most of us understood pretty quickly what it meant the first time we heard it in context, and it sounds good — not too long, not too short, scans nicely. “Second-marriage family” is clunky; “metafamily” is jargony and doesn’t make a whole lot of sense anyway. “Blended family” sounds a lot better than “reconstituted family” (just add water!), you have to admit. The only mystery: why didn’t “merged family” catch on?

We like to think that the quirks and foibles of our own generation are unprecedented, but blended families are hardly new. My father’s father grew up in one after his mother divorced his father and married her second husband. My mother’s mother was the daughter of a second marriage, an old widower and a young wife. Life expectancy was lower then, so remarriages were more often occasioned by death than divorce. Was there a decline in the number of blended families for a generation or two, long enough to forget how common such arrangements used to be? If so, the phenomenon has come roaring back. Somehow, before 1970 or so, we got along without a general term for it. Now we’ll never get rid of this one.

There may have been earlier examples on television, but “The Brady Bunch” was the first show to feature a blended family week after week, thus perhaps making the whole idea seem more wholesome. It is doubtful that the sitcom had much effect in its time, given its poor ratings and reviews, but pop-culture observers agree that it had a long and powerful afterlife among those of a certain age (mine), for whom the Brady Bunch is part of a comforting nostalgic penumbra (accent on “numb”). Several shows about different varieties of blended family have succeeded Mike and Carol and Sam* and Alice: Full House, Step by Step, Modern Family. The Bradys anticipated a trend; their descendants follow along behind, trying to catch up to everyday life. The Stepfamily Foundation started life in 1977; support groups and talks at the local library aimed at blended families seem to have arisen in the eighties, when the requisite self-help books also began to appear. New terms must surely arise to reflect new conditions, but the rule is that only one or two out of a larger number will make it to the next generation and a shot at immortality.

* The butcher. Remember?


victory lap

(1980’s | athletese | “bow,” “youthful exuberance,” “rubbing it in”)

I’m not quite sure when the custom originated of taking a victory lap after a race. The first instances of the phrase I’ve found turn up in the context of auto racing in the fifties and sixties, but runners have probably been taking victory laps at least that long. (Victory laps are reserved for runners and drivers; horses are spared.) For all I know the Lacedaemonians or Athenians made an extra circuit of the stadium after trouncing the other, and being Greek, they must have had a word for it. The purpose of the act seems simple enough: it gives the adrenaline a little time to subside and the athlete a little time to soak up adulation. As late as 1980, the expression was restricted to track, racing, or occasionally non-athletic contests, like the Miss America pageant, already a small deviation from the literal.

In its figurative sense, the term is used most often by political journalists, though not exclusively; musicians and fashion designers may take victory laps, too. (In student slang, a “victory lap” refers to a year of high school or college beyond the usual four.) The first specifically political use I found appeared in the Washington Post, describing Reagan’s meetings with assorted government officials after he won his first term in 1980 (he pledged “cooperation with all,” as new presidents customarily do). Non-racing athletes also rated the term around the same time; I was somewhat startled to discover that as early as 1983 Reds’ catcher Johnny Bench’s last season was referred to as a “victory lap.” When a well-known athlete announces retirement far enough in advance, he may reap respectful send-offs at opponents’ stadiums as well as his own. Sometimes it’s called a “victory tour” to give the whole exercise a grander sound; either way, it’s all about adoring crowds, which is what politicians are after, too. Even today, “victory lap” denotes the acts of elected officials more often than not. As far as I know, neither man used the term, but the post-election travels of both Obama and Trump were widely described as “victory laps”: Trump’s thank-you tour and Obama’s last round of visits to European capitals. In the latter case, the phrase didn’t evoke any particular triumph so much as a sense that it was Obama’s last chance to talk up his achievements.

The rise of this expression in political journalism has given it an unsavory connotation. Victory laps used to be joyful celebrations, perhaps not always spontaneous, but at least a moment of innocent exultation shared by athlete and audience. A certain amount of self-congratulation was involved, to be sure. But the politician’s victory lap generally has more to do with exaggerating an achievement or rubbing salt in the wounds of the defeated. It is a thoroughly calculated gesture, at worst malicious and at best indulged in purely for its own sake. Politicians are forever being taken to task for taking crass advantage of such opportunities for self-promotion, either because the victory is illusory or because the victor is crude and ungracious. That tendency hasn’t changed and seems unlikely to.


in denial

(1980’s | therapese | “hiding one’s head in the sand”)

My guess is we owe today’s prominence of “denial” in psychological lingo to Elisabeth Kübler-Ross’s stages of grief. I doubt we would have “in denial” without the predecessor; the phrase as we use it now didn’t turn up before 1970 anywhere I looked. The term and associated concept — refusing to believe that which is clear to others, as by failing to acknowledge an emotional or psychological state, or even sheer physical reality — were already in existence, but Kübler-Ross’s “On Death and Dying” (1969) was very influential; one of its effects was to make denial seem an experience common to nearly everyone. Not long after, the term became popular among counselors of alcoholics and other drug addicts who refused to admit they had a problem. “In denial” may be merely a compressed version of “in a state of denial.” It appears to be the most common phrase descended from “denial,” but not the only one; Pam Tillis hit the country charts in 1993 with a song about Cleopatra, Queen of Denial (though I’m pretty sure the redoubtable Rev. Billy C. Wirtz had used the joke before then).

“In denial” has been in use for a long time in other contexts, but the grammar is new. Now the phrase is most common as a predicate complement (e.g., “You’re in denial.”), possibly followed by “about,” but not “of.” In the old days, when it followed a verb it had to be active (e.g., “result in denial” or “engage in denial”). Of course, it appeared everywhere in legal prose (e.g., “in denial of the motion”), and it started to bob up in political contexts in the eighties, particularly around the time the Iran-Contra revelations were unraveling Reagan’s second term. It was kinder to say Reagan was in denial than to contend that he really didn’t know what was going on. Maybe this is one of the many terms Reagan helped into the language directly or indirectly, or maybe it would have happened anyway. By 1990 it had made its mark, though ace sportswriter Thomas Boswell put it in quotation marks as late as that spring. No surprise that it became popular — it’s compact and it packs a punch. The expression conjures a state of passive malignity or dangerous indifference, willful or not; like “passive-aggressive,” it’s always an insult.

Now “in denial” is entirely standard, eligible to be adapted to all sorts of uses, including humor, irony, and wordplay. (Here’s a bouquet of suggestions for compilers of rhyming dictionaries: “infantile,” “spin the dial,” “undefiled,” “linden aisle.”) I haven’t heard “SO in denial” or “in deep denial,” but I don’t get around much; both certainly lie within the universe of possible utterances. Or “Live in denial,” which may also be heard “living denial” (as in “Girl, you are just living denial 24/7”). “Oh, he’s such an old in-denial crocodile” could be the next catch phrase. “Hit denial on the head” might be a self-help slogan, meaning something like overcoming obliviousness and seeing the world without illusions. Why not “The In Denial 500,” which pits the nation’s most noxiously clueless bachelors against each other to see who can act the most idiotic? For you tongue-twister fans out there, it’s not much, but it’s the best I can do: Say “undeniably in denial” five times fast.
