(1980’s | bureaucratese | “order,” “assign,” “give a job to”)
This verb never quite went away, as it turns out. “To task” is very old, and it persisted for centuries, turning up in Shakespeare and in both Johnson’s and Webster’s dictionaries. According to Google N-grams, there were more instances of the verb (I used the word “tasked” as a search expression) in 1900 than in 1940; it did not appear as often between the 1930’s and the 1970’s as it did before or after. The lapse of a couple of generations was sufficient, however, to prompt several influential journalists to object to the verb’s revival in the eighties. The redoubtable Helen Thomas took Robert McFarlane, Reagan’s National Security Advisor, to task over his use of “the noun ‘task’ as a verb” (November 20, 1985); William Safire and George Will both deplored the same usage just a couple of years later, as the Iran-Contra hearings were giving the verb an airing. Its route into everyday language runs through government officials, especially those associated with the military or espionage. It has spread to all fields now, used easily in sports and entertainment writing and everywhere else. One wonders if “multitask” would have taken off as it did if the root verb hadn’t trickled into the mainstream in the eighties.
The meaning of the verb was not much different in 2000 than it was in 1900. In the olden days, there was a greater tendency to use “tasked” to mean “burdened”; use of the verb strongly implied that the duties prescribed were unwelcome or excessive. That may be true today, but the link is not as strong as it was back then. It’s basically the same word as “tax” — also both noun and verb — but it has long had the meaning of “prescribed work” as opposed to “prescribed levy.” You might see “overtask” used as a substitute for “overtax,” for example. It may be a metathesis analogous to the Middle English “aks” turning into the modern “ask.”
By 1990, certainly, there were several possible ways to use “task” as a verb. First, it could be transitive or intransitive, although it was usually transitive, which we can discern from the fact that it was so often used in the passive voice. If it was not followed by a direct object — the unfortunate person who had a job dropped on her plate — it was followed by a preposition, usually “with” or “by” (there’s that passive voice). Or it might be followed by an infinitive, as in a phrase like “tasked to make the donuts.” What would be the alternative? “Tasked with making the donuts.” Semantically, there’s not much difference, and I don’t believe we should attach too much importance to the grammatical distinction. My ear and LexisNexis agree that by now “task with” has won out over the other variants as the predominant verb phrase.
There is a small but plucky group of expressions whose members have been around for at least a century or two but have either never been used commonly or have undergone some kind of eclipse before flowering in our era. I call the roll for the benefit of future generations: “overthink” had disappeared long since, but now it’s ordinary. “Hurtful” spent five hundred years as a word that sounded wrong but has spent the last thirty proliferating. “Ramp up” has meant several different things, but it has never in its long life (it goes back to Middle English) gotten the workout it has gotten since 1990. “Template” is a technical term dating back to the eighteenth century whose use has spread and soared. “Life lesson” and “bloviate” date from the nineteenth century. The former was used infrequently by philosophers, poets, divines, and no one else until 1990 or so. “Bloviate” is similar to “task” (v.) because it fell into disuse during the mid-twentieth century. “On task” must bring up the rear; it has little linguistically to do with this week’s expression, despite sharing a headword.
Martha and Adam from Queens suggested “task force,” which turned out to date from the thirties and forties but did remind me that “task” as a verb (using it in the infinitive — “to task” — never sounds right somehow) had been on my list for a while. Another victory for the Queens contingent!
(1990’s | businese? governese? | “appearance,” “perceptions,” “p.r.”)
While this expression is used mainly in political circles, it may not have originated there but in the business world. Not that there’s a great gulf fixed between them, or anything. “Optics” is a word for how things look, and it is used mainly by officials and journalists, though one comes across sightings in other fields now and then. “Optical” is an occasional adjective variant, or used to be. We’re not talking about binoculars and gunsights, even though “optics” may be used collectively to refer to devices with lenses. “It’s bad optics” means “it looks bad” or “it smells bad” or “it leaves a bad taste” — who would have thought such a humble expression the occasion for synesthesia?
Even though “optics” remains much more common in the Canadian press than in the U.S. press to this day, the earliest hits I found on LexisNexis (1986 and 1987) attributed the usage to American businessmen. It started to sound less exotic in the 1990’s, at least in Canada. Colleagues reported that it was a favorite of Jeffrey Skilling of Enron; he was listened to respectfully in the late 1990’s and early 2000’s, and his advocacy may have given the word a boost. Macmillan Dictionary’s blogger suggests that 2011 was the year all hell broke loose. I wouldn’t call “optics” in this sense a common word, even today, but you have to know what it means to follow the news. I prefer to think our use of the term comes from Canada; then it would join “cougar,” the only other expression I’ve covered with a clear Canadian origin. Ben Zimmer of the New York Times makes the case.
Part of the point of this word is that the institution that looks good, or bad — usually but not always the government — is assumed to be in control not just of what it’s doing but of how it comes across. Creating a favorable image is part of the job; “bad optics” are caused by lapses. The phrase is confusing, because it ought to mean inadequate vision; it sounds like a deficiency on the part of the observer rather than the agent or creator. But the ocular capabilities of observers are not in question when we discuss the optics of a situation or proposal; everyone can see the results of the latest triumph or gaffe. Attention to outward appearances, deceptive or otherwise, is as old as politics, but in recent years U.S. government officials have become much more open about attributing public resistance or discontent simply to poor “messaging,” as they say nowadays, or “public relations,” as we said in the prehistoric 1970’s. I associate this posture most strongly with Donald Rumsfeld and Dick Cheney, who still refuse to admit that there was a strong case against going to war in Iraq. The fact that nearly everyone else sees it that way only means that they failed to manipulate us effectively.
“Optics” serves the usual political ends of language to some extent; it is mildly obfuscatory, forcing the listener to waste precious seconds figuring out what the spokesperson is actually saying instead of focusing on the malfeasance being covered up. A writer in the Toronto Star (May 19, 1997) noted that the use of the word “optics” itself constituted a “dead giveaway that something unseemly is about to happen.” Politicians must walk a fine linguistic line, burnishing their reputations without committing themselves to anything. That requires in turn a lot of sidling up to what you mean rather than stating it clearly. It’s not a matter of flat-out lying, more a moment of misdirection long enough to distract voters from the latest scandal. “Optics” is just one more expression that helps them do that. But it has become common enough that it doesn’t serve the turn so well any more. New expressions must arise to pull the wool over our eyes.
(1980’s | teenagese? | “spare tire,” “middle-age spread”)
I remember from late childhood or early adolescence a tableau on the Special K cereal box: a waist-length photo of a woman with a faintly accusing expression asking, “Can you pinch more than an inch?” If you took a firm grip on the outermost part of your belly fat, was the resulting wad more than an inch high? Those were the days before the great fitness craze of the eighties, before liposuction became popular. Exercise and a healthy diet were the only ways to keep from getting disgustingly fat, and those were the values Kellogg’s was trying to promote, or give the appearance of promoting. (Special K has always been pitched to believers in fitness and healthy diet. It continues to strike me as an inexplicable brand name for a cereal, but not as ominous as “Product 19.”)
This week’s expression owes nothing to Kellogg’s, but it did come into its own at the same time as aerobics (“six-pack abs” and “no pain, no gain” are other new fitness-related expressions). Not long after the lady left the cereal box, “love handles” became an accepted term for that very specific variety of belly fat on one’s sides at the waist. Not to be confused with the beer belly, or saddlebags, deposits of adipose depending from the hips and thighs. Hip fat may be called “love handles” (do you notice? it’s never singular), but that usage is imprecise. For the last ten years or so, hip fat has also been referred to as “muffin tops,” but that conjures up an even more specific picture: rolls of hip and waist fat pushed upward and outward by tight jeans, forming a distinctive muffin-type silhouette. (Here is a fun list of other fat-related anatomical features, and another.) We delight in vocabulary that makes fun of our anatomical deficiencies, but most such expressions don’t spread far or have much staying power. The poetic “grab of flab,” another way to say “love handles,” never seems to have made the big time. Why has “love handles” lasted so well? It names a very common anatomical feature, for one thing; we need a word for those particular rolls of belly because nearly everyone over thirty has them. It sounds pleasingly warm and fuzzy and has a jocular quality that probably has helped make it attractive. “Muffin tops” will probably also persist, but it is more dependent on fashion; there is less need for the expression when tight jeans go out of style.
The origin of “love handles” is obscure; according to Lighter and Google Books, it was first recorded in the late sixties in glossaries of college slang. The phrase stayed mainly in the shadows throughout the seventies but began to dip its toes into the mainstream by the end of that decade. Well into the eighties it was placed delicately in quotation marks and frequently glossed; such niceties were not necessary by 1990, although you may see the phrase in quotation marks to this day, a sign that it still sounds slangy and not quite reputable. There’s a reason for that: Richard Spears’s Slang and Euphemism (1981) defines “love handles” as “fat on the sides of a man or woman held onto during copulation.” It took thirty years, but the sowers of quotation marks have pretty well lost the battle and the term has become respectable, used indiscriminately by doctors, advice columnists, and stuffy professors. As the phrase has become familiar, the prurient connotation has worn away, causing many to wonder why they’re called “love handles” when no one loves, or even wants, them; as a result, you see “hate handles” occasionally used as a synonym, in the manner of “slim chance” and “fat chance.”
(1990’s | enginese | “collection”)
There is a little complex of phrases here: “harvest [v.] data,” “harvest [n.] of data,” and “data harvest [n.].” None was in common use before 1980; LexisNexis and Google Books both suggest that they hadn’t made much headway as late as 1990. “Harvest [n.] data” originally referred to quantities of crops reaped or game hunted, and often still does. The first citations in today’s sense, which appeared sporadically in the eighties, mostly seemed to come from the space program, often as “harvest of data” from a telescope or spacecraft. The implication was abundance; when scientists uttered it, they were usually boasting about the capabilities, or hoped-for capabilities, of a new piece of equipment that was going to provide us with all kinds of new observations. That’s positively innocent when set alongside the more sinister sense the phrase has acquired in the internet age.
Somewhere in the mid-1990’s, computer industry executives began talking about harvesting data about what people were doing on-line, which was simply an expansion of a longstanding practice — market research — into new fields. That was when the term came to mean corporate, computer-driven aggregation and storage of personal information, which we now take for granted. I did encounter one anomalous use in a Washington Post article in 1994 about internet access service offered by the state of Maryland that permitted the user to “harvest data” about the state. That heartening notion of empowered consumers using the web to collect information has not persisted, and now we think of puny proletarians plucked clean of every potentially pertinent preference, practice, or pattern, permanently pinned in the pitiless panopticon produced by predatory purveyors.
“Data harvest” reminds me of “organ harvest,” also a relatively new expression with unsavory implications. In both cases, the purposes are legitimate, perhaps even commendable, but the way they are carried out leaves a bad taste. The connotations of “harvest” are changing from comforting and wholesome to devious and greedy. For thousands of years, a successful harvest was cause for thanksgiving, a time to rejoice and look ahead to better days. Even a poor harvest marked the end of an annual cycle and might spark hope for the future, in the manner of Dodgers’ fans crying “Wait till next year!” But now the harvest feeds only a select few; most of us sow but do not reap.
(1990’s | “jazz dab”)
We all know what this is, right? It is the smallest recognized denomination of facial hair, a tuft (or wisp, or dot) immediately under the middle of the lower lip. If it extends down to the chin, it’s a chin stripe. If it extends beyond the chin and comes to a soft point, like a paintbrush, it’s an imperial. Some men may have dense hair all along the lower lip, but it only counts as a soul patch if it’s centered — they’re never more than an inch wide, usually less. And if you grow whiskers anywhere else (except possibly the upper lip), it’s not a soul patch any more; it’s just part of another configuration. Here’s a reasonably comprehensive chart that illustrates different categories of facial hair.
As for the history of this particular beard style, there’s a firm on-line consensus that it was originally popularized by Dizzy Gillespie in the 1950’s and caught on among beatniks like Maynard G. Krebs, who wasn’t a real person, and who wore something closer to a goatee most of the time, but never mind. Full beards came back in style in hippie times, but in the 1990’s disaffected Gen X’ers (was there any other kind?) took it up again. Not until the nineties was it called a “soul patch,” however. Gillespie seems to have called it a “jazz dab,” and I’ve also seen “mouche” cited as an older term for the same thing. Neither is familiar to me, but I (a disaffected Gen X’er) came along too late. You see “flavor saver,” which is newer, on-line, though that could apply just as well to a mustache. I’ve never heard anyone actually utter that phrase, but I’ll concede that it’s a time-honored function of facial hair to preserve bits of breakfast where they can embarrass us later on. Teenage boys trying to grow facial hair are often told they have dirt on their faces; in such cases “soil patch” might be appropriate.
It is not obvious why we should use the phrase “soul patch” to denote the typewriter eraser brush under the lip. It probably has something to do with “soul” in the musical sense (Dizzy Gillespie, remember?), and it does sound better than “jazz patch.” I uncovered an oddity about the phrase in the course of researching it: while many on-line sources chronicle the history of this tonsorial arrangement, several with reference to other terms mentioned above, hardly anyone speculates on the origin of the phrase itself. My sources agree that it did not exist in print before 1990, and LexisNexis shows that it was in use by 2000 and generally appeared without a gloss. Usually on-line commentators engage freely in etymological speculation, even if they aren’t any good at it. When I did “go commando” a few weeks ago, just about every page Google coughed up yielded some unsubstantiated or insubstantial theory explaining how the phrase arose. But no site accessible from the first ten pages of Google’s search results offers even the most casual hypothesis for the origin of this expression.
Soul patches, like the hipsters they often adorn, frequently find themselves objects of ridicule. Even defenders admit that only a select few look good with a soul patch, while others abhor middle-aged men who wear them in hopes of passing for young and hip. I don’t like them myself, but I’ve worn a full beard most of my adult life on the theory that trimming it occasionally takes much less work than shaving every day. The soul patch may require less sculpting than the chinstrap or many other styles, but you still have to shave everything else all the time to make it work. If it doesn’t improve your appearance or save labor, why do it?
(1980’s | academese | “attention,” “meditation,” “being in the moment”)
“Mindfulness” is not a new word, but it has a better claim to newness than “mindful,” long in common use, meaning “well aware” (occasionally it means “considerate”). You used the word when talking about something that demanded more than ordinary attention, or caution. One was mindful of bitter past experiences, or of threats, or of risks. I remember my Sunday school teacher exhorting us to be mindful of scriptural principles. In other words, it shouldn’t just be one more thing rattling around up there, it should fill your mind (if it were a noun, you might say a mindful). It wasn’t a casual word, but a portentous one. “Mindful” may still be used that way, but “mindfulness” never is.
The term today also refers to heightened awareness, but the object of attention is that which confronts your consciousness right now. In this sense it comes straight out of Buddhism. The eightfold path prescribes mental habits; number seven is “right mindfulness.” “Mindfulness” as we understand it was probably invented by Dr. Jon Kabat-Zinn, a professor at the University of Massachusetts Medical School, who founded the Mindfulness-Based Stress Reduction program in 1979. He turned up in the news occasionally in the eighties. Another longtime proponent, Dr. Ellen Langer of Harvard, seems to have come on board a few years later; she used the expression as a book title in 1990 (the book, according to amazon.com, did not cite Kabat-Zinn, or even mention his name). To be sure, their focuses were different. Kabat-Zinn emphasized bodily relaxation through techniques very similar, if not identical, to what we used to call meditation: concentrate on breathing and sensations, keep returning to monitor them when the mind wanders, note your feelings without judgment, step back and try to experience your mind and body neutrally. Langer thought of it as more of an intellectual exercise; one method she prescribed was to “watch [television] as though [you] were someone else — a politician, or an athlete or a criminal. The point is to break through people’s assumptions with an active attention that stimulates their thinking” (quoted in the New York Times, March 4, 1986). Langer doesn’t talk about listening to your body, but both theorists emphasize conscious control of the mind and keeping your focus on what’s going on right now.
Kabat-Zinn and Langer are high-powered professors with strong reputations. I’ve written elsewhere of the spread of Eastern religions in the U.S., but it’s not just gurus and rock stars; academics have often gotten into the act. It’s hard to see Kabat-Zinn as doing anything other than smuggling an anonymous version of Buddhist thought into the scientific community, buttressed with impressive studies proclaiming its beneficial effects. Langer seems more focused on business and social relations, so the connection to the mysterious East is weaker. But it’s there either way; mindfulness involves increased attention, though practitioners may differ on whether one should concentrate stubbornly on what’s going on inside the head or outside the body. To a good mindfulist, that’s an illegitimate distinction, or it should be. The point is to connect your consciousness with the movements and rhythms of the body, and that does tend to make any chasm between the two seem imaginary.
“No pain, no gain” and “karma,” also adapted from Asian religious wisdom, have been treated much more roughly than “mindfulness”; their American definitions contradict the original meanings and have pretty much obliterated them. Perhaps because mindfulness has always been promoted by academics and other elites, it has held onto something closer to its original denotation.
Business leaders, athletes, and politicians all discourse solemnly on the benefits of mindfulness. It has become an industry, with many experts and many web sites, and it has many of the characteristics of a fad. Its popularity is starting to provoke a backlash: a recent column on salon.com decries mindfulness training in scholastic settings, and other voices depict it as a convenient way to hold students (or employees) responsible for feelings of stress that actually are imposed by their superiors. If the bosses spring for a few mindfulness classes, they don’t have to try to create a better work environment. To still others, mindfulness is just a nice word for mindlessness, as the comic strip “Pickles” had it. Turn off your critical thinking and creativity, and vegetate yourself stupid.
My thoroughly brilliant girlfriend gave me this expression to write about. Where would I be without her?
has left the building
(1990’s | journalese (sports)? | “it’s all over,” “isn’t coming back”)
As every schoolboy knows, we owe this phrase to Elvis, or rather to a promoter at that fabled Louisiana Hayride concert in Shreveport. See the one and only Straight Dope (if you read the entire article you’ll be treated to a virtuoso catalogue of synonyms at the end). Hell, it’s even on Youtube. The phrase dates back to 1956, and several on-line sources, some of them reliable, agree that it became a conventional way of announcing the end of Elvis concerts in the sixties and seventies, though even after reading several accounts of the origin of the phrase it isn’t clear to me how often it was actually used. I think — I hope — nearly everyone agrees by now that Elvis passed on in 1977, but if the tag line came into anything like general use then, there’s no trace of it in my usual sources (lovely Liz from Queens recalls that it was used in news coverage of his death, which was prolonged, not to say obsessive). A song written in Elvis’s memory used it for a title — J.D. Sumner had a minor hit — but one reference in the January 1978 Stereo Review aside (“a perfect metaphor for [Elvis’s] passing”), it did not push the phrase into the lexicon.
Used literally, “has left the building” arises often in reporting on fires or hostage situations, and in such contexts it means something closer to “don’t have to worry about that person any more.” The figurative meaning of the phrase has not changed much (see above). Despite the early association with Elvis’s death, it is unusual, though not unheard of, to encounter “has left the building” used to mean “has died.” There’s a note of finality, but it’s not as final as that, more like “quit” or “moved on.” It carries some of the same force as “stick a fork in him, he’s done,” but isn’t used in the same way. Even when the expression refers to someone else entirely, it conveys a quiet homage to the King, evoking if not invoking his name. It allows us to talk about him in the present tense without acknowledging his death.
It shows up only once in LexisNexis or Google Books before 1980: a description of a David Bowie show after which the crowd was told, “Bowie has left the building,” presumably a knowing reference to Elvis’s announcer. The first use I found in LexisNexis was due to George Vecsey, the mildly legendary sports columnist for the New York Times, in 1983. He referred explicitly to Presley’s concerts in an article about the St. John’s basketball team losing a tournament game to get across the point that they were finished — a not unreasonable transposition of the original idea that Elvis was not around to give encores or meet worshipful fans. It interests me that Vecsey was a sportswriter, because this expression reminds me a little of “it’s not over until the fat lady sings” (invented ca. 1980 by a Washington scribe in reference to my beloved hometown Bullets) or even “that’s all she wrote.” In 1987, David Letterman, or one of his writers, used “Elvis has left the building” to describe a home run, presumably as in “It’s gone!” Sports lingo has a way of inventing or absorbing expressions whose literal sense is more literary or artistic; “on the same page” is another example. Sports jockeys have a yen for the colorful and memorable, and they don’t turn up their noses at a big, splashy phrase.
There was an incubation period of twenty years or so between the time Elvis fans knew all about “Elvis has left the building,” and the time it became a necessary part of everyone’s vocabulary. By 2000 it was often cited as an example of a cliché, and it was possible by then to substitute someone else’s name. Today we use it with other names or even inanimate objects. Most people probably still know that Elvis was the first one to leave the building, but that may not be common knowledge in a generation or two. Who knows? Maybe someone will unearth this post from the dim and dusty (as opposed to the dark) web and learn the truth.
(2000’s | journalese (gossip) | “belly”)
Definitely a Briticism, which is not something I would have guessed. It appeared rarely in the U.S. press before 2005, says LexisNexis, by which time the Brits didn’t even consider it cheeky any more. “Baby bump” is a creature of the gossip pages and has generally been the property of celebrities. By now it is possible to use the phrase with reference to any pregnant woman, but it still turns up on the gossip pages an awful lot. Presumably the American expression “baby boom” acted as a midwife helping “baby bump” enter the language. Alternative usage note: In recent years demographers have begun using the phrase to denote a temporary increase in the birth rate, using “bump” to mean “spike” or “uptick” rather than protuberance.
My sense is that the rise of the expression paralleled the decline in baggy maternity dresses, which were still the norm in my childhood. Pregnancy has become glamorous and has perforce developed its own style, at least among those who consider style important. Flaunting the physical changes wrought by pregnancy, rather than concealing them or at least blurring the outlines a little, is a change in fashion as well as mores, and the strong association with celebrities confirms that the baby bump is regarded as a built-in accessory which women can dress, decorate, and display to attract attention to themselves and their blessed state. Then again, some celebrities may not want the extra attention. Chrissy Teigen recently responded to on-line speculation about her pregnancy by telling fans to “get out of my uterus.” I suspect the offenders thought they were just doing their job; it’s refreshing to learn that at least some celebrities miss the sensation of privacy.
When I was young, it was customary to talk about pregnancy as a state of being, not as a feature or possession. We said an expecting woman was “showing,” or “visibly pregnant,” but I don’t think there was really an equivalent for “baby bump.” The reluctance to show or mention manifestations of pregnancy was passing away even then, reflecting deeper changes in the intersections of individuals and society. Now the swollen belly has become just one more part of the body to show off, cheapening the sanctity of motherhood. That’s the moralist’s interpretation, anyway. It’s also possible to view the shift less censoriously as an evolution of convenience, offering an informal way to refer to a common physical condition, creating a different part of speech in the process and thus permitting greater variety and flexibility in sentence-making. (Many new expressions fall into this category.) Or simply a restless pressure to expand the language; writers are always looking for new ways to say old things.
Back in disco’s heyday, we did the bump. “Fist bump” has replaced “slap me five,” and chest bumps have become much more common. Why shouldn’t “baby bump” signify two prospective mothers bouncing their bellies together, in greeting or in solidarity? I guess that would be “belly bump,” wouldn’t it? Don’t get me wrong; I’m not trying to start a new fad.
(1990’s | teenagese?)
This evocative verb phrase is more of a head-scratcher than most, and I’m scratching as hard as anyone. There does not seem to be any convincing reason why “go commando” should mean “go without underwear.” The most common explanation found on-line is that commandos (see below) don’t wear underwear because it’s too much trouble keeping it clean when you’re on a mission, because it causes jungle rot, or because ferocious Scots warriors went without under their kilts. The problem is that even if it’s true that special forces never wear underwear, the reasons adduced for scanting scanties are not specific to commandos, but shared by all soldiers. I guess it sounds better than “go doughboy” or “go GI” or “go grunt,” but the association with the ilk of Navy SEALs seems fanciful at best. The expression does seem to be used more often of men than women; accordingly, it is not generally used to mean “go without a bra.” It’s the lower story.
The first citation in the OED dates from 1974, but it doesn’t start showing up in LexisNexis until 1996, when it was cited in a list of slang terms current among college students. More to the point, it was used on “Friends” by Joey (played by Matt LeBlanc) early that same year, which seems to have provided the impetus for “go commando” to enter our vocabulary. The phrase needed several years before it could be used without quotation marks and glosses, but most people recognize it by now. There is some dispute over whether the phrase is of British origin (probably not). The word “commando” goes back to the Boer War, where it referred to a raid or one who participated in the raid; the word comes originally from Afrikaans. Certain British troops were called “commandos” during World War II, and from there it entered American vocabulary. If “go commando” meant anything fifty years ago, it meant “act like a commando” — notably brave, relentless, or capable of quick, decisive action. Today, “commando” has a slightly musty sound, and the armed services don’t use it, at least not in the U.S. But we continue to honour the valour of those daring English soldiers who carried out assassinations and rescues behind enemy lines by naming an eccentric sartorial practice after them.
Maybe that word “daring” forms the bridge, as lovely Liz from Queens suggests. Going without underwear requires a devil-may-care defiance of convention and homely wisdom; forget everything you learned about what to wear in case you’re in an accident. It also suggests forgoing protection or a degree of safety, demonstrating courage and nerve, in which qualities commandos are unmatched. There is something exciting about dispensing with drawers — I remember in high school one of the more advanced boys (he had moved east from California) bragged to a girl that he wasn’t wearing any (unfortunately, I can’t remember exactly how he said it, but I’m pretty sure it didn’t have anything to do with commandos); he lowered his waistband an inch or two to demonstrate. The girl was suitably impressed. But whatever the cachet, it is a deeply personal decision, and many of us find forgoing that bottom layer uncomfortable or unhygienic.
Some on-line sources identify “freeballing” as a pre-1980 equivalent. It’s not a word I know, but that doesn’t mean it didn’t exist. Otherwise, I’m not sure there was an old word for it. And now we have one — language marches on. I’d like to thank lovely Martha from Queens for giving me this week’s subject! Always a pleasure to hear from my dedicated readers.