
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

word salad

(2000’s | journalese (politics) | “gibberish,” “incoherent speech,” “obfuscation”)

This expression recently underwent a significant change after a hundred stable years. The first citation I found dates from a psychiatric handbook of 1907, where it occurs in a discussion of dementia praecox, the old name for schizophrenia, more or less (they weren’t exactly the same, but that’s the closest term in modern mental health vocabulary). It hasn’t changed meaning in that context; a textbook published in 1970 gave the following: “A jumbled, unintelligible mixture of words, usually containing both real words or phrases and neologisms. This disturbance in verbal communication is most frequently found in advanced schizophrenic reactions.” By 1980, arts writers used it now and then to talk about writers like Gertrude Stein and James Joyce, both of whom were considerably more artful than your average schizo, but somewhat less syntactically or semantically forthright than Mickey Spillane, say. It took thirty more years before the expression came to characterize political speeches; the first consistent victim was Sarah Palin in 2008, but in 2016, both Trump and Clinton, widely different speaking styles notwithstanding, were accused of producing word salad. (Somehow this expression doesn’t take to the plural.) The older uses are still found, but in ten short years the phrase has become quite common in political commentary, in which it was never used before Sarah Palin took the national stage. Merriam-Webster On-line provides a history with plenty of examples.

Like “hive mind,” “word salad” has become a favored term of abuse, but it need not be an insult. When used to refer to the ramblings of the mentally ill, it probably was always implicitly insulting — and that origin continues to be felt as we use the phrase today — but literary critics may treat it as a neutral descriptor. Not long before the move into political discourse, “word salad” took on two new uses: one referred to a technique of creating spam e-mails that used blocks of unconnected words in order to fool the filters; more significantly, it started to imply deception, pointing the way to politics. The crucial difference has to do with volition; the schizophrenic babbles uncontrollably, but the purveyor of catch-phrases strung together so as to defeat interpretation is doing it on purpose. In political discourse, it may take either shading, and they’re equally insulting — a variation on the old Reagan cleft stick: if he knows what’s going on, he’s a criminal; if he doesn’t, he’s too out of it to be president. Whether you think Trump just doesn’t know any better or is deliberately snowing us, you probably think he shouldn’t have the job.

Now that “word salad” is firmly enmeshed in political journalism, it is anyone’s guess whether psychiatrists will continue to use it; they may be forced to find a new phrase if the old one changes connotation for good. As late as the nineties, it was pressed into service as the title of a computer game and an on-line poetry magazine, suggesting that it might yet be considered favorable, or at least eye-catching. Those days appear to be over.

Why salad, anyway? The idea of several heterogeneous ingredients, mixed but not blended together, seems to be at the bottom of it, though the expression probably hails from German or French originally, and I’m not certain “salad” carries the same mental picture in those languages. I’ve seen “word hash” offered as a synonym, but if there ever was a contest, “word salad” has won. It’s more memorable than “jumble” or “logorrhea,” that’s for sure (personally, I’d like to see “word avalanche”). And I like the idea of pouring oil (and vinegar) on troubled word salad.


who moved my cheese?

(late 1990’s | businese | “what the hell just happened?,” “now what?,” “now what do I do?”)

I was surprised to learn that the insubstantial book whose title gave us this week’s expression spent ten years on the business best-seller lists. Management guru Spencer Johnson published it in 1998, and soon it became immensely popular among bosses, who bought it in bulk for their employees — one commentator wryly noted that if your boss leaves a copy of “Who Moved My Cheese?” on your desk, you are about to be laid off. References in the press were ubiquitous from 1999 until 2005 or so, at which point hits in LexisNexis began a noticeable decline that has continued to this day. Such a decline is unusual. I can’t think of many locutions that have gone from widely used to infrequent: “cocooning” is one, “bobbitt” another. It seems plausible that expressions that arise from short-lived trends or specific people or events would be more prone to obsolescence. You don’t hear “peace dividend” or “Where’s the beef?” much any more. But there must be other factors; “truly needy” has pretty well died off, and we have as many poor people as ever. One continues to encounter “who moved my cheese?” almost invariably as a direct reference to the book, which most people regard as either a life-changing parable or an insult to our intelligence; it’s too popular to inspire indifference.

It was a skinny book with large type and lots of illustrations. The standard blurb read, “A management expert offers techniques for dealing with change in the workplace and in life,” although the titular cheese actually referred to your primary goal, toward which your path is strewn with obstacles. So the title might be translated as “I was making progress in the right direction, and then something changed and made everything harder.” Or, more simply, “Who messed me up?” Johnson’s moral: people who expect new circumstances, recognize them, and adapt to them are winners in the game of life. They get all the cheese they want. (Full disclosure: I detest any kind of runny or smelly cheese, so I have a personal antipathy to this particular book that goes beyond my usual resistance to feel-good corporate nostrums.)

Since the book is intended to persuade people to manage their lives and make the best of things, it comes under the broad umbrella of self-help; more than one observer places it in a distinguished lineage that began with Dale Carnegie (“How to Win Friends and Influence People”). The business world, contrary to its crusty, hard-boiled reputation, now goes in for touchy-feely — or pretends to — and “Who Moved My Cheese?” is a relatively painless, soft-soap way to tell employees to get with the program, to accept whatever the bosses throw at them, no matter how unfair. Like mindfulness training, business self-help books offer executives a way to avoid making the office a better place. Give away this little book, and you can be as callous, arbitrary, and profit-mad as you want.

The most important question is the most basic: Is this phrase an expression or just a title? “Who moved my cheese?” doesn’t have a discernible definition, and reviews of the book rarely explained the title in detail, beyond pointing out that the cheese stands for whatever is most important to you; it doesn’t even have to be work-related. We don’t use it in conversation and never did; how many people say “Geez, who moved my cheese?” when they find out they’re fired? And yet everyone who was sentient learned it back around 2000 and had at least a vague idea what it meant. Actually, the less you understand what the title means the more effective it is; that air of mystery rescues it from stultifying banality.


random acts of kindness

(1990’s | “kindness of strangers,” “being kind to others”)

The first use of this term in LexisNexis comes from a British source, which credits it to an American named Anne Herbert of San Francisco or somewhere out there in the Bay Area. The story goes that she saw graffiti that read “Random acts of violence,” and she created the counter-slogan the very same day. There have always been differing versions; the most economical I’ve seen is “Random acts of kindness and senseless beauty,” an elaboration on “random acts of kindness,” which was already in the air; there are several solid citations in Google Books before 1990, and an editorial in the December 1991 Glamour magazine used the expression. So Herbert didn’t invent it, but she improved it, and she gained plenty of credit for popularizing it. (Ben & Jerry’s was an early adopter.)

The breakout year was 1992. “Random acts of kindness” soon became talk-show fodder, then a bumper sticker, then the subject of a popular book. I think I first encountered it as a bumper sticker, though I can’t be sure — maybe around the same time as “visualize whirled peas”? It even inspired a short-lived movement that called itself Guerrilla Goodness, in which people went around putting money in other people’s parking meters or gratuitously helping senior citizens. The movement disappeared from view, at least under that name — which doesn’t mean it ceased to have adherents; The Random Acts of Kindness Foundation and Random Acts of Kindness Week (not to mention World Kindness Day) now carry on the tradition. In 1992, we still had George Bush’s evocations of the “thousand points of light” fresh in our memories, and the Guerrilla Goodness movement might be seen as a response, though I’m not sure its members saw it that way at the time. The movie Pay It Forward (2000) gave another boost to the phrase. “Pay it forward” itself has become a new expression, which comes directly from the movie title — a nice bit of inspiration on the part of whoever thought it up. As I’ve noted, fewer expressions arise unquestionably from films than you might think.

Originally the phrase invariably carried the sense of doing something for someone you don’t know and aren’t trying to butter up — as the Boy Scouts have always preached — but also whether they need it or not. But that’s slipping; people now blithely refer to random acts of kindness directed at friends and relatives based on knowledge of their situation. (But not enemies, generally. Let’s not have too much of a good thing.) The “acts of kindness” part is self-explanatory, but you have to keep an eye on “random,” often used to mean unmotivated rather than unconnected. For true believers in random acts of kindness envision a world-wide web of kindness evolving as more and more people chase down strangers in order to do something unexpectedly nice.

Schoolchildren are frequently encouraged to practice random or not-so-random acts of kindness, and this phenomenon has only grown since the late nineties. If training, especially early training, is destiny, we will have an unusually kind new crop of adults any year now. It could be happening, for all I know, at least among the young and powerless. The powerful continue to consider such things beneath them most of the time, trumpeting the occasional exception, which makes that much more of an impression due to its rarity.

Like Shakespeare’s mercy, random acts of kindness bless them that give and them that take, and as a practical matter, the benefits to the actor are touted as much as, or more than, those to the recipient. Doing something nice for someone you’ll probably never see again makes you feel better, improves your health, burnishes your karma, whatever on-line claims you can dig up. Your good deed might nonplus, or even irritate, the beneficiary, but it definitely gives you a hit of endorphins. If the recipient happens to pass it on, that’s a bonus. Many charitable acts have poorly concealed selfish motivations, so that the case for altruism often turns into the case for its opposite. Even the Golden Rule hints at self-interest by suggesting that the more the rule is exercised, the more likely each of us is to reap the benefit. Which presumably is where the notion of “enlightened self-interest” comes from. Maybe we should just settle for acts of kindness, random or not. Even the philosophers should let us get by with a few of those.


in the mix

(1980’s | journalese (business) | “one of a number of options,” “available,” “eligible”)

When you try to pin down the way in which this expression has changed since the 1970’s, it takes on a certain I-know-it-when-I-see-it quality. Mainly, you recognize it by what it is not: “in the mixture” or “in the mix of.” As we use it now, it takes no prepositional phrase, and is most often encountered as a predicate complement ending a clause. Starting in the 1980’s, we began hearing “in the mix” used as an adjective, answering the question “what” rather than “where.” It was already current in two different sources that far back: recording reviews and articles involving construction or other manifestations of materials science. “In the mix” is distinguished by its generality, not to be confused with “in this (or that) mix,” which is used only when a specific subject has already been defined. I thought of it as a musician’s term, but political and business reporters were using it by the late 1970’s, more or less recognizably as we do now.

It’s pretty clear that our use of “in the mix” comes out of materials science, where the expression applies to ingredients: When you make concrete, be sure to put gravel in the mix. I would prefer that it came from groovy sixties rock producers, as in “Bring out the horns in the mix.” (If you Google the phrase today, the first results that come up have overwhelmingly to do with music.) But the recording studio usage doesn’t allow for adding instruments that haven’t been previously recorded, unlike the other, which permits adding new components at a moment’s notice. Both senses have in common the idea of being fully integrated with the other people or elements, and that idea persists generally today, although it has become looser and more casual, so one might be simply one of several available companions for a trip to the bar Saturday night, not allied closely with anyone else in the group. “Mix,” it is true, used to be a synonym for “mingle,” as one did at parties, but “mix” in that older sense was strictly a verb, and no one at a party ever said “I’m going in(to) the mix now”.

Notwithstanding its origins in business prose, the expression has developed and retained a pronounced hip tinge conferred by younger people (or those trying to sound younger) and strongly associated with DJ’s and teenage movies. (There was even a movie of that title in 2005, about a DJ.) I’m not quite sure why. The phrase is short and punchy, which explains part of it; and the music biz may have aided its spread, which presumably would make it more attractive to the younger set. Whatever the reason, it has maintained that quality, so naturally the suits have gotten busy co-opting it; PBS has a television series called “In the Mix,” and any number of radio stations use the title as well. I don’t know what the future holds for this expression, but so far it has shown staying power and a certain amount of range.


play the race card

(1990’s | journalese (politics) | “appeal to one’s worst instincts,” “stir up trouble”)

Apparently a Briticism, which came as rather a surprise to me, considering the expression smacks so richly of American penchants for prejudice and poker. The earliest appearances in LexisNexis began in the U.K. ca. 1986 and didn’t show up in U.S. sources until 1990, though it took root very quickly (see penchants noted above). No hits from any country in Google Books before 1985, either. I would love to have a fuller understanding of the origin of “play the race card.” Few expressions have a clear origin or single inventor, but normally one finds isolated early examples preceding a flowering, or similar expressions serving as transitional forms. (In this case, an example might be Nixon’s references to playing the China card, presumably a play on “old China hand,” as one source suggests.) But this expression seems to have caught on more or less instantly, at least by linguistic standards. Some sources suggest that the O.J. Simpson trial lent it ubiquity in the U.S.

The other surprise was the discovery that in those early British instances, and in many early American ones, too, the race card was played by the majority, fomenting suspicion and hatred of a minority group. I’ve grown used to hearing the practice imputed to members of minorities, trying to claim special privileges based on past discrimination. But it was originally a left-wing attack phrase, used of nationalist or anti-immigrant parties in England. Jesse Helms’s 1990 campaign for Senate against Harvey Gantt (who was African-American) ran an ad accusing Gantt of favoring racial quotas, whereupon Helms was condemned for “playing the race card.” It worked; he came from behind to win a close election. By 2008, Republicans routinely accused Obama of the tactic; actually, right-wingers are happy to claim anyone, black, white, or red all over, is playing the race card. No matter which side does it, it is more than a breach of etiquette; it is dishonorable, a matter of taking unfair advantage. (It also constitutes a form of intimidation.) Which is a little strange, because in poker (or, more likely, bridge, as Lovely Liz points out), there’s nothing suspect about playing a card; it’s part of the normal course of the game. When transposed into politics, it becomes a low-down act, but maybe that says more about politics than cards.

The expression has spawned a few imitators; one hears occasional references to the “gender card,” “religion card,” “terrorist card,” or other nonce cards — but none as common, or quite as venomous, as “race card.” One rarely acknowledges playing the race card oneself; it is an accusation. Nor does one admire deft use of the race card, even when played effectively. Like negative campaigning, push polling, and plenty of other dubious political practices, everyone deplores it but will happily engage in it if it has any chance of working. Who says bipartisanship is dead?



risk-averse

(1980’s | academese (economics) | “cautious”)

This expression carries a couple of odd dichotomies considering how straightforward it appears. The most obvious pertains to that which it modifies; either persons or corporate bodies — whatever the Supreme Court says, they’re not the same — may be risk-averse, though presumably the risk-aversion of a corporation is ultimately traceable to individuals, whether executives or independent shareholders. More interesting is the fact that risk-averseness may proceed from two entirely different kinds of experience. A conservative corporate board avoids sudden shifts and grand initiatives because they feel prosperous; there’s no incentive to rock the boat. Yet it is a tenet of pop psychology that those who have lived through times of deprivation are suspicious of all but the safest investments, and, in extreme cases, may refuse even to keep their money in banks. (Both sides have in common assets to protect; if you have nothing to lose, there’s no point in being risk-averse.) But then there’s an absent dichotomy that one might naively expect to find in an expression beloved of bankers: the distinction between sensible risk likely to pay off and a crazy scheme. The risk-averse will stay away from both, desiring only the steadiest and safest.

The expression comes out of the discipline of economics and was most used originally in finance, starting in the sixties and becoming commonplace by the eighties. Soon it came to be used often of politicians and lawyers. Among corporations, insurance companies attract it the most; their risk-aversity comes from a visceral understanding of actuarial tables. Yet any stodgy company merits the term. Slowly but surely over time, it has spread into other kinds of prose, with movie reviewers and even the odd sportswriter resorting to it nowadays. More kinds of writers use it to describe more kinds of people — it’s not just for stockholders any more. The point of the compound seems to be neutrality; it strives to avoid any imputation of prudence or cowardice, and largely does, as far as I can tell.

In a previous post I remarked on the curse of capitalism — if one guy works harder, everyone has to work harder — and risk-aversitude bears the seeds of a different manifestation of it. In competitive markets, each company watches the innovations of others like a hawk. When they succeed, the other competitors follow; when they fail, everyone else drops plans to do something similar. Television works this way, though maybe less so now, when there are so many networks (an obsolete word, I know). Any change — introducing a new character into a popular series, or a new show about a controversial subject — carries with it a chance that your audience will flee in terror. But if it pays off, your competitors take note and resolve to do the same damn thing, backed up by shareholders who noticed that it made big profits for the other guy. Within a season or two, everyone is sick of the no-longer new gambit, and most of the imitators have made no headway. Whereupon they lose advertisers, another risk-averse group famously shy of causing offense, taking the money and running at the first sign of any immoral or objectionable acts that might result in lost market share. (Bill O’Reilly is only the latest in a very long line of such embarrassments.) Sometimes, what looks safe turns out to be dangerous. Risk avoidance, like any other strategy, is subject to misuse born of misunderstanding or bad timing, whether by the humblest investor or the loftiest board of directors.



hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, or running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.


bring to the table

(1980’s | businese | “have to offer,” “start out with”)

What one brings to the table by definition benefits the party already there. It is a positive term, rarely used ironically, indicating qualities that will improve an existing situation or resolve a problem. In a job interview, it’s the thing that makes you desirable. Among athletes, it’s what will make the team into a winner. In diplomacy, it’s a bargaining chip that helps move the process along. Generally, it’s what you can do to help. There was a time when it might connote baggage as well as benefit; what you brought to the table was simply what you had, good or bad. But since 1980 or so, it has taken on the favorable connotation exclusively. The phrase arose in business and government; nowadays athletes also use it a lot. To my ear at least, when a phrase becomes popular among athletes, it has stepped irrevocably over the border into cliché country. I’m not exactly sure why, but I think it has to do with the fact that professional sports figures are quick to adopt new expressions from each other and use them frequently thereafter, rarely with any imagination or creativity.

You have to keep your eye on the table, because idioms that rely on that word come from different places. “Bring to the table” calls to mind negotiation: the big table everyone sits around to hammer out an agreement. “Everything on the table” almost certainly comes out of gambling — the moment of showing your hand. “Seat at the table” could come from either, or from the dining room. To get anywhere at any table, a seat is the minimum requirement. Waiters bring things to the table all the time, but that sort of pig-headed literal-mindedness doesn’t get the blog written. In all these expressions, the table by now is purely metaphorical; when an actual table is involved, we understand it to be a play on words.

There’s a certain kind of new expression that develops a settled usage even though it is not particularly distinctive and could occur in everyday conversation without any reference to the specialized meaning. That description is a little vague, so let me offer some examples: “at the end of the day,” “be careful out there,” “do the math,” “don’t even think about it,” “good luck with that,” “I’ll shut up now,” “in a good place,” “play well with others,” “smartest guy in the room,” “what’s your point?” All of these expressions have in common an ordinariness, almost a triviality, that allows us to notice, if we think about it, that they could just as well have no meaning beyond that carried by the word string itself. And yet, when we hear such phrases, we grasp an extra dimension, so that even if the sense of the expression is not much different from the literal sense of the words, we know we are hearing a distinct expression. There must be a process that allows such utterances to transmogrify into idioms, but I don’t understand it. Is there any way to predict that “I’ll shut up now” would take on a universe of connotation while “I’ll go to the store” (so far) has not?



claw back

(1980’s | bureaucratese? legalese? financese? | “recoup,” “recover”)

No longer the sole property of sportswriters, this noun-verb complex has invaded the financial pages and legal journals in force. When I was young, you clawed your way back into a contest through determination and effort, not quitting until the game was on the line and you had a chance to win. It didn’t have to be a single game; it could happen over the course of a season, as in a baseball team clawing its way back into the pennant race. It might be used in the context of an individual sport like tennis or golf, but I think it more often went with team sports. In the business world, you might claw your way to the top, but you don’t claw back your way to the top — though you might claw your way back to the top. There’s something ruthless about clawing when people do it; it requires unreasoning vigor, like a jungle cat, blindly fighting its way forward as long as it can move.

In the late seventies, the U.S. began imposing treble (i.e., threefold) damages on defendants who lost certain kinds of civil suits. The U.K. responded by passing a law of their own that gave a British person or corporation the right to recover the portion of the total damages that was not actually compensatory (in other words, the part added on over and above the actual damages). In both the British and American press, this was widely referred to as a “clawback provision.” The expression was much more common in the British, Canadian, and Australian press for at least a decade thereafter, and it is indubitably a Briticism.

My impression was that the expression refers mainly to something governments do, as in the Bernie Madoff case, but a corporation can do it, too; take Wells Fargo’s repossession of stock from disgraced executives in the wake of a banking scandal. I suppose that a business partner could claw back money that another partner had misused, but for the most part it seems to be something an institution does. Clawbacks normally occur when assets have been stolen or used illegitimately; when you hear the word, you can be pretty sure that there was some funny business that has been found out, and a governing body, private or public, is doing something about it. (That isn’t always true; for example, when the British government was privatizing public industries in the eighties, they decreed that a certain number of shares had to be available to British investors. In some cases, that meant “clawing back” shares bought by foreigners to make sure enough shares were available.) The government generally needs some kind of judicial ruling, but a corporation needs no more than the approval of the directors.

In truth, the new expression here is “clawback” (n.) since “claw back” (v.) has been a permissible construction for a long time. (As we saw above, “clawback” also serves as an adjective. I hope I am cold in my grave before “clawbackly” becomes standard English.) But its present sense seems to have arisen around the same time, and I wouldn’t want to state with certainty that one preceded the other, though I would guess the verb came first. It has never left legal and political contexts, or spread outward from them. Law and justice must have their own language.


man cave

(2000’s | advertese | “den”)

The evidence strongly suggests that man-caves are the creation of marketers, despite visible traces of the expression before the mid-aughts, which is when it starts turning up in bulk in LexisNexis. The phrasing likely owes a debt to John Gray, author of “Men are from Mars, Women are from Venus” (1992). While he did not, as far as I can tell, ever use “man cave” himself, he used the two words in close proximity, notably in the apothegms “Never go into a man’s cave or you will be burned by the dragon!” and “Much unnecessary conflict has resulted from a woman following a man into his cave.” In other words, let the old grouch suck his thumb and fiddle with his TV or his train set for a while. He’ll come out and make nice eventually. And if he doesn’t, it’ll be your fault. Gray’s biases aside, he was influential, and today’s more compact phrasing may claim him as an ancestor. Actually, the first use I found in LexisNexis is due not to Gray but to a Canadian columnist writing about house floor plans; she proposed that the basement be renamed “man cave,” because that is where men go to get away from their women. (She had in mind a damp, cobwebbed basement, not a home entertainment center. “Cave” is the French word for basement, so the use of “cave” is more intuitive in Canada than here.) Was author Joanne Lovering an early adopter or ahead of the curve? (Or ahead of the cave!)

But when “man cave” started showing up in quantity, it was purveyed by Maytag, of all corporations, which marketed a product called SkyBox, a vending machine for soda or beer that you could install right in your very own home. Fred Lowery, the director of Maytag’s “strategic initiatives group,” noted that “every guy would like to carve out his own little place in his home. Internally, we call it the man cave. And lots of guys, at some point, would like a vending machine in their man cave” (January 29, 2004). There you have it. Very soon, real estate agents began touting the things, sports promoters jumped on board, and it became a proper fad. No man cave was complete without a big-screen television and a sofa — video game consoles and sports-related items were also popular — and if not your very own vending machine, at least a dorm refrigerator, maybe even a full bar. What you won’t find is a workbench. The man’s retreat in my youth was likely to involve tools and at least the possibility of repair or construction. A few men still favor that, but these days it’s more about swilling beer while endless hours of sports unroll before your glazed eyes. Well, not really; what it’s really about is male bonding, or just having a place to get away from your woman. The corresponding “woman cave” has not made much headway, a few sightings in the press notwithstanding, but all the ladies have to do is wait; sooner or later some savvy marketer will rake in huge sums convincing women they need their own gender-specific refuges.

“Cave” is an interesting word to use here; to my mind it calls up two different associations. First, of course, the caveman: brutal and self-reliant (actually, cavemen were much less self-reliant than we are). Primitive, crude, and therefore manly, the caveman lords it over his woman and slays giant beasts. Just what we all want to be, right? The second association with “cave” is a dangerous, unpleasant place where no sensible woman would set foot to begin with. They’re dark and treacherous, lairs of wild animals, drifters, or lunatics. Of course, that’s what he wants you to think, ladies. He has a giant-screen TV in there — how dangerous can it be? Just don’t get burned.

Why has “man” become such a common prefix in compound nouns since the dawn of the new millennium? Nobody says “man about town” or “man alive!” any more, but you can’t get away from “man-hug,” “man-bun,” “man-boobs.” “Man cave” predates some of these, though “man-boobs” dates back to 2003, according to Urban Dictionary. Is it a simple matter of dumbing down, the word “male” having become too complicated for us cavemen? Is it a wistful attempt to recover a lost sense of masculinity by reverting to the simpler (and therefore more primitive) term? Is it an attempt to express solidarity? “Man-splaining” and “man-spreading” go the other way, of course, used by women in solidarity, not men.
