Lex maniac

Investigating changes in American English vocabulary over the last 50 years

even a stopped clock is right twice a day

(1980’s | businese (finance)? politese? | “who woulda thunk it?”)

I had my suspicions about this phrase. I believed it was a recent coinage masquerading as hoary proverbial wisdom — like “no pain, no gain” or “if it ain’t broke don’t fix it” — but Google Books proved me wrong, tossing up examples from as far back as 1880. More than one source traces the concept, if not the wording, back to Lewis Carroll, who set the following puzzle (the sources differ on the details): Would you rather have a clock that shows the correct time twice a day, or a clock that does so once every two years? When you opted for the one that was right twice a day, he presented you with an inoperable (twelve-hour) clock. The other clock lost or gained a minute each day, so it took 720 days before it displayed the actual hour and minute. The phrase, or something like it, has also been attributed to John Steinbeck, Franklin Roosevelt, Gloria Steinem, and a Chinese (or Russian) proverb.
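
Carroll's arithmetic checks out, and the skeptical can verify it with a quick sketch (the function below is purely illustrative, not from any of the sources): a twelve-hour dial repeats every 720 minutes, so a clock losing one minute per day must drift for 720 days before its accumulated error comes back around to zero.

```python
# Brute-force check of Carroll's puzzle: a clock that loses one minute
# per day reads correctly again only when its total error is a whole
# multiple of the 12-hour dial cycle (720 minutes).
def days_until_correct(loss_per_day=1, cycle_minutes=12 * 60):
    """Count the days before a drifting 12-hour clock is right again."""
    error, days = 0, 0
    while True:
        days += 1
        error = (error + loss_per_day) % cycle_minutes
        if error == 0:
            return days

print(days_until_correct())  # prints 720, roughly two years
```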

“Even a stopped clock is right twice a day” is often presented or cited as an old saying, and the idea, even the exact phrase, is relatively venerable, but it never rated the rank of proverbial and never competed with “don’t count your chickens before they’re hatched” or even “a stitch in time saves nine.” (Not even “a face that would stop a clock”!) I found very few uses in any context of the phrase, or its variants, before 1985. I’m not sure why it was so uncommon for so long. It is a bit long and cumbersome, I suppose, and the tenor of the adage — even the weakest or most inflexible mind will hit on a good idea occasionally — is not as fundamental as those of the proverbs turned clichés, though by now it has become a cliché itself. So what happened around 1990 to lodge this phrase in the language? I’ve noted elsewhere, only somewhat tongue-in-cheek, the emergence around that time of a national need for everybody to tell off everybody they disagreed with; maybe this expression tagged along. Or it may have been as simple as a few prominent people using it in temporal proximity. I don’t remember hearing it until I was an adult, maybe from Al Sharpton, though LexisNexis does not record Sharpton using the phrase.

It doesn’t seem in the least unlikely that Carroll’s resolutely logical mind produced the stopped-clock paradox, and philosophers and logicians have deployed his example to introduce related puzzles and arguments. But the phrase has taken a more malicious turn as weaponized in finance and politics. Around 1980, says LexisNexis, commentators gradually began using the expression to suggest obduracy, or sheer mental deficiency, in the target. The phrase came to suggest that a certain person was exceedingly stubborn or old-fashioned but nonetheless had come up with an approach or solution that the speaker happened to agree with. The implication was that it was dumb luck or a once-in-a-lifetime shot, unlikely to be repeated later that same day or ever. That is the expression at its most sophisticated. Sometimes it just means “Can you believe so-and-so said something intelligent?,” more of a brute-force model. Famed British interviewer David Frost in 1990 attributed the phrase to his father, who meant by it “everyone has something to teach us.” That is an unusually gentle interpretation. We prefer to use this dubious proverb with a side of snide.


on the spectrum

(1990’s | academese (psychology))

To the extent that it has a literal meaning, “spectrum” refers to the full range of electromagnetic radiation, which we associate primarily with the narrow band of visible wavelengths — “spectrum” comes from a Latin word meaning “look,” after all — but which also includes all those waves that allow us to communicate via broadcasting, telephones, etc., etc. (Sound waves are not electromagnetic, but electromagnetic waves allow us to transmit sound great distances.) The more figurative use, which denotes any sort of continuum, was a twentieth-century phenomenon, according to the OED. By the seventies, talk of the political spectrum was common; the word also had military uses that have continued to grow (as in the chilling “full-spectrum dominance”). So when autism, still a rare and exotic disorder, started getting a bit more attention around that time, it was already associated with the notion of a spectrum, meaning that it encompassed a range of severity. Some kids had just a few characteristics; others had it full-blown and really couldn’t function in everyday society.

“On the spectrum” as a fixed phrase used to describe people made itself felt first in the nineties in child psychology journals, and it was available in mainstream outlets, though not particularly common, by 2000. Since then, it has flourished, and it’s hard to imagine there’s anyone out there who doesn’t know what it means. It may be used as a neutral modifier, as an insult, or even as an apology for someone’s (but probably not one’s own) behavior.

“Autism,” from the Greek word for “self,” is another twentieth-century term, and it first referred to a mental disorder of adults so wrapped up in themselves that they don’t notice other people at all (“idiot” at its root means something similar). By the forties (thanks, OED!), it had come to be used of children, often in the phrase “infantile autism.” We still think of it as a kids’ problem, although there are plenty of adults on the spectrum, often still cared for, or at least paid for, by their parents. The terminology changes frequently. “PDD” — pervasive developmental disorder — was a standard term in the nineties, but you don’t hear it much today. It applied to relatively “high-functioning” autistic people, as did Asperger’s syndrome, which we heard a lot about twenty years ago but is no longer formally recognized. The baffling nature of autism makes it necessary to arrive continually at new understandings that force new vocabulary into the lexicon.

Autism was even more mysterious when newspapers started writing about it in the late seventies. An Associated Press article described it as “a state of mind characterized by disregard of external reality. Autistic youths have great difficulty in communicating.” A pediatric specialist adduced typical symptoms: “a preoccupation with toys and other inanimate objects, a lack of desire to be held or cuddled, constant crying, or no crying at all, repetitious movements such as hand shaking, rocking and spinning, and head banging, and various speech and eating problems.” The other thing experts knew about autism was that it occurred predominantly in boys, and it was extremely rare. Sound familiar? It should, because with the exception of the “rare” part, it’s how we think of autism now. If it seems more widespread, that’s partly because we are paying more attention, but no matter how you count, a lot more people seem to be on the spectrum now than then. The new expression makes it easier to acknowledge that the syndrome and the damage it does may vary widely, rather than hurling every sufferer into the same ghetto, as “autism” unadorned does.


first-world problem

(2000’s | journalese? | “silly little thing”)

“First world” and “third world” are outdated now (the second world was the Communist bloc); they were still current in my youth. The exact phrase “first-world problems” appears for the first time in LexisNexis in 1992, when it was used quite literally to draw the distinction between the kind of problems societies in the grip of late industrial capitalism have and the kind that plague places lacking money and control over their resources. Third-world nations (now “developing countries”) aspire to first-world problems, because they are less onerous and easier to solve than the kind they have at home. One also sometimes heard “real-world problem” — a practical rather than theoretical difficulty — which may have been a phonetic if not semantic ancestor.

The phrase took on an ironic tinge around 2000, and its ambit changed as it started to refer to personal inconveniences rather than broad social issues. By 2010 it was fashionable in hip circles, having emerged as a meme and social media staple. The expression is closely associated with white people; I have seen “white whine” cited as a synonym but don’t recall ever encountering it. Americans and other subscribers to the western way of wealth use it apologetically; the mixture of irony and apology confers a self-mocking tone. It’s a way of saying “I know how trivial this is, but it still bugs me and I have to say something about it.” If you use it about another person’s problem, the tone will likely be less tolerant and may carry genuine scorn, but the smack of the trivial remains. For a first-world problem is not only frivolous, it arises from a frivolous cause. It has to do with Starbucks, tanning beds, or internet access that isn’t quite fast enough, not with food, shelter, or rock-bottom standards of living (it happens despite your standard of living, not because of it). Even with a self-mocking air, the phrase conveys self-congratulation as well. The less significant the problem, the more smug and awful it sounds. It’s related to that old favorite of baseball coaches, “a good problem to have,” when you have two productive left fielders, for example — a situation that must be resolved, but all of whose likely outcomes are desirable.

In the eighties, you sometimes heard in political commentary “first-world nation with third-world problems,” usually pasted on borderline first-world countries like South Africa or Brazil. It meant that the society had a fairly advanced financial system and urban centers, but also a high percentage of truly poor and oppressed people. Since I’m in the habit of editorializing, I’ll point out that it’s not a bad description of the U.S. today. The federal government has pretty much abandoned its proper functions in favor of funneling money to the rich and propping up defense contractors, and the results show: failing infrastructure everywhere and no serious political will to repair it. Now that a couple of generations of Americans have been trained to believe unquestioningly that no good can come from government — even as they benefit in various ways from government largesse — it’s hard to see how we might pull together and solve some general problems with the general welfare. Maybe the coronavirus crisis will do it, but I’m not optimistic.


the struggle is real

(2010’s | African-American? teenagese? | “I can see you’re having a rough time,” “things are tough (all over)”)

Which is it? Snarky and ironic or sympathetic and sincere? This expression leads an unusually vigorous double life, not in terms of meaning so much as tone. To some, this phrase is a clarion call to solidarity, a heartfelt affirmation of the hard work required to overcome all sorts of obstacles. We acknowledge each other’s efforts and express our fellow feeling. You can find many examples of this sort of usage. But it is equally likely — or close — to be used jocularly about an exaggerated response to a minor problem or insignificant deficiency. Urban Dictionary is not the only source to link the expression to the phrase “first-world problems,” which has been on my list for years.

The duality, if not duplicity, of this expression is striking in its persistence and consistency. I have covered a few expressions — trophy wife, Joe Sixpack, inner child — that have a similar double tone, capable of bearing respect or sarcasm. “Inner child” is a technical term in psychology and a figure of fun everywhere else. “Trophy wife,” born as a term of praise, soon turned into a term of contempt, although neither mode has quite been able to subordinate the other permanently. “Joe Sixpack” comes with its own class divide, used with a sneer only by the elite. “The struggle is real” swings both ways more readily than any of these except possibly “Joe Sixpack,” but it lacks the clear-cut class distinction of the latter. There may be a gender gap, however. My sense is that women are more likely to use the phrase in earnest, men more likely to turn it into a joke. But I don’t have a large enough sample size to make generalizations (no, twenty pages of Google results does not constitute a definitive sample). It would be significant if there were such a gender split, but I can’t prove it.

“Struggle” conveys an ongoing effort that does not necessarily target an external enemy, which is why “the fight (or combat) is real” wouldn’t work. Such words carry too much belligerence. Struggle requires an adversary, but not feelings of hatred or revulsion. (One does not wish to make light of others’ woes, but in the spirit of levity, I can’t resist pointing out that “The Strudel Is Real” would be a good name for a Viennese pastry shop.)

One site suggests that “the struggle is real” comes out of hip-hop, which sounds plausible. How old is it? I don’t know, but it didn’t enter the vocabulary until after 2010 and didn’t go completely mainstream until the middle of the decade. Christians use it a lot, I was surprised to learn; it has become part of a new proverb: “The struggle is real and so is God.” Now that’s co-opting! But you can find it all over the place; it’s not just kids, it’s sportswriters, scientists, and meme artists of all ages. It has permeated our language in a few short years, however we choose to use it.


chief happiness officer

(2000’s | businese)

How did this expression get started? With a McDonald’s ad campaign, ca. 2004, that’s how. Ronald McDonald was anointed “Chief Happiness Officer,” a fanciful title (and understood to be obviously so) whose holder was in charge of pleasing customers by cutting capers and dispensing his brand of cheer. It was a joke, people.

The office, if not the title, apparently owes much more to Google than McDonald’s; Google was among the first to install a Chief Happiness Officer (known as the Jolly Good Fellow). Tony Hsieh, the CEO of Zappos, published a book called Delivering Happiness in 2010, and that gave the expression a big push, though it had been percolating before that.

There is considerable debate over whether there is a genuine need for such executives in the c-suite or not. But the chief happiness officer has joined the ranks of corporate nostrums, and blogs and magazines feature a steady stream of articles on the new phenomenon. The main point is that the CHO is primarily responsible for employee morale, not customer satisfaction. Happy employees are productive employees, and increased productivity brings higher profits. The chief happiness officer’s job is to oversee efforts to keep the staff content, by offering perquisites or helping people over rough patches. Most commentators regard it as a human resources position — the CHO may be head of HR — but customer relations may also play a part. Some CHO’s do monitor employee/customer interaction closely in order to head off problems. That level of surveillance bothers critics. In order to keep employees happy, you have to have a lot of information about how they do their jobs, or deal with co-workers and management. That knowledge can be abused, and it’s unwise to concentrate it in the hands of an unscrupulous executive. The CHO is supposed to be benevolent, but what if your CHO cares more about feathering his own nest than improving the lives of workers?

I have said much about sneaky ways employers have of showing concern for their employees that either increase their own sway or conceal the ways that their actions and policies make workers’ lives more difficult and less certain. On the surface, the chief happiness officer seems like a noble corrective, but it can be part of the same package. They may be as adept as any executive at figuring out ways to shift blame to employees for the misery the bosses are creating, deflecting responsibility away from the higher-ups to the lower-downs. That puts them firmly in management’s camp, regardless of their job description.

Thanks go once again to Will from Paris, wry observer of the corporate jungle, who fed me this phrase months ago. You wouldn’t want me to move too fast and write a hasty, uninformed piece, would you? These things need time to ripen.


cancel culture

(2010’s | celebritese? journalese? | “dropping,” “shunning”)

An expression so new it is premature to say anything about it, but then I thought of a little joke that harks back to the previous entry, to wit, the silencing of live performing arts and sporting events after the coronavirus outbreak has given “cancel culture” a whole new meaning. Two, actually: so many public events being called off (in which “cancel” acts as a verb), or a society (culture) built on mass elimination of public events (adjective).

The phrase is slightly odd — albeit artfully alliterative — but not hard to understand. “Culture” refers to a general way of living or doing business, that is, organizing the population, or characterizing certain distinct groups within it. That usage is familiar, even sanctioned. It is “cancel” that gives the expression its piquancy, as in “eliminate” or “cross out.” If you’ve been well and truly canceled, you don’t exist any more.

The first uses in LexisNexis date from the first half of 2018, years after the #MeToo movement originated, though the two have become inseparable in the public mind. Cancel culture involves the expunging of offenders in word or deed from the mainstream or respectable media — usually public figures, usually performers of some kind. It may go as far as arrest and conviction, but such cases are rare. Everyone condemns the offender, his gigs get flushed, and his reputation takes a sudden nosedive. If you are an ordinary person who tweets something indiscreet that attracts a lot of notice, there’s a decent chance your life will be ruined. Celebrities have more resilience — some performers have returned to work after being canceled — especially those who cultivate an audience who don’t believe in women’s rights, or even decent behavior toward women.

Much ink has been spilled over the question of whether the bringing down of important men represents the efforts of a mob of rampaging maenads or downtrodden women banding together to defeat the arrogant and powerful. I favor the latter view, though I recognize that any large-scale push to bring the guilty to justice will result in a few unfair if not erroneous verdicts. Unjust pillorying is bad policy, and so is gloating. But seeing a blatant sexual predator like Harvey Weinstein brought to heel gives us hope that some forms of cruelty and violence, even among the privileged, may become much harder to get away with.

There is a strong free-speech case against silencing people, though, and many of us remain uneasy over depriving the accused of any verbal recourse beyond abject apology. The silencing works not by denying them the power to speak, but by refusing to listen. Part of the reason the judgments against Cosby and Weinstein are so satisfying is that they were handed down by a court obliged to show some deference to the rights of the defendant. The courts accumulate and weigh evidence better than the rest of us can. When we push judicial (and judicious) restraints aside, we risk turning into an unreasoning, ruthless mob.


It’s in the air . . . It is still early in our life with coronavirus, but a few emerging language notes may be in order as we’re confronted with all manner of new and repurposed expressions. Prophetically enough, I’ve already covered “abundance of caution,” “lockdown,” “shelter in place,” and “closure” (not to mention “cocooning,” a word that has not re-emerged). But the plague of new expressions rages unabated.

First off, it is “coronavirus” (one word) far more often than a two-word formation. The accompaniment “novel,” seen often a few weeks ago, already appears much less frequently in the layman’s press, because only one coronavirus matters at this point, and it doesn’t need any help. (The definite article is still common, but not invariable.) COVID-19 is the name of the illness; the virus that causes it is officially SARS-CoV-2, though many people use the two names interchangeably.


Why is it a pandemic and not an epidemic? An epidemic is an outbreak concentrated in one community or region; it rates as a pandemic when it spreads across countries and continents. It’s not a measure of relative destructiveness; either may be devastating. Somehow “pandemic” sounds scarier, at least to me, perhaps because of the echo of the shepherd god who gave us “panic.” Or an echo of Pandora’s Box? Pandemonium?

social distancing

Is “distance” a verb? (Yes, though normally a reflexive one, as in “to distance oneself,” and the reflexive is implicit in this expression.) I can’t quite get my mind around “social distancing,” which means “keeping your distance” — the term has been around at least fifteen years — partly because the second word calls up a physical dimension while the realm of the first is more abstract. I hear an ambiguity in this expression. Is “social” the governing concept, so that we still want to share each other’s space, but in order to do so we must stay a bit farther apart? Or is “distancing” the dominant idea, so that it’s more like expressing disdain, or snubbing the other person (who might, after all, be carrying a deadly virus)? Proponents of the practice will claim the former, but professional advice intended to combat spreading the virus has a distinctly anti-social character.

flatten the curve

“Flatten the curve” is our new rallying cry, and like all rallying cries nowadays it is a hashtag. You are to envision a graph plotting the number of cases of COVID-19 against elapsed time. The steeper the curve, the faster the infection is spreading. The goal is to slow the spread and thereby flatten the curve. Expect to see and hear it often in the next few weeks.

community spread

“Community spread” is when doctors can’t figure out how a new patient got the virus, because there’s no direct connection to anyone known to have brought it from somewhere else. As the first cases appeared on the west coast, they were traceable directly to the source; the first clear-cut case of community spread was ominous, marking a new stage of contending with the disease.

distance learning

Why do we keep hearing “distance learning” now instead of “on-line classes”? Another new expression we don’t need, foisted on us anyway. It could be “remote learning,” too — why not telelearning? — but I’m not hearing that nearly as much. Will this be the next great bureaucratic contribution to American English?

telework

We already had so many words for this, but I swear I’ve seen “telework” — new to me — several times recently to mean “work from home” or “telecommute.” Sometimes new expressions appear to be the result of a malign, well-financed campaign to insinuate pointless novelties into everyday language. Like most conspiracy theories, that’s a bit too elaborate for the real world; it’s more likely that the word dropped carelessly from someone’s lips and has wormed its way into the word-hoard through the linguistic equivalent of capillary action. It remains to be seen whether this word will continue to blossom, or wither in the shade of its synonyms.


sweet spot

(1980’s | athletese | “happy medium,” “splitting the difference,” “best combination”)

. . . rhymes with “moonshot,” last week’s entry. I haven’t done that before. Got close once, with “comfort zone” and “standalone” falling but two weeks apart. And a couple of off-rhymes.

Down to business. The phrase dates back at least to the thirties in athletese, with reference to pieces of equipment designed to hit things — golf club, tennis racket, baseball bat. (The first reference I encountered involved a putter.) It was the area from which the ball took off with the greatest force for the least effort from the player. All-time baseball great Willie Mays described the sensation thus: “I know when I hit it there, because when I don’t it shakes up my whole body. When I do, it feels like I don’t even have a bat in my hands” (Life magazine, 1964). As of 1980, the term was still used mostly in that way, but somewhere around there it began to acquire what we might call a less literal meaning. For one thing, it could be used to name an actual place, such as a superior location to prospect for oil or station a satellite. That alley turned blind somewhere in the intervening thirty years, however, for you rarely hear the term so used now.

Now it has a few different meanings. Most simply, it refers to an optimum range, setting, or quantity in almost any type of process. So you could recommend a burner setting by saying, “The sweet spot for cooking oatmeal is medium.” While “sweet spot” is used very occasionally to mean “erogenous zone,” persons do not normally have one (so that it differs from a related word, “wheelhouse”). Most often, the sweet spot is an incorporeal and somewhat mystical yet altogether real place where two categories, often inversely or at least not directly related, intersect for the greatest advantage. So you might talk about the point where maintaining employee morale and high standards hit a sweet spot to generate maximum productivity, or where sales and expenses meet to create the easiest profits. In this sense, it resembles what we used to call the right balance — the most effective combination of two or more attributes for achieving a given goal.

The modern usage of “sweet spot” unquestionably goes back to athletese. My question is why call it that in the first place? I posited that it should be called the “heart,” or maybe “core” of the club, racket, or bat. Not that “sweet spot” sounds completely unnatural or counterintuitive, but it doesn’t seem to get to the heart of the matter (sorry). The clue lies in the word “sweet” as an interjection — think of a frat boy responding to a friend’s good news — where it means utterly satisfying or gratifying. Willie Mays’s definition upholds this interpretation; it’s sweet because it’s about how you feel when you hit the ball there, not a property of the bat.

I wonder if the term “G-spot,” much discussed in the eighties, was influenced by “sweet spot.” No one ever used “sweet spot” to mean “G-spot,” as far as I ever heard. No reason it couldn’t happen that I can see, but it never did. Another road not taken.


moonshot

(1990’s | journalese? | “herculean or heroic efforts”)

Spelled as two words or one, but not hyphenated. The term was used mostly literally in the seventies to refer to the Apollo mission that landed on the moon and discharged passengers. But it was already a byword denoting a marshaling of exertion and resources in pursuit of a stupendous goal, preferably one that appears unachievable. Historian David McCullough called the Panama Canal “the moon shot of its day,” underscoring the difficulty of the project and the will of those who saw it completed. Moonshots demand a certain derring-do in addition to technical prowess and plenty of perspiration. Joe Biden gave the expression a lift several years ago when he supported a “cancer moonshot,” an acceleration of funding, research, and testing. That phrase would have sounded very strange back in the seventies. There was only one moonshot back then, and you didn’t append another term to it, even if occasionally some other endeavor merited comparison — in addition to the Panama Canal, the Manhattan Project was cited often. But that sort of syntactic shift happens continuously; you never step in the same language twice.

The word does have other meanings: in baseball, it names a long fly ball that goes particularly high. I don’t know that it’s mandatory, but it usually turns up in discussions of home runs. In finance, it names the phenomenon of a stock price continuing to rise after an initial public offering, soaring into the stratosphere. I think “moonshot” should be the name of a drink, but as far as I know it isn’t. (Any ambitious mixologists out there?) It may also take spot duty as a word for a photograph of the celestial body.

It never occurred to me until I started working on the post that “moonshot” might be connected with “shot” as in “give it a shot OR your best shot.” On the surface, “shot” is used in two different ways: “moonshot” summons up a cannon hurling a projectile, while “best shot” means “best effort,” though it seems likely that in this sense “shot” ultimately comes from ballistic jargon. Does it recall the maneuver in hearts, shooting the moon, that is, winning by accumulating an entire suit? That’s a venerable term, and one can’t rule it out; I don’t see a direct link. But there is a definite relation between “moonshot” and “best shot,” in that both require putting forth one’s utmost. There ought to be a connection, even if I can’t spot it.

You don’t hear “If they can put a man on the moon, why can’t they . . .?” any more; it was commonplace in my youth as a means of lamenting any sort of government ineptitude. How could the organization that put men on the moon fail to pick up the garbage reliably? The moonshot continues to be recognized as an epitome of achievement, one that demanded extraordinary mobilization that by definition cannot be undertaken for every tedious municipal task. But it’s tempting to complain whenever officials and employees fail to go above and beyond. When you do something brilliantly, people expect it every time. If you always phone it in, no one hassles you, because it’s not worth it. It’s not always easy to find the sweet spot between setting the bar too high and performing so badly that you get fired.


let it go

(1980’s | “forget it,” “give up on it,” “ease off”)

An expression more versatile than I had reckoned. I thought of it as meaning primarily stop dwelling on (or get over) something, and I found traces of such use in the late seventies. It may also mean turn a blind eye (the cop saw him shoplifting but let it go), which may have been available back then, but to tell the truth I don’t remember. Related, it may signify “stop pursuing,” as in a line of questioning, or “stop acting as if it is true,” as in a belief. If you’re talking about a grudge, it means something like forgive and forget. As in the memorable song from “Frozen,” it may mean turn your back on the past and make a new start. Which contrasts oddly with the apparent passivity of letting go of something; yes, you’re relinquishing your old self, but you’re also actively promoting yourself as a whole new person. Despite the ubiquity of the song, I don’t think you hear the expression used exactly that way very often in normal conversation; that definition remains an outlier. “Let it go” does not mean “unleash it” or “let ‘er rip,” although one might say “let it fly.”

That deceptive passivity gives the phrase its ambiguity; is letting it go an action, or the cessation (or suppression) of an action? Is one required to effect the other? It must be a conscious decision, an act in itself, but it may require a resolve to refrain from certain thoughts or deeds. Letting it go often requires persuasion, implying that it’s hard to do on one’s own; we need assurance from others that we’ll be better off if we abandon that festering grievance or disappointment. It is taking a load off your mind, ridding yourself of deleterious baggage, allowing old wounds to heal.

The ancestors of this expression, I take it, were “let yourself go,” which we may trace at least as far back as Cole Porter, and the simpler “let go,” which I remember from youth, or even its elaboration, “let go and let God,” a Christian injunction meaning surrender to God and let him take over. In other words, let your unconscious dictate your actions, or more simply, follow your pastor’s advice. Of course, “let it go” always had a literal sense, as in what one kid says to the other who has hold of a precious object. It also finds an echo in the primal “let me go” from childhood. Then there’s “let it go at that,” which meant “enough said” or “I’ll shut up now.” I doubt there’s any immediate connection, but we do find a number of new expressions that are simply abridged versions of existing ones. As “let it go” has evolved, it has taken on yet another meaning not quite like any that came before, “get on with your life.” These things happen even in the best languages.
