
Lex maniac

Investigating changes in American English vocabulary over the last 40 years



strong

(2000’s | advertese | “resilient”)

“Strong” as a hashtag suffix has taken firmer hold with every disaster, natural or otherwise, but its progenitors predate Twitter. Even in my youth, “be strong” in the imperative could be comfortably used to mean “stay the course” or “hang tough.” “Stay strong” had the same feel. While “stay strong” most often referred to markets, arsenals, or muscles, by 1980 or so it could be used for emotional or mental states. The gradual transition from brute force to mental toughness had begun. It culminated in the Lance Armstrong Foundation’s phrase “Livestrong,” first used, as near as I can tell, in 2004. The Foundation is dedicated to people living “with, through, or beyond cancer,” and the Livestrong campaign originated as not only a slogan but a fundraising device: those ubiquitous yellow wristbands made for lots of donations. Armstrong has since been disgraced, and the renamed Livestrong Foundation lives on without him, but he remains the grandfather of this hashtag suffix. I’m not certain, but “Livestrong” seems to have been the first example of a compound word made from an active verb plus “strong,” which is now standard. Before that, we had “headstrong” (stubborn), but I can’t think of any other combinations of noun or verb with “strong,” which doesn’t mean there weren’t any. Armstrong’s own stirring history as a cancer survivor and seeming invincibility in bicycle races (later revealed to be at least partly the result of illegal drug use) made him a natural exemplar of resolution and perseverance; the whole idea had more to do with that than with fitness, or weapons stockpiles.

Today, inevitably, “LiveStrong” is a hashtag, along with others — #StayStrong, #HeartStrong, #PlayStrong — and is now appended to the name of every city where something awful happens. #BostonStrong, #HoustonStrong, #LasVegasStrong, now #LAStrong in the face of the latest round of wildfires. (There are a number of “-strength” hashtags as well.) Given our propensities for brutal weather and mass assassinations, it should have a rosy future. I don’t know if victims of sexual harassment and abuse have developed a “#___Strong” hashtag yet, but it would come as no surprise. “-strong” tends to go with terrors one has survived, like a bomb attack, or cancer or rape. That force is still there, though with wider use it is slipping. The expression has not become trivial, but there’s always that risk.

The new formulation commands us to maintain fortitude, with an implied lack of sympathy toward those who merely surrender to the latest misfortune. That’s a way to read it, perhaps unfair. Another unflattering interpretation is that “Livestrong” and its descendants are a substitute for bravery, not an expression of it, a hip way of whistling past the graveyard. If we repeat the mantra often enough, we can fool ourselves into thinking we really do have control over what comes next. Or maybe it is simply a mantra, a displaced form of meditation that allows us to overcome adversity. The “#___strong” mottoes always strike me as fake somehow, ersatz toughness ginned up more for the sake of appearances than for any real thought of overcoming. (Now that I mention it, don’t all hashtags have that quality?) But I’m also aware that it may just be me. I’ve learned not to question too closely the motives and means of those going through hard times, and to give the benefit of the doubt to those in pain or misery.




laser focus

(1990’s | journalese? athletese? | “single-minded,” “intent (on)”)

The output of a laser meets a casual definition of “focused”: a light beam formed from many waves, all of the same wavelength, projected through a very narrow opening. There are those who believe that the uniformity of the light waves means that it is incorrect to describe a laser as “focused,” because focusing happens only with light of many different wavelengths, but it’s also true that there are such things as focused lasers. Besides, it’s the uniformity that gives the impression of focus, optics notwithstanding. So it’s not surprising that we took to talk of “laser focus.” I can’t think of any precise noun equivalents from before 1980, except perhaps for “undivided attention,” but we had several closely related concepts, such as “bearing down,” “bound and determined,” “powers of concentration.” It suggests not only purpose but precision, not only concentrating effectively but concentrating on the right thing. “Laser focus” has also done spot duty as a verb for twenty years at least, though it is not used in the imperative, as “focus” by itself is.

The expression seems to have arisen in sportswriting, if you believe LexisNexis (in this case, I’m not sure I do); the first unmistakable instances popped up in articles about boxers in the late eighties (the laser industry trade magazine “Laser Focus” had been around for several years by then). As with “wonk,” Bill Clinton did not invent the expression but helped solidify it in the early nineties when he promised a “laser focus” on the economy. For all that, it does not seem to have become rife until after the turn of the millennium; I don’t recall hearing it until probably after 2010, though it might have crossed my path earlier.

The advent of the CD player, which was for most of us the first practical, everyday use of a laser, helped make this term possible. Lasers were exotic then (they’re still kind of exotic), but there one was in your own home, bringing your favorite tunes to life. There was a vague understanding in the air that a laser was the magical part of the new piece of equipment, much spookier and more advanced than a diamond stylus or magnetic tape. So lasers were ushered into the general consciousness, opening up room for a new figurative expression. A mere thirty years later, “laser-focused” was declared business jargon by Bloomberg News, and it is clearly a term businessmen have picked up, more than politicians, though it is available to anyone now.

We generally hear the term as praise, but calling someone “laser-focused” may just be a nice way of saying they are wearing blinders; that is, it may imply the wrong kind of workaholism or micromanagement. It’s one thing to pour your efforts into reaching a commendable goal, but obsession has its own risks even in the service of a noble cause. I would say the term generally continues to have a positive connotation, but it certainly can suggest something else: an unhealthy involvement in a single pursuit that leads to exclusion or isolation. We don’t hear that when a corporate spokesman boasts of a laser focus on customer service, but when an individual exercises laser focus, we may wonder.


in it to win it

(1980’s | advertese | “playing for keeps,” “playing to win,” “in the game”)

My limited investigations suggest that this expression was born of the fizzing brains of the New York State lottery’s advertising agents (specifically Lerner King Associates), but LexisNexis does show a surprisingly large number of Australian instances in the eighties. Did a forward-looking Aussie get wind of it early and cause it to catch on quicker over there? Or did it originate independently down under? There were lotteries in Australia then, but I haven’t found any connection with local publicity. Two New York lottery commercials, on the other hand, one from 1982 and the next from 1983, suggest an origin story. In the first, a group of Lotto players sings a catchy jingle that includes “to win it, you gotta get in it.” In 1983, the expression appears in its present form, spoken, to close another Lotto commercial. In Maryland where I grew up, the state lottery adopted “You gotta play to win,” which meant the same thing: in order to have any chance at all of taking a prize, you have to participate. So get out there and buy a ticket.

And that’s what the phrase meant then. Now it is more likely to mean “determined to win,” which still acknowledges the need to get in the game in the first place but conveys something much stronger than a tiny, notional chance of winning. (There’s an intermediate stage, which connotes being not just good enough to have reached the finals but to have a genuine chance of defeating the other team — “in the hunt,” as it were.) The newer meaning may also be signaled by adding “only” at the beginning of the phrase. It’s the difference between the subjunctive and the imperative, between recognizing what you have to do and actually doing it. The change was well underway in the nineties; athletes, politicians, entrepreneurs, and others who live by competition all used the expression in its “bound for victory” sense, while the older sense of merely being eligible to win was still in play. And that is still the case today, though my sense is that the latter meaning has become less common. In 1999, John McCain used a variant à propos our intervention in the Balkans that was frequently quoted: “we’re in it, and we have to win it.” A BBC television show about the National Lottery that debuted in 2002 borrowed the expression for its title.

The phrase has not become a cliché, exactly, but perhaps a catchphrase, a fitting fate for an advertising slogan. The little feminine rhyme gets your attention, and the scansion relies heavily on stressed syllables — four out of five, to be exact. “In” and “it” don’t usually bear a lot of weight in poetry or everyday speech, yet here they do, which may be one factor in the relative success of the phrase. True connoisseurs will catch the resemblance to the opening of a double dactyl, only one syllable shy.

“In it to win it” has taken on a distinct hortatory character since the eighties, and now it is often used to whip up the troops involved in group efforts. No question the expression has more intensity than it used to — it’s not about taking a light-hearted flyer any more. When used in the past tense, it’s usually triumphal; it would be odd to hear “we were in it to win it, and then we lost.” Most of the time, all the competitors are playing to win (note the difference from “play to win” as cited in the first paragraph), but only the ultimate victor is permitted to say so.


know the drill

(1980’s | journalese? | “know the routine,” “know how it’s done,” “know one’s way around,” “heard it before”)

I wish I had more confidence that this expression falls within my chronological limits. There are very few examples of it in LexisNexis or Google Books before 1980, but it’s never treated as a new expression — i.e., glossed or remarked on — which ought to mean it was already out there. That would make more sense if the phrase is, as I suspect, a Briticism, because “drill” in British English is much more comfortable referring to martial exercises, or any sort of procedure, repeated in the same way time after time. So “know the drill” isn’t as remarkable in British English as in American. But Americans seized on it quickly enough as the eighties wore on, and by 1990, the phrase was common in all sorts of writing. But I can’t shake the feeling that it was already common in the seventies (and before?), and my sources in this instance aren’t very reliable.

Originally the phrase hewed close to sports or politics, where we strive to make procedures and consequences predictable. But it has spread quickly, both in terms of sheer frequency of use and of breadth of connotation. Nowadays, “you know the drill” often means little more than “you know how it is” — a vague and general feeling that doesn’t have to be defined — far from a specific series of steps or exercises that must be followed in the same order every time. The drift is noteworthy because this expression had some potential to retain its integrity, but it’s not easy standing up to the sloppiness brought on by everyday use. Especially when the expression sounds cool, hip, or new, as I contend “know the drill” did in the eighties and nineties.

One knows the drill only if one has been through the experience in question. Usually it wasn’t very pleasant, and usually you’ve been through it more than once. A number of new expressions wear the mantle not just of experience, but bitter experience. “Lesson learned” is like that. “New normal” is always bad news. “Been there, done that” rarely bears a positive tinge. “Blowback.” “Do the math.” “Harvest.” “Optics.” There’s a whole family of otherwise unrelated expressions that nearly always leave a sour taste, although there’s nothing in the bare words that makes it so. In this case, the quality may arise from a distant echo of the dentist’s drill, or the old British military adage, “No names, no pack drill,” which translates loosely as “if they can’t figure out who did it, they won’t punish any of us.”

I assign the origin of the expression somewhat doubtfully to journalese, but except in a few cases, journalists act as conduits rather than as originators. It’s really just an acknowledgment that most new expressions are spread by members of the press when you get right down to it; maybe you got the latest locution from your best friend, but she or he probably found it on-line. If you don’t really know how the expression came to be, you can always blame the press. “Know the drill” ought to have a military origin, or athletic, right? And it probably does (most early uses that I found had to do with one or the other), but I haven’t found much satisfying evidence, so journalese it is.

Update, Nov. 25, 2017: An e-mail from Target, of all things, recalled to me another phrase with “drill” in it, “(This is) not a drill.” In other words, the unfolding emergency (Black Friday sales, in this case) is not to be taken lightly. Related to “you know the drill” but not closely, it has a different portent (more ominous, less weary).


real MVP

(2000’s | athletese | “unsung hero”)

The Most Valuable Player Award was invented in 1931 by the Baseball Writers Association of America (majestically abbreviated BBWAA). I have not been able to determine when the abbreviation “MVP” slipped out of its baseball backwater and into the mainstream of the language. I recall hearing it as a boy and knowing what it meant (I was a baseball fan) and assuming that the adults around me understood it, too. That’s a gap of forty-plus years; how many of those years went by before most people grasped the expression?

Although the formal name of the award might lead you to believe otherwise, baseball’s MVP is loosely considered synonymous with the best player (often on the pennant winner), or at least the one with the gaudiest statistics. There are many ways a player can be valuable to the team that don’t attract much attention, and such traits almost never get considered when it’s time to vote for the MVP. The expression “real MVP” takes that a step further by acknowledging directly someone heretofore unrecognized. Arguably, the real MVP is the person who should have been the MVP all along, but wasn’t because most of us fall for the cheap and flashy. Thus, “real MVP” implies that the nominal winner did not deserve the award. The more valuable player either was truly better in some way, or provided essential support.

A fine example of the latter came from Kevin Durant, in an acceptance speech that has done more than anything else to push “real MVP” into non-athletic contexts. While the word was available for such uses before Durant came along, it had remained primarily an athlete’s term for decades, mostly used to refer to another player, but possibly to a coach, the fans, or even a handicapped kid that inspired the team. Durant cited his mother as “the real MVP” upon being presented with the NBA’s Most Valuable Player Award because without her devotion, work, and self-denial, he never would have reached the pinnacle. The fact that the internet soon swallowed the phrase and vomited it back as a series of memes each more trivial than the last in no way diminishes Durant’s sincerity or character, or his powers of propulsion; the phrase has become much more common since he gave a shout-out to his mother in 2014.

My sense is that “real MVP” was little used outside sports talk before 2000, and probably for a while after it, too. It spread quietly during the first decade of the millennium, but it came more naturally to refer to a designated driver, or your sainted mother, or anyone who gets you out of a jam that way in 2010 than in 2000. In everyday speech, “real MVP” need not imply injustice or misunderstanding. It names anyone who performs a valuable service for one person or a number of people. The sense that the real MVP is laudable, even essential, remains, but not the notion that a less deserving person gets all the publicity.

For the sake of completeness I note that MVP also stands for “minimum viable product,” a barebones version of whatever your big idea is that allows you to test its feasibility or popularity. Such use of the abbreviation seems unlikely to overtake the established phrase within the next millennium, but if it does, I’ll take credit.


date night

(1990’s | therapese)

One of those expressions that has evolved a distinct new shade of meaning in the last forty years. Before 1990 or so, it was a loose, carefree expression that applied mainly to single people. As often as not, it was a shorthand way of referring to Friday or Saturday evening. Movie theaters and sports teams held promotional date nights to encourage under-25’s to come out and spend money. These uses have not disappeared by any means, but what has changed? Now date nights are the province primarily of the married, more specifically the married who sense that their relationship needs a boost. So you and your spouse ditch the kids and go out on the town and spend money. The custom has a bit more range now; you might go to a class together on date night, or church. The main consequence of the change? Date nights aren’t only for the young and frivolous any more. Now mature adults with responsibilities and work ethics are enjoined to enjoy them, too. The shift started in the nineties — I found only a few isolated instances before then.

Date nights are urged particularly on parents, but sources of stress and separation besides kids may trigger a date-night deficit. (The “daddy/daughter date night” is an occasional non-marital variation, which likewise marks an effort to improve or deepen a relationship.) Couples need to reconnect and rekindle sometimes, and many well-meaning busybodies have issued extensive guidelines for doing so. I have hinted before at the meticulously planned architecture of relationships patiently builded by swarms of counselors, therapists, journalists, et al., et al., from coffee date to date night, or as you might say, from dates to nuts. They even advise periodic spontaneity, but if you have to plan it . . . oh, never mind. It’s not the decline in spontaneity that bothers me (most people aren’t that good at it, anyway) so much as the depressing uniformity of it all. An endless stream of like-minded relationship advice, however well-meant, must dull our romantic powers. Even if it works most of the time, sometimes ya gotta throw away the playbook.

After a brief and unscientific survey of LexisNexis results over the past month or so, I’d say that while date nights are urged upon all of us by the romance industry, the date nights of celebrities are reported endlessly, creating the impression that no one else ever takes one. Why not turn that around? Report on local couples’ nights out as if they were celebrities — what she wore, where they went, how close they danced, which base they got to, that sort of thing. I wonder how many people would enjoy that, and how many would hate it. We feel for celebrities who have to fend off paparazzi, and some of us would be all the more fervent if we had to go through it ourselves. But I’ll bet a lot of people would get a kick out of such oppressive attention. After all, it would mean you are worthy, it would mean you’re as important as . . . whoever you favor. The gossip page brought to life — from vicarious to visceral.


lesson learned

(2000’s? | bureaucratese? | “I’ve learned my lesson,” “I’ll do better next time,” “I get the point”)

Now available as a pronouncement. Used to punctuate a conversation, it seems to come out of bureaucracy, especially the technological or military variety. NASA and the U.S. Army both have “Lessons Learned” databases that record and disseminate even quite small and apparently insignificant, but reliable, bits of practice gleaned mostly from previous failures. A lesson learned is something you ignore at your peril. They are empirical, and thus may soon become best practices. They could have to do with anything from peeling potatoes to preventing malfunctions in electrical circuitry to choosing material that will withstand re-entry into the earth’s atmosphere. In everyday journalism, lessons learned follow from disasters, such as a big hurricane, fast-moving computer virus, or financial crash. The phrase is often used by individuals, of course; even then, it has a peremptory tone, carrying a firm note of finality with more than an overtone of “never again.” The emphatic final syllable contributes to that, as in “promise kept,” “problem solved,” or even “slam dunk” (a spondee). Ending the utterance with extra oomph has a way of stopping the conversation. I haven’t heard “lesson learned” used jokingly much; it has retained its force and magnitude so far. That can change quickly. If some comedian picks it up as a tagline, we’ll start saying it in all sorts of trivial contexts.

The phrase “lesson learned” is intended to convey rue or determination. The actual lesson you learn is what we now call the takeaway, another new expression. “Takeaway” is not as portentous as “lesson learned,” but the two are closely related, with little daylight between them. Lessons learned are painful somehow, as the new normal is worse than what came before, even though there’s nothing in the wording of either phrase that requires that it be so. Here’s a little rhyme to help you remember:

Experience is a teacher,
But here’s what makes me burn.
It’s always teaching me the things
I do not care to learn.

As one supplicant asked on Stack Exchange, why not “learned lesson”? Partly because it invites confusion with “learned” (two syllables), which is used before the noun, but you see that fine old scholarly term less and less. There’s something about fixed word pairs where the adjective follows the noun. I remember how weird it sounded when Bill Clinton used the expression “date certain.” What is this, the Middle Ages? (He was actually speaking legalese at the time, which accounts for the medieval flavor.) “Siege Perilous,” “paradise lost,” “penny saved.” (Does “code red” fit the pattern? I can’t decide.) The past participle doing duty as an adjective adds a dash of verb flavor, a hint of resolute action. More generally, the noun-adjective construction is probably a remnant of the baneful French influence on English (particularly in matters of law), but it does lend an elusive, poetic quality, striking the ear and compelling attention.


gap year

(2000’s | journalese | “year off,” “Wanderjahr,” “sabbatical”)

A no-doubt-about-it Briticism. “Gap year” turned up in English newspapers in the 1980’s and was very common by 2000. It did not start to show up in U.S. publications until after the turn of the millennium, suggests LexisNexis. As late as 2005, most occurrences in the American press labeled (not “labelled,” thank you very much) both the term and the concept as British. By 2017 it had become standard American usage with occasional variants, such as the “bridge year” offered by Princeton University. I got used to seeing the phrase a couple of years ago when Malia Obama was getting ready to graduate from high school, and there was much speculation about whether she would take a gap year (she did).

In American English, a gap is almost always a divide or deficit that should be bridged, filled in, or made up. But this term seeks to make something commendable out of it (although one suspects “bridge year” and other substitutes have sprung from lingering negative associations of “gap”). In its own narrow domain, it seems to have succeeded; many educators agree that gap years are valuable, at least for some students, and should be held free of any taint of laziness or irresponsibility that crusty old academics — or parents — might be inclined to attach to them.

It is said occasionally that working adults take a gap year — here again, the Brits seem to lead the way to new frontiers in usage — but gap years remain predominantly the prerogative of the recently graduated. (Although it’s no longer exceptional for students to take them during their college careers rather than at the beginning or end.) It would sound odd to refer to a professor’s sabbatical as a gap year, for example. But when an executive takes a trip around the world in mid-career, it might rate the term. The old expressions were more poetic: “kicking up your heels,” “a wild hair,” even “whim.” It’s all so methodical now; the gap year has spawned programs, counselors, fellowships, etc., and the restless spirit is gone. It’s just one more carefully planned part of your education, designed as much to intrigue admissions committees as to enrich the gapper.

The gap year generally entails good works, or at least a paying job. At one extreme, you have globetrotting do-gooders calling at every port to tend the poor and sick, care for maltreated animals, teach yoga to children in war zones, etc., etc. This sort of account is very easy to parody, but I suppose we should resist the urge and acknowledge the value of young people working for the betterment of others, even when they get a little self-righteous. Most gap years are more prosaic. Yet the term seems bound up with an improving use of one’s time, so spending a year goofing off on the beach shouldn’t be called a gap year. It could be, but it wouldn’t be (I nominate “goof year”). Just as a middle-aged wage slave taking a year to go back to school and finish a degree probably wouldn’t call it a gap year, though it’s theoretically possible. One takes a gap year to get away from school, not to return to it. But I wouldn’t be surprised if the term gets quite a bit looser. It has grown popular in its brief life in the U.S., and its denotations and connotations are likely to spread out.



biome

(1990’s | academese (science)? | “ecosystem,” “zone,” “range”)

A biome is part of what we used to call the biosphere before that became a brand name right around 1990. The defining characteristic of a biome is its biota (living things in aggregate) more than climate or topography. If you can demarcate part of the planet with reference to its plants, animals, and micro-organisms, you call it a biome. Although the word properly denotes a region in the natural world, usually large but limited, it was first used commonly to refer to miniature, artificial environments, as in zoos or Biosphere 2, which was an attempt to create a self-sustaining colony that had some success, depending on whom, when, and where you asked. “Biome” was still primarily a technical term then, but available to the curious or well-read. In today’s language, the term is somewhat more likely to be used to talk about real natural environments (i.e., outdoor and independent of human-made boundaries). The word’s initial bias toward the fake probably resulted from the fact that when Biosphere 2 got going, we needed a word to refer to the distinct regions contained within it — ocean, desert, arable land, etc. — that wouldn’t be readily confused with the words we already had for the real things. Since “biome” was rarely used at the time (mainly in articles about zoos), it came in handy.

Within the last twenty years, “microbiome” and “gut biome” have become popular, building on the miniaturization associated with the parent term and pushing it further. Not that a microbiome is impossibly small, although it is very tiny by traditional biome standards, but that it is populated entirely by micro-organisms. We have them all over the place — on our skin, in various organs (not just the intestines), the bloodstream — and other animals have them, too. One supposes that micro-organisms have their own biomes composed of nano-organisms. But it’s organisms all the way down.

So far I have been trafficking in popular definitions, and I really should be more precise. The human microbiome, according to an NIH paper in 2007, means “collective genomes of all microorganisms present in or on the human body.” That’s much more satisfying phonologically, and tends to confirm a sneaking suspicion that without genomes, there would be no biomes. Although Webster’s Third defines “-ome” as “abstract entity, group, or mass,” in these words it seems to indicate totality.

Ecology has come a long way, baby. Even fifty years ago, we still saw the world in three zones: torrid, temperate, and frigid. (All good words, it’s true.) There were general terms for various ecological regions — savanna, rain forest, wetlands — but our understanding back then of what those terms meant seems so primitive and unnuanced. The rise of microbiology has changed everything. We break the planet into smaller and smaller pieces partly because we keep finding smaller and smaller constituent parts, which perforce alters the way we see and understand the large-scale creatures. First one biosphere became many, and those mini-biospheres begat biomes, which begat microbiomes. Those little suckers may be small, but they’re not too small to study.

Any Hank Williams appreciators out there? Sing along: “Son of a gun, we’ll have big fun on the biome!” Farther west, it’s biome on the range. But stay away from the atomic biome tests.



ka-ching

(1990’s | advertese? | “Ring it up!,” “Rake it in!,” “Score!”)

An onomatopoetic rendering of an old-time cash register — none of this infernal beeping we hear all the time nowadays, just a cheery little bell that rang whenever the cashier opened the drawer. (The sound was also attributed sometimes to slot machines, taxi meters, and pinball machines.) Like many onomatopoeias, it doesn’t feel quite like a real word somehow, but language comes from many places and need not trace its lineage back to the Mayflower, nor yet the Indo-Europeans. There is also a one-syllable form; how well I recall the introduction to Beard and Kenney’s Tolkien parody, Bored of the Rings, one of many unfortunate childhood influences on my sense of humor, in which the authors freely admitted they were in it for the money and recorded hoped-for sales of the book with a gleeful “ching!” No prefix, which is a more accurate translation of the cash-register bell — although maybe the two-syllable version is descended from an old-time adding machine. My guess is that the opening syllable is an elaboration born of exuberance. “Cha-ching!” is a common variant. It is used most commonly, by far, as an interjection, but it may see spot duty as a noun or verb.

It’s not invariable, but typically “ka-ching” carries a strong suggestion of unseemly greed. One common way to use the expression is as the response of a litany, in which each item of a list of features, services, or transactions is met with “ka-ching!” In such cases there is usually an implication that the items enumerated have the primary purpose of mulcting, rather than helping, the customer. It is a tricky expression to translate into regular ol’ words, which is no doubt why it has been successful. There is no quicker, breezier way to say “someone’s making money off of this” in speech or gesture. (Does the old gesture of rubbing the tip of the thumb against the fingertips mean anything any more? That meant “give me money,” or more poetically, “cross my palm with silver” and implied bribery.) The expression may also have a disagreeably exultant tone when one is enjoying one’s own financial success. It’s not always accusing, but I think that’s predominant.

I found an example or two before 1991 in LexisNexis, but that seems to be the year “ka-ching” took wing through the good offices of young actor Seth Green in a commercial for a fast food chain called Rally’s. After every item the customer ordered, Green (as the cashier) ejaculated “cha-ching” or some variant (one of which was “ba-da-bing,” by the way, Sopranos fans). The commercial did not have national circulation, but it did become a hit with New Orleans Saints fans, who adopted “cha-ching” as a celebratory expression. Maybe that’s where Mike Myers picked it up, as it completed its journey to the center of the mainstream by way of Wayne’s World (1992), which also popularized such immortals as “Not!” as an interjection, “hurl” (vomit), and “we’re not worthy.” “Ka-ching” existed before the movie; a 1992 New York Times article cites it as an example of a distinctive phrase from the film. But it doesn’t seem to have been as iconic (or moronic) as “babe” and its derivatives or the once-ubiquitous “party on.” (There are dissertations to be written about the effect of “Wayne’s World” on the language.) By the time the nascent Oxygen Network used it as the name of a show and a web site in the late nineties, the usage was probably still hip but hardly new.
