no pain, no gain
(1980’s | athletese | “can’t get something for nothing,” “there are no shortcuts,” “suffering ennobles”)
This expression blossomed in the eighties, along with various fitness crazes, notably weightlifting and aerobics. Its popularity — not its origin, for there are several earlier sightings — is frequently attributed to Jane Fonda’s first workout video (1982), which for all practical purposes created the aerobics fad. I’m pretty sure Fonda did not use the expression, at least not in the first video of what proved to be a long series; if she played a role in adding “no pain, no gain” to our vocabulary, it must have come later. But it couldn’t have come much later, because the phrase was tiresomely established within a few years.
It’s one of those expressions, like “pick your battles,” that sounds proverbial but isn’t. It was rare at best in English before 1970. (I have a compendium of proverbs and maxims published in 1888, which includes two that take advantage of the rhyme, but neither is as pithy as our latter-day proverb.) That doesn’t mean it wasn’t proverbial somewhere else, however. Google Books found three references before 1975 from Hindu texts — in the form of commentary on scripture or advice from swamis. The first hit in LexisNexis falls in an article about a guru based in Washington, D.C. (1979), though it’s not clear he uttered it. In such a context, the phrase has nothing to do with exercise or building muscle. It’s a matter of spiritual enlightenment, engaging in meditation to overcome desire and distraction. The pain is mental and emotional, but the gain is transcendent.
Well then us durn Americans got hold of the phrase and, not being noted for devotion to the finer things, we dumbed it down into gym fodder. (Many of us became familiar with the phrase from Soloflex ads in the early 1980’s.) The switch from mind to muscle remains our contribution to this expression. That and using it constantly, as we have since 1985 or so. The meaning seems blessedly straightforward and has from the beginning. In order for new things to grow, old things must pass away, and that hurts. Old muscle fibers must be torn in order to produce new, denser tissue. Mental deformations and emotional reflexes that hold us back must be confronted, dealt with, and overcome — and it will be a struggle. Self-improvement requires sacrifice. It’s cognate to the old prejudice that medicine can’t be effective if it doesn’t taste bad. If exercise doesn’t make you sore, it can’t be doing any good.
The rapid rise of the expression soon inspired a backlash, and most trainers would probably agree that “no pain, no gain” is not at all the same as “more pain, more gain,” as some might be prone to interpret it. Pain is a warning signal, and if too severe it will do more harm than good, in the gym as in the ashram. In this respect, “no pain, no gain” resembles “What doesn’t kill me makes me stronger.” It may be true in small or even moderate doses, but some diseases or experiences leave us permanently weakened, or in a coma. (Isn’t severe depression a sort of emotional or spiritual coma?)
The phrase may have an Eastern origin, but in the West it sounds puritanical or masochistic, preferred by those who love punishment. But that may be a consequence of simple-minded Americans trying to make a limited principle cover every situation. Some rules were never intended to be guiding stars. Use in moderation. Proceed with caution.
feeding frenzy
(1980’s | businese?, journalese? | “pigs at the trough,” “every man for himself,” “swarm (of . . .),” “melee”)
This expression had to learn to stand on its own in order to take its place in our vocabulary. It was quite possible in 1980 to use it as part of a simile, almost always juxtaposed with the noble shark. “Feeding frenzy” seems to have been invented after midcentury to describe the way hungry sharks eat; the first citation in the OED dates from 1960. The first citation I found in LexisNexis that dispensed with the sharks occurred in 1981, in the context of corporate mergers. Within a few years, it had come to be applied to lots of other things: the press, government officials, greedy litigants, or investors, for example. (Nowadays it may often evoke criminals or consumers.) It’s my sense that the merger mania of the eighties did more than any other cultural excrescence to propel “feeding frenzy” into prominence. Now the phrase most commonly refers to the press, especially the entertainment press, as in “tabloid feeding frenzy.” We have no trouble envisioning mobs of desperate reporters and photographers competing for the smallest scraps of sensation. But it’s also used to talk about political reporting, at least partly as a result of political scientist Larry Sabato’s 1991 book, “Feeding Frenzy: Attack Journalism and American Politics.” And then, surprise! sometimes it just refers to a lot of people stuffing their faces, as at a barbecue or banquet.
Metaphorically (for now we switch from simile to metaphor), “feeding frenzy” denotes a group of people competing in aggressive or violent ways. The violence may be wholly figurative, or it may be real, as when newshounds or shoppers jostle each other. Feeding frenzies usually arise suddenly and end soon, but always in relative terms — the feeding frenzy following Lindsay Lohan lasts until she can duck into a car, but dueling corporations can keep it up for months.
One highly mutable aspect of this term: when does it have an edge of contempt? When corporate executives snap up profitable firms, it doesn’t seem to bother anyone very much, but when paparazzi hound Princess Diana, the sneer is clear. For profit-minded executives, or consumers on Black Friday, the feeding frenzy is the norm, nay, commendable. On the other hand, some of us cling quaintly to the notion that unchecked intrusion into celebrities’ private business is not a worthy occupation. The expression may call to mind indiscriminate acquisition (especially when referring to wealthy collectors at tony auction houses), crude gorging, or even bestial cruelty. But it may also suggest fierce competition, which we generally celebrate, at least in the abstract. Most of the time, “feeding frenzy” bears at least a touch of scorn, but you have to watch the context. It’s not always there.
Who remembers John DeLorean? His lawyer in 1983 called prosecutors’ pursuit of his client a “feeding frenzy,” but with a twist. He evoked the image of sharks surrounding a wounded creature, eager to tear it to pieces. Why isn’t this idea more common? Sharks go ape at the scent of blood, right? We’ve all learned from a hundred disaster movies that the minute a drop of blood hits the water, the sharks close in. Real life has something to say about it, of course: Executives prefer to go after a healthy corporation to a hemorrhaging one, and the gutter press doesn’t wait until the movie star is down to start kicking. I suppose it’s unrealistic to expect such similes to hew too faithfully to their referents.
Back to the literal use at last. When we use the term “feeding frenzy,” it’s always aquatic animals, for some reason. Sharks, mainly, occasionally some kind of fish. Why? Rats, coyotes, and other land animals feed in voracious packs, but we don’t use the term in that context. Maybe sharks are just more evocative, or maybe “Jaws” was the most influential film ever, but this continues to seem strange to me.
glitch
(1980’s | enginese?, computerese? | “hiccup,” “technical difficulties,” “slip,” “screw-up”)
A word with many meanings and many parents, or maybe fewer than it appears at first glance. “Glitch” seems to have arisen in the sixties, probably in the space program. It may be descended from “glitch” used as a technical term in astronomy to mean “sudden change in rate of rotation of a pulsar.” But I’m not sure which came first or how they may have influenced each other. As early as the seventies, computer jocks commandeered the term, generally using it to mean “inexplicable malfunction.” (That seems to have been what NASA engineers meant by it, too.) A rather charming IBM magazine ad from 1981 offered to fix “bugs, burps, and glitches,” which might afflict even their superior machines. “Glitch” could refer to an interruption in an electrical power supply or signal, including audio and video broadcasts. These things are all related, and there does seem to be an underlying common meaning, to wit, a failure of technology, usually minor or temporary but not necessarily.
Or at least that was true thirty years ago, but now glitches need have little or nothing to do with technology. My sense is that the word is often reserved for predicaments caused by technology, but the link is no longer necessary. Another connotation of the term in its early life was that a glitch was not trivial, but not catastrophic, either. I found an article in “Analog Science Fiction” (1971) that explicitly contrasted “glitch” with “catastrophe.” (A glitch could be neutralized by clever improvisation, even if proper tools were not to hand.) But the main point of a glitch was that no one could quite figure out how it happened. No less an observer than Norman Mailer (“Of a Fire on the Moon,” 1970) illustrated the concept with the example of an “unaccountable electrical phenomenon like the light on an instrument panel suddenly turning on when the machine it serviced was most definitely off.” A glitch was the sort of thing a gremlin might cause. Again, while this implication is no longer necessary, my ear tells me that “glitches” more often take place without direct human agency — but again, you could come up with counterexamples without looking very hard.
Most dictionaries ascribe a Yiddish origin to “glitch,” which seems reasonable. (“Glitshen” is Yiddish for “to slip” or “slide,” which is not too far from “slip up” or “fail to perform as advertised.” “Glatt,” most often heard as an adjective modifying “kosher,” literally means “slick” or “slippery.”) I wonder if the rhyme with “hitch” didn’t help it get settled. “Hitch” can also refer to an interruption, and it was used (most often in the negative, as in “without a hitch”) to allude to a problem that keeps the job from being completed. “Glitch,” to this day, is almost always deployed as a noun. I found one or two instances of the verb as long ago as the seventies, but that never seems to have caught on.
Technically, this term may be a little too old for the blog; it was used regularly in at least a few mainstream periodicals by 1980, and it may have been considered an established expression by then. Well, I’ve cheated before: “ramp up,” “hype,” “state of the art.” I got drawn into the tangled origins of the term and kept going.
globalization
(1980’s | businese | “expansion”)
Here we step once again into the vexed, murky waters of politics and economics. No place for your humble observer of the language, but this flabby gem is indeed a new word, although it follows on the heels of two slightly older expressions, “global village” and “global economy.” (My father referred to the global village in an e-mail, which inspired this week’s post. Thanks, Dad!) “Globaloney,” a favorite of mine, is actually much older, coined by Clare Boothe Luce in 1943. Ever since the Renaissance, we Europeans have been moving outward, exploring and colonizing. From the beginning, most of that restlessness has been driven by capital — money looking for more commerce or access to more resources. In the U.S., we started late, and we needed about 125 years to pacify our part of North America and get the network firmly in place to ensure that the rich keep getting richer. That flowered as far as it could in the 1920’s, until the Depression brought it all to a halt. But hiring picked up again a mere fifteen years later, during the next war, and we were ready to take over the world in 1945, by which time we were the only ones available.
More trade was good for business, so the money that had formerly zipped around the U.S. now started to zip around the world. Not just money, but information and people, too. There are strong arguments to be made in favor of travel and trade, which among other things tend to prevent nations from growing too isolation-minded and turning into Nazi Germany. The price of tea in China became less of an abstraction, but so did the cost of contaminated imported food. Globalization opens up lucrative opportunities for a few in position to take advantage, but it also increases the risk of contagion and epidemic, medical, financial, or otherwise, for all of us.
Globalization has made a lot of people mad, from left-wing laborers to right-wingers leery of world government. It’s easy to depict it as one more instance of the plutocrats taking away our livelihoods; the rise of the word has unquestionably gone along with the widening gap between rich and poor. (But there is always a loud, well-financed chorus to point out that hiring cheaper labor and exploiting new regions is just sound business sense.) “Globalization” started to appear in newspapers in the early eighties, firmly the property of businessmen talking about finding new ways to make money. The emphasis then often fell on high labor costs (“outsource” came along around the same time), and the word was more often used to justify laying off U.S. workers than to justify opening new markets or new mines.
Back then, it was not unusual to talk about globalization of markets, or particular industries, or technology, or the economy itself. But it was already possible to talk about globalization unadorned, a mysterious, superhuman process that just happens, which cannot be diverted or appealed. That’s generally how it’s used now, and such usage benefits those at the top of the heap. If your language conveys the notion that the ability of wealth to accrue and exercise power is natural and unstoppable, most of us will forget that beyond a certain level, inequality is caused by decisions made and policies carried out by living, breathing human beings, who individually put their pants on one leg at a time, but who collectively run the economy in their own interest. There’s nothing automatic or natural about it, and we have to fight hard to be heard at the best of times. When we stop paying attention, whether out of self-satisfaction, fatigue, or wishful thinking, the one percent will bend the rules further to take more for themselves and squeeze the rest of us still harder.
unpack
(2000’s | academese | “analyze,” “get to the bottom of,” “delve into,” “explicate,” “untangle”)
This is one of those expressions, like “get with the program” or “reference,” that seems unnecessary, because we already had so many words that meant the same thing and covered the same ground. It is also an example of an old word put to new use, repurposed, as we might say, like “concerning” or “dress down.” One figurative sense of the word was loosely comparable to today’s usage: that of “unpacking” memories, where the mind is likened to a storage locker full of old stories that must be extracted.
“Unpack” remained firmly the property of articles on travel and tourism as late as 1980, but it grew a new meaning by the end of the decade. Suitcases were no longer the thing; now one might unpack a concept, a character, or a government policy. I’ve even seen “repack” used in conjunction with “unpack,” as in this jargon-laden quotation attributed to a U. of Oregon professor in 2009: “Multimedia projects offer an effective platform for students to unpack and repack technical problems such as grammar in a creative way.” I’m not sure what it means, but I think it’s two different methods of explaining grammar to younger students: first you demonstrate the elements and then they use that knowledge to recognize general rules and patterns. Well, maybe. It sounds less convincing when I spell it out.
It’s not easy to tell from LexisNexis where the new meaning came from. The earliest instances turned up in book reviews, articles about theater, that sort of thing. The first time I remember hearing the word used to mean “sort out” was in an English class with E.D. Hirsch at the University of Virginia in 1988. That memory leads me to suspect an academic origin, and the literary locus of most early sightings doesn’t contradict that, although, of course, arts writing is often a conduit for therapese. While there is no reason that “unpack” couldn’t have come out of therapese, I don’t know of any evidence that therapists use it much, even today, much less in 1990. The history of academic fads plays a role in my speculation. “Unpack” came along roughly at the end of deconstruction’s heyday. The avant-garde had moved on by the late eighties, but they still needed a word that meant “unravel” or “take apart,” because that is what academics do. “Deconstruct” had done the job pretty well for a decade or so, but it had taken on baggage during that installment of the culture wars; in fact, it had started out with baggage, because it was popularized by a French philosopher. It’s not hard to imagine herds of English professors on the lookout for a word that would denote the same action without partaking of the controversy. E.D. Hirsch is no post-structuralist, but he still needed a way to say “let’s break this down.”
“Unpack” didn’t see widespread use until after 2000; by 2010 politicians had started to use it and it had spread well beyond the arts ghetto. Travel writers still need it, of course, and no substitute has presented itself for what you do when you get home from vacation. But the new meaning has firmly established itself. Unpacking is indicated whenever there is involution, when not every facet of a problem or concept or personality is visible at the same time. It is most often a matter of revealing hidden assumptions underlying an argument or a directive (we used to say “expose” or “lay bare,” which applied to the assumptions, not the framework). There is no essential element of mystery involved — in fact, when a professor announces she is about to unpack something, she already knows the point she intends to make — but complexity is necessary. Does “unpacking” necessarily suggest exposing deception, or at least revealing that which someone else would rather keep hidden? It is true that when one wants to conceal something, one tends to cover it over, whether one is dealing with concepts or contraband. So the term may be used that way, but certainly not always.
Thank you to my friend and occasional colleague Mark from the Bronx for proposing “unpack.” Keep those cards and letters coming!
repurpose
(2000’s | journalese (film)?, computerese? | “put to (new or different) use,” “recycle,” “adapt,” “convert”)
It sort of rhymes with “refurbish,” and there’s often an implicit sense of renewal or revival in “repurpose” that lies beyond our everyday definition: taking an object intended for one use and giving it another. From a specialized origin, this verb has grown to encompass more and more objects. Now a favorite of advice columnists, interior designers, and community planners, “repurpose” came into common use a scant twenty years ago, mainly in the argot of film and television executives.
In those days, repurposing was something that happened to content; the sense might be summarized as “old content, new context.” Say you owned the rights to a bunch of old movies, or new data, and you wanted to exploit them, so you figured out a way to work them into a new form — putting them on-line, say — or to change them a little in order to attract a new audience, like adding some sort of interactive feature. Film executives were the pioneer repurposers, and the mid-nineties was a particularly opportune time for such a term to come along, what with the dawn of DVD’s and the internet. Those wily executives needed a word that disguised the fact that they were peddling the same old stuff; the idea was to make as few changes as possible when repurposing your content. Maybe some technical adjustment was required, but the point was to save money by transplanting what you already had. Sometimes it meant as little as “copy text from one web site to another,” so that after a year or two it became a way of referring to a rival’s lack of creativity or lack of respect for the customer. You might hear, “We don’t just repurpose our content.” You had to alter the content so it would work in the new context; failure to do so hurt sales.
Which is closer to what “repurpose” means now. It stretches much farther today, not only to objects, but to money, food, or even ideas. In the case of abstractions, it generally means something like “redirect,” which is not so intuitive. But in general, the term has become more intuitive, not less; when you repurpose a building or a Christmas ornament, you are deliberately deviating from the way it was intended to be used; that is, you are giving it a new purpose, or “purposing” it again. When the word first took root in Hollywood, the purpose of the classic movie remained the same as ever — to make money for film executives. And “repurpose” suggested a conscious effort to make something new out of an existing product, not just repackage it. That’s why there was always something misleading about using the word that way. Now the deception has filtered out of the word, and with it the whiff of the arcane that wafted through the room when industry experts spoke sagely of repurposing “I Love Lucy.”
When did “to purpose” become a verb, you ask? It’s been a verb for a long time, meaning “to resolve (to do something).” But so far “repurpose” has not spawned a new definition for “purpose,” as in “assign a function to.” There is no root form of the verb; it requires a prefix. One can imagine other prefixes, or even the rise of “Oh, baby, purpose me!” But not yet; to date, “repurpose” stands alone.
The early career of “repurpose” is somewhat more interesting than I have let on. There is a ten-year gap in LexisNexis between the first (1984) and second appearances of this term, an extraordinary occurrence. Google Books shows only a few scattered instances in the eighties. “Repurpose” was used in an exceedingly specialized context, and even after reading several examples, I still can’t figure out exactly what it meant, but it had to do with a process applied to videodiscs that were playable on a computer, which is why I suspect that the true wellspring for this word may be computerese. But it didn’t start to show up in the mainstream press until the mid-nineties, and it doesn’t seem to have spread beyond the entertainment press until after 2000. One of the first hits in LexisNexis dates from 1994 in The New Republic: “Los Alamos [New Mexico], like most other defense-based civic economies, is searching for ways to repurpose itself.” Pretty advanced for 1994; even blocks and single buildings weren’t candidates for repurposing back then, much less entire cities. To this day, the reflexive use has not caught on, but don’t be surprised if it does.
Thanks to lovely Liz from Queens for nominating this word for investigation this week!
foodie
(1980’s | journalese | “gourmet,” “epicure”)
A British import, like “over the top” or “at the end of the day,” this word flared up when a bestseller used it in the title in 1984: “The Official Foodie’s Handbook.” (Only a couple of instances have been found in print before then, at least one of those attributable to one of the authors.) And it meant then what it means now, someone obsessed with cuisine — ingredients, preparation, or both — to the point that it is easy to make fun of them. Although some use the word with pride, non-foodies generally use it with at least a hint of condescension or exasperation. Food is important, and it’s a fine idea to take pleasure in eating it, as anyone who has spent time around an anorexic can tell you. But foodies may do so to such an exaggerated degree that non-initiates can’t really take them seriously. Their raptures often come across as forced and stagey, more a matter of competing with each other than expressing a genuine appreciation for the gust and lore of their aliment. Nearly all of us delight in taking pretentious know-it-alls down a peg, so we are prone to suspect that most of these people haven’t the least idea what they are talking about. And we are bound to be right a high percentage of the time. (Urban Dictionary offers some sulfurous definitions of the term along those lines.) The most likely antecedents were “preppie” and “yuppie,” words that had taken hold only a few years earlier (“Foodie” authors Ann Barr and Paul Levy alluded directly to “The Official Preppie Handbook” in their title). My first thought was that “Trekkie” lurks in the background. It conveys the same sense of fervid, fanatical devotion that “foodie” does — far better than “preppie” or “yuppie.” I don’t know how common “Trekkie” was in England, though.
Another reason this word is so annoying, aside from the people it applies to, is that it partakes of the irritating British habit of taking perfectly useful words and adding diminutive or cutesy suffixes, hacking off syllables as needed. (The British have a long tradition of swallowing syllables, but tell me, is “Featherstonehaugh” really pronounced “Fanshaw”? I suspect that was Wodehouse’s idea of a joke, but I’ve never been quite sure, what with Cholmondeley and Marjoribanks.) I ask you: “Chocky” (a piece of chocolate), “preggers” (pregnant), “botty burp” (fart), “brolly” (umbrella), “champers” (champagne), “sarnie” (sandwich). There are dozens of them. Then there’s things like “billy-o,” “tickety-boo,” or “moggie” (cat), where the word sounds like it was invented on a particularly obnoxious kids’ television program, even though no orthographic surgery is involved. Are these not the effluvia of a decaying culture? This from the once-proud nation that gave us rhyming slang, which is both amusing and intellectually stimulating, when not downright mystifying. Why does “me old china” mean “old friend”? Well, it’s really “me old china plate,” which rhymes with “mate,” which means “friend.” Then you get rid of the actual rhyming part because that would make it too easy. That’s three steps you have to go through — not for dummies. Rhyming slang is still around, and new terms continue to come forth (as in “britney [spears]” for “beers”), so it hasn’t been supplanted. But it’s a shame that the Brits insist on obscuring a powerful slang tradition with a glut of cloying, infantilized, and frankly unnecessary expressions. Yes, I am a blogger who loves words, but if you call me a “wordie,” I will find you and wreak dire vengeance.
Thanks to my sister for proposing this week’s expression! I was surprised that I hadn’t made a note of “foodie” on any of my rather disorganized lists, but surprises like that keep this blog entertaining, for me, at least.