
Lex maniac

Investigating changes in American English vocabulary over the last 50 years

Tag Archives: celebrities


cringeworthy

(2000’s | “embarrassing,” “appalling,” “repellent,” “disgraceful”)

The reason the word is so effective is that cringing is a very strong, primal reaction of instinctive avoidance. That which is cringeworthy is acutely shameful, disgusting, etc. — not just any old awkward moment or fleeting contretemps. The term so often attaches itself to the least excusable antics or pratfalls of celebrities (or anyone unfortunate enough to be in the public eye). For it is beloved of gossipmongers and social media addicts; anyone can humiliate themselves, but the word turns up disproportionately in celebrity journalism, or so Google makes it appear. Much celebrity journalism exudes more than a whiff of Schadenfreude, and “cringeworthy” suggests a certain pleasure in another’s discomfiture beyond the word’s primary effect of evoking the discomfort in oneself. My sense is that originally “cringeworthy” was used often in artistic contexts, to talk about a song, say, or a performance, that left you feeling sorry for the purveyor, and sorry for yourself for having endured it, too. Over time it has come to apply more often to situations, utterances, or actions that leave the feeling of having experienced something indecent, a low point in another person’s conduct that you would rather not have witnessed and can’t unsee, as today’s kids say.

This expression straddles the line between a strictly personal reaction and a social consensus about what is objectionable and what isn’t, which must go on to rank the objectionable things so we’ll know exactly when to start cringing. When you describe a text or act as cringeworthy, you are appealing to a set of boundaries that most people, or at least most people who have any interest in the field under discussion, would subscribe to. Each of us grimaces and shies away as an individual, but we are animated by a shared understanding of the awful.

“Worthy” as a suffix is not unknown, but seems kind of quaint. Praiseworthy, blameworthy, credit-worthy, seaworthy. It turns up now and then in surnames, as in Galsworthy. “Cringeworthy” was, in fact, the name of a character in the long-running “Bash Street Kids,” a recurring feature in the British comic book “The Beano,” and almost sounds like a name in a Dickens novel, but not quite. The mating with “cringe” works well because it too is an old-fashioned word. I daresay most people know what it means, but you don’t hear it much in casual conversation (the rise of “cringeworthy” may propel it into greater prominence). Two quasi-archaic expressions shoved together — a natural. Had the word been invented in the U.S., it might have come out “cringe-making,” but it is a Briticism; it was common in Commonwealth countries by the mid-nineties, a decade or so before it caught on over here. (A bit more history for them as wants it.) You do hear “cringey” sometimes, which means the same thing.

No mean Anglophile herself, Lovely Liz from Queens proposed this week’s expression. I say, thanks, old top!


me time

(1990’s | therapese? | “time to (or for) myself,” “free (or spare) time,” “break”)

With its echoes of the me generation and “all about me,” this expression can’t help sounding selfish. Yet me time is often touted as a way to make us more useful to others and is urged especially on parents. It’s what you need to refresh yourself so that you can handle your duties — particularly at home — with renewed vigor. Me time is not for the self-centered; it’s for the worn out. The emotional equivalent of breaks during the day at an office or construction site. Even the flintiest bosses have been compelled to recognize that employees will be more productive and last longer if they have some time to relax or occupy themselves with other matters, and dispensers of advice note that we often drive ourselves harder than anyone else and have to learn to cut ourselves some slack. If you don’t make time for yourself now and then, your duties will get even more arduous and exhausting, hastening a breakdown (and making you useless to those who depend on you).

Can “me time” be frivolous? Sometimes it is presented as pure hedonism, but much more often as an antidote to stress, a way to preserve equanimity in the midst of a demanding schedule. Which makes it an obligation, but to whom? A complicated mix of oneself and others. It is not essential to the expression, but “me time” may take a definite moral tone — not an indulgence, a responsibility.

Though not common in the nineties, the expression appeared occasionally, especially in the British and Canadian press, so it may be a Briticism. One pictures the exasperated English mother — having run through story time, nap time, play time — banishing the kids and declaring “me time!” If it didn’t originate in the States, it found its way here soon enough and became much more common after 2000. Now “Me Time” is the name of a Sephora skin care product — grandiosely described as “a firming and antioxidant-powered age-delay ritual fueled by black tea” — and even the august New York Times publishes a monthly column under this heading in its Style section. Starting to lose that moral tone.

Here’s a question: Why doesn’t this phrase mean “time for me to shine,” or “pay attention to me”? Imagine a television host saying, “O.k., boys and girls, it’s ME time!” You wouldn’t have any trouble understanding that, right? “It’s Howdy-Doody time!” doesn’t mean it’s time for Howdy Doody to go off by himself and ignore everyone else for a while. It means he’s front and center. In a culture as self-obsessed as ours, we always need more ways to call attention to ourselves. But we would more likely say “my time” than “me time.”

We can thank Britney Spears for the recent spike in the frequency of this phrase. She used it innocently enough about a month ago to announce that she was taking a break from helping to tend her father, who was recovering from major abdominal surgery. We might have lauded her filial loyalty and wisdom in knowing when she needed a respite herself. Instead, a sirocco erupted in which a lot of people who need something better to do opined at length on matters that weren’t their business, and worse; Spears claims she has received death threats. Celebrities must hunger, at least occasionally, to be out of the public eye, and I suspect part of what she meant was simply “please leave me alone.” Grant me the mercy of being out of range of the tireless, and entirely otiose, celebrity gossip machine, eager to tar and feather this week’s villain for the slightest incorrect word, sentiment, or gesture. We sure know how to chew ’em up and spit ’em out.


bad hair day

(1990’s | “bad day,” “terrible day”)

A term that shot into prominence in the early 1990’s, though it had been around before that. How long before, I’m not prepared to say; one on-line source finds an example as early as 1970. No one has a convincing origin story for it; it just started bubbling up more often around 1990 and caught on. (1992 was the magic year, as far as I can tell.) As in the case of glass ceiling, the force of its rise and spread suggests pent-up demand. It’s something most of us undergo, at least occasionally, unless we’re fortunate enough to be bald. An inveterate cowlick sufferer in childhood, I know the feeling well. I can’t come up with a precise old-time equivalent, but the verb phrase “look a fright” meant something similar.

From the beginning, the bad hair day had a psychological component. You didn’t just have hair that wouldn’t behave, you were compromised mentally; your mood and concentration suffered because you couldn’t stop thinking about how your hair looked and imagining the reactions of everyone around you. It’s no secret that self-image affects self-esteem, but a full-blown bad hair day could be serious, causing one to become completely ineffective until the next day. And what happened if your hair refused to cooperate then? I don’t see any reason bad hair days couldn’t stretch to bad hair weeks and months, a long-term handicap, like a bad haircut, which is not the same thing but might be a precursor of some sort.

When you consider the disruptive force of the bad hair day, it’s not surprising the phrase took on the broader meaning of “day from hell.” A bad hair day often means a day where nothing goes right and you should have stayed in bed, but it still has to start with recalcitrant hair. Otherwise, it’s some other type of impossible day.

My sense is that “bad hair day” has receded somewhat and is not as popular or ubiquitous as it was when its flame burned brightly back in the nineties. But it’s part of the language, and its meaning hasn’t changed. The phrase has settled down as it has settled in, and now bad hair days may bring on the average more rue and less panic than they did back then. Yet the same threat of temporary psychological damage remains, and studies continue to show that bad hair days can have a debilitating effect, preventing us from doing our best work.


bedhead

(1990’s | “rumpled hair or look”)

Probably a Briticism; the earliest instances I found in LexisNexis came from Canada, and to this day it seems to be more common in the non-American press. In England, “bedhead” for centuries has referred to what we could call a headboard in the U.S. I’m not sure whether that makes it more or less likely that the newer meaning — tonsorial disarray upon rising — arose in the Isles. Originally, bedhead was unintentional and therefore unwanted, but now it can refer to a studied style, one more way for celebrities to arrange their hair. It may even look sexy if it’s done right, but in the early nineties, when the word appears first in LexisNexis, bedhead was a misfortune, more to be pitied than ventured.

Bedhead strikes even before you find out it’s going to be a bad hair day. But sometimes your hair may be tamed with ritual application of unguents and elixirs, or at least Brylcreem. Someone should do a study to determine how often bedhead leads directly to bad hair and a subpar day (that would be a “bed hair day”). A substantial percentage, we may surmise, but how substantial? Thirty? fifty? ninety? If it were at the lower end, we might take modest comfort in knowing that we’ll get a break sometimes and the universe is not invariably a hostile place. All I ask is that the universe remain neutral. When it starts stacking the deck, I get offended.



influencer

(2010’s | advertese | “endorser,” “influence”)

I came across this word in the arts world, of which I am a loose member, but of course it is far more common in the realm of advertising, particularly on social media, where influencers have grown a multi-billion dollar business. It’s a nice word for “shill.” As in the old days, they may be celebrities, but they may just as well be previously little-known people (and who knows? maybe the occasional bot) who have developed a potent on-line presence. The work is done by cultivating a following on Facebook, or Amazon, or somewhere, and convincing some brand to pay for plugging its products, if you don’t have a brand of your own. Then influencers weave their spells, convincing mobs of hapless sheep to buy the product just by kvelling over it on Instagram. The more loyalty an influencer inspires, the more dollar signs light up vice presidents’ eyes.

The concept of the recommendation is much older than advertising but has always held an honored place in it. What could be better than finding out exactly what you need from a friend or a respected authority? Like everything old made new again by the internet, such an adviser must have a new name. Am I the only one that thinks “influencer” sounds like a villain of some sort? Like “the fixer” or “deep throat” in a political thriller, a name people utter reluctantly, in a hushed, slightly awed tone.

In the arts world the concept is similar but a bit less crass. Influencers have the ear of the people with money, the people on the board who decide what to program and whom to hire. So if you want to promote something, you need to worm your way into their good graces. This sense is closer to how the word was used in the eighties (when used at all). An influencer was similar to an éminence grise or power behind the throne. They didn’t get the credit or the spotlight, but they got their way. That idea remained in use in advertese up to the social media revolution, but seeking out the one right person who can make your project happen is quite different from persuading millions to whip out their credit cards.

I’ve covered a number of new expressions that end in “er,” denoting agency of some kind. Some of them have a touch of the poetic: headhunter, rainmaker, -whisperer. Some lack any sort of distinction: deal-breaker, fraudster, server. “Influencer” belongs to a group composed of awkward, hyper-literal formations that strike the ear as bureaucratese or jargon: caregiver, early adopter, facilitator, first responder, warfighter. Adding an “er” suffix is one of those linguistic shortcuts — like pasting “ize” on a noun to create a verb, or adding “ment” or “ness” to go the other way — that help establish that quality. Such affixes are the last refuge of those with no ear or sense for language who just need to come up with a new word for whatever it is. Even the more literate may resort to an “er” nonce word after painting themselves into a grammatical corner.

Influencers have become a thing in recent years, and advertisers have embraced them heartily, as excited articles pile up in trade journals analyzing the most effective means of employing their services, rules to live by, practices to shun. Micro- and nano-influencers have shorter lists of followers but may be potent within those limits; they have begun to attract their due. These things rise and fall, but we seem poised to hear ever more about influencers in the near term, at least.



exposure

(businese | “risk”)

A word of many uses in everyday language to which one has been added in the last forty years. A quick review of the wide range of meanings this term had in the seventies, say:

a. the act of learning about or experiencing a stimulus, especially an unfamiliar one (“exposure to jazz, French culture, etc.”); goes with “to”

b. the direction your window, etc. faces (“southern exposure”); no preposition

c. for photographers, how much light comes in before the shutter closes, or simply a frame of film that has already been shot (you could even have a double exposure, and that’s no double entendre)

d. inadequate covering of body parts not normally displayed, voluntarily (“indecent exposure”) or involuntarily (“die of exposure”); no preposition

e. personal embarrassment caused by no-longer-secret conduct (e.g., “he was disgraced by his exposure as a tax cheat”); goes with “of”

f. attention from the popular press, what one gets when one is a celebrity; no preposition

g. potential harm caused by ingesting or absorbing hazardous substances from the environment (such as sunlight or air pollution or radiation); goes with “to”

And now there’s

h. financial risk caused by heavy investments in a weak sector, or just too much debt; goes with “to.” Probably a descendant of g., or at least that’s the one it most closely resembles. In the aftermaths of the 2008 crash, and the 2000 crash, and the 1987 crash, and the 1981 crash, we’ve gotten used to the idea of toxic financial instruments and practices, and this usage is a natural outgrowth. While “exposure” had this meaning well before 1980 in financial jargon, the increased fragility of the U.S. economy in recent decades has no doubt helped push it outward into the general vocabulary. (Even in a purely financial context, it also partakes somewhat of e. If it partakes likewise of d., you’re in bad shape; even a good lawyer won’t be able to do much.)

One way to sort definitions d. through h. is to place each one by its potential for undesirable results. With e. and g. and most likely h., you’re worse off than you would have been otherwise, but d. and f. may cut both ways. Exposure of the body may subject you to injury, or it may give you the warped satisfaction of forcing another person to participate unwillingly in your sexual gratification. Even in the latter case, you’re still vulnerable, to arrest if nothing else. As for f., today’s darling of the gossip pages is tomorrow’s disgrace, if e. kicks in and your secret vice is found out. It may not be that dramatic; a celebrity may fall from favor simply by attracting too much attention (“overexposure”). It may be fun at first, but any kind of exposure ultimately invites danger to one’s reputation, or even one’s life.

This week’s term, with its implication that one has been caught doing something wrong, points to a peculiarity of English: we don’t have a reliable word for revealing hidden good deeds rather than hidden malfeasance. “Expose,” “unmask,” “uncover,” “reveal” itself — they all imply that one has been up to no good. “Unveiling” might work, but we use that more often about statues than about people. I was thinking about this as I tried to translate a German title that included the phrase “Enttarnung eines Helden.” “Introducing a hero” or “Exhuming a hero” might get the point across, but the first is imprecise and the second ghoulish. “Recovering a hero” has an unfortunate association with upholstery. How do you reveal that someone has acted heroically when the available verbs suggest villainy?


date night

(1990’s | therapese)

One of those expressions that has evolved a distinct new shade of meaning in the last forty years. Before 1990 or so, it was a loose, carefree expression that applied mainly to single people. As often as not, it was a shorthand way of referring to Friday or Saturday evening. Movie theaters and sports teams held promotional date nights to encourage under-25’s to come out and spend money. These uses have not disappeared by any means, but what has changed? Now date nights are the province primarily of the married, more specifically the married who sense that their relationship needs a boost. So you and your spouse ditch the kids and go out on the town and spend money. The custom has a bit more range now; you might go to a class together on date night, or church. The main consequence of the change? Date nights aren’t only for the young and frivolous any more. Now mature adults with responsibilities and work ethics are enjoined to enjoy them, too. The shift started in the nineties — I found only a few isolated instances before then.

Date nights are urged particularly on parents, but sources of stress and separation besides kids may trigger a date-night deficit. (The “daddy/daughter date night” is an occasional non-marital variation, which likewise marks an effort to improve or deepen a relationship.) Couples need to reconnect and rekindle sometimes, and many well-meaning busybodies have issued extensive guidelines for doing so. I have hinted before at the meticulously planned architecture of relationships patiently builded by swarms of counselors, therapists, journalists, et al., et al., from coffee date to date night, or as you might say, from dates to nuts. They even advise periodic spontaneity, but if you have to plan it . . . oh, never mind. It’s not the decline in spontaneity that bothers me (most people aren’t that good at it, anyway) so much as the depressing uniformity of it all. An endless stream of like-minded relationship advice, however well-meant, must dull our romantic powers. Even if it works most of the time, sometimes ya gotta throw away the playbook.

After a brief and unscientific survey of LexisNexis results over the past month or so, I’d say that while date nights are urged upon all of us by the romance industry, the date nights of celebrities are reported endlessly, creating the impression that no one else ever takes one. Why not turn that around? Report on local couples’ nights out as if they were celebrities — what she wore, where they went, how close they danced, which base they got to, that sort of thing. I wonder how many people would enjoy that, and how many would hate it. We feel for celebrities who have to fend off paparazzi, and some of us would be all the more fervent if we had to go through it ourselves. But I’ll bet a lot of people would get a kick out of such oppressive attention. After all, it would mean you are worthy, it would mean you’re as important as . . . whoever you favor. The gossip page brought to life — from vicarious to visceral.


in a good place

(1980’s | journalese (arts) | “at peace,” “in the right frame of mind,” “pleased,” “happy with the way things are going”)

To dispense first with the obvious, we’re not talking about “in the right place” or “in a good location,” or any other literal use of this week’s expression. If you can substitute either of them for “in a good place,” you’ve got hold of its old, boring meaning. Well, that’s so seventies. This week we’re investigating the emotional side of this expression when it describes a mental state rather than real estate. You’re confident and secure as a deodorant commercial, content with your lot, have a good mindset; all’s right with the world. It’s very similar to “feeling good about oneself,” which is a little older, more of a sixties expression. Because it referred to one’s mental condition, in the eighties a variant was “one’s head is in a good place.” That did not mean the same thing as “one’s heart is in the right place” (one has good motives), but the newer phrase “come from a good place” does.

Maybe I’m imagining it — I don’t think so — but this idiom seems to turn up disproportionately in celebrity reporting, and a substantial number of early uses dropped from the lips of popular singers and actors. (I collected examples from James Taylor, David Crosby, Gary Busey, even Betty Friedan.) Maybe I also imagine — naah — that it often has a lightly veiled meaning in such contexts. When a star says “I’m in a good place now,” it usually suggests (or acknowledges) that she has gone through a rough patch — drug rehab, petty theft, a bad breakup, any or all of the messes celebrities get themselves into. It’s a way of saying one has bounced back or gotten over the problem.

No one would have said an abstraction was “in a good place” a generation ago, but an NFL official used it to express satisfaction with a revised rule last month. We still use it much more readily about people, and it has spread well beyond the celebrity ghetto; any of us can use it casually to describe ourselves. It is bound to continue to spread. The phrase has long been available to talk about groups, teams, or agencies, not just individuals. European leader Jean-Claude Juncker said last fall that the EU “is not in a good place right now.” His insertion of “not” follows a later trend; once the positive expression has made its way, the negative can find its place, too.

The obvious origin for this expression is the old euphemism for heaven, “a better place,” as in “He’s gone to a better place now.” I found a transitional example in Smokey Robinson’s eulogy for Marvin Gaye (1984): “I don’t think [Gaye] would have wanted us all to be here today, sad and crying and mourning, because he’s in a good place now. He’s somewhere where nothing can hurt him from now on.” (It’s not clear that Robinson was referring to heaven; he might have meant mere oblivion.) If you want to talk about something more sublunary, you have to settle for “good,” so people won’t think you’ve died. I’m not convinced that’s the root expression, but I can’t think of a better explanation. As talk of heaven has become less ordinary and much less serious, at least in advanced circles, its watered-down variant has crossed the bourn to become the property of the living.


baby bump

(2000’s | journalese (gossip) | “belly”)

Definitely a Briticism, which is not something I would have guessed. It appeared rarely in the U.S. press before 2005, says LexisNexis, by which time the Brits didn’t even consider it cheeky any more. “Baby bump” is a creature of the gossip pages and has generally been the property of celebrities. By now it is possible to use the phrase with reference to any pregnant woman, but it still turns up on the gossip pages an awful lot. Presumably the American expression “baby boom” acted as a midwife helping “baby bump” enter the language. Alternative usage note: In recent years demographers have begun using the phrase to denote a temporary increase in the birth rate, using “bump” to mean “spike” or “uptick” rather than protuberance.

My sense is that the rise of the expression paralleled the decline in baggy maternity dresses, which were still the norm in my childhood. Pregnancy has become glamorous and has perforce developed its own style, at least among those who consider style important. Flaunting the physical changes wrought by pregnancy, rather than concealing them or at least blurring the outlines a little, is a change in fashion as well as mores, and the strong association with celebrities confirms that the baby bump is regarded as a built-in accessory which women can dress, decorate, and display to attract attention to themselves and their blessed state. Then again, some celebrities may not want the extra attention. Chrissy Teigen recently responded to on-line speculation about her pregnancy by telling fans to “get out of my uterus.” I suspect the offenders thought they were just doing their job; it’s refreshing to learn that at least some celebrities miss the sensation of privacy.

When I was young, it was customary to talk about pregnancy as a state of being, not as a feature or possession. We said an expecting woman was “showing,” or “visibly pregnant,” but I don’t think there was really an equivalent for “baby bump.” The reluctance to show or mention manifestations of pregnancy was passing away even then, reflecting deeper changes in the intersections of individuals and society. Now the swollen belly has become just one more part of the body to show off, cheapening the sanctity of motherhood. That’s the moralist’s interpretation, anyway. It’s also possible to view the shift less censoriously as an evolution of convenience, offering an informal way to refer to a common physical condition, creating a different part of speech in the process and thus permitting greater variety and flexibility in sentence-making. (Many new expressions fall into this category.) Or simply a restless pressure to expand the language; writers are always looking for new ways to say old things.

Back in disco’s heyday, we did the bump. “Fist bump” has replaced “slap me five,” and chest bumps have become much more common. Why shouldn’t “baby bump” signify two prospective mothers bouncing their bellies together, in greeting or in solidarity? I guess that would be “belly bump,” wouldn’t it? Don’t get me wrong; I’m not trying to start a new fad.


keep it real

(1990’s | journalese (hip-hop) | “don’t get above yourself,” “don’t forget your roots,” “trust your instincts”)

The origins of this expression are not in doubt. Like “shoutout,” it arose among hip-hop artists (or rappers, as non-initiates called them then) in the mid-1990’s. It is easy to understand but hard to define. Or rather, it is hard to stop defining; the expression never had a very restrictive field and has become quite versatile. But in the earliest days it most often had a personal angle: remembering where you came from — and a social angle: not letting fame and fortune distract you from the causes you’re fighting for. In 1996, according to a New York Times article, “keep it real” meant “don’t lose your edge or your anger. . . . don’t forget your ‘homies,’ the friends you grew up with and who haven’t made it.” Three years earlier, Dr. Dre had offered a gentler definition: “just being yourself, staying true to yourself and doing what you like.” Less of a social angle there, but his definition has the quality of being broad and narrow, general and particular, that this expression has always borne. It can imply sincere, unaffected, down-to-earth, unaltered (as in hair or body parts), uninfluenced by social pressure, willing to say unpopular things, brutally honest. There are many types of authenticity. Another facet of the expression is a tacit injunction to remain part of your particular minority rather than adopting a version of majority culture, as the Times definition implies.

White people use this expression quite a bit more than they did twenty years ago, but to me it still feels coded black — that is, when white people utter it, they are deliberately using a phrase they regard as characteristic of African-Americans. Along with that goes a patronizing quality, a subtle but unmistakable acknowledgment of linguistic slumming from a position of superiority. The undertones can be quite noticeable and are almost always present. When the utterance is accompanied by some white guy’s attempt at a rapper’s hand gesture, the condescension becomes overt.

living the dream

(1980’s | journalese | “making it big,” “making the grade,” “getting ahead”)

The springboard for this expression, which at first appears but an insignificant variant of “live one’s dream” (it was also customary to say “live out one’s dream”), very likely was the push for a national holiday marking Martin Luther King’s birthday in the mid-1980’s. Before that, the exact wording was unusual; the possessive pronoun or possibly “that” were the norm. In 1985, King’s widow urged us on toward “living the dream”: creating in America the ideals King laid out in his most famous speech. (Ted Kennedy a few months earlier had used the phrase “living the dream of Martin Luther King,” it still being necessary to specify which dream he meant). The exact phrase became much more common once the federal holiday was established, appearing often in mottoes and titles. King’s dream was more substantial than most, imbuing the word with the breadth of vision necessary to change society at the roots rather than simply making a better life for oneself. “Dream” also functions as an adjective, as in “dream vacation”; some of that sense of wonder survives in this week’s expression, which is never more than faintly ironic; it’s congratulatory, not derogatory. It could mean “living in a fantasy world,” but it never does, as far as I can tell. (That would be “living (in) a dream,” I guess.)

I don’t discern a connection between “living the dream” and the American dream. King’s dream was only incidentally a matter of material comfort and respectability, so maybe that isn’t surprising. It seems that any expression incorporating “the dream” ought to refer to the American dream by default, but in this case apparently it never did. “Living the dream” demands a different kind of ambition. There’s doing what you always wanted to do and doing what everybody always wanted to do. Living your dream involves the former; living the dream involves the latter. That is the change brought about by the shift in wording. You might live your dream by becoming dogcatcher, but you are not living THE dream, because most of us don’t share that goal. “Living the dream” comes up often in stories about our dream factories — professional sports, the music industry, Hollywood — because lots and lots of people want to be LeBron James or Beyoncé. To live the dream, you don’t have to be successful, but you do have to represent a group — all the people who wish they could be you, or at least live the life you’re leading. It would be tone-deaf for a high-powered star to say openly that she is standing in for millions of people who want to be in her shoes, but celebrities must endure not only stalkers and paparazzi, but all the harmless people who just want to be them.


feeding frenzy

(1980’s | businese?, journalese? | “pigs at the trough,” “every man for himself,” “swarm (of . . .),” “melee”)

This expression had to learn to stand on its own in order to take its place in our vocabulary. It was quite possible in 1980 to use it as part of a simile, almost always juxtaposed with the noble shark. “Feeding frenzy” seems to have been invented after midcentury to describe the way hungry sharks eat; the first citation in the OED dates from 1960. The first citation I found in LexisNexis that dispensed with the sharks occurred in 1981, in the context of corporate mergers. Within a few years, it had come to be applied to lots of other things: the press, government officials, greedy litigants, and investors, for example. (Nowadays it may often evoke criminals or consumers.) It’s my sense that the merger mania of the eighties did more than any other cultural excrescence to propel “feeding frenzy” into prominence. Now the phrase most commonly refers to the press, especially the entertainment press, as in “tabloid feeding frenzy.” We have no trouble envisioning mobs of desperate reporters and photographers competing for the smallest scraps of sensation. But it’s also used to talk about political reporting, at least partly as a result of political scientist Larry Sabato’s 1991 book, “Feeding Frenzy: Attack Journalism and American Politics.” And then, surprise! sometimes it just refers to a lot of people stuffing their faces, as at a barbecue or banquet.

Metaphorically (for now we switch from simile to metaphor), “feeding frenzy” denotes a group of people competing in aggressive or violent ways. The violence may be wholly figurative, or it may be real, as when newshounds or shoppers jostle each other. Feeding frenzies usually arise suddenly and end soon, but always in relative terms — the feeding frenzy following Lindsay Lohan lasts until she can duck into a car, but dueling corporations can keep it up for months.

One highly mutable aspect of this term: when does it have an edge of contempt? When corporate executives snap up profitable firms, it doesn’t seem to bother anyone very much, but when paparazzi hound Princess Diana, the sneer is clear. For profit-minded executives, or consumers on Black Friday, the feeding frenzy is the norm, nay, commendable. On the other hand, some of us cling quaintly to the notion that unchecked intrusion into celebrities’ private business is not a worthy occupation. The expression may call to mind indiscriminate acquisition (especially when referring to wealthy collectors at tony auction houses), crude gorging, or even bestial cruelty. But it may also suggest fierce competition, which we generally celebrate, at least in the abstract. Most of the time, “feeding frenzy” bears at least a touch of scorn, but you have to watch the context. It’s not always there.

Who remembers John DeLorean? His lawyer in 1983 called prosecutors’ pursuit of his client a “feeding frenzy,” but with a twist. He used the image of sharks surrounding a wounded creature, eager to tear it to pieces. Why isn’t this idea more common? Sharks go ape at the scent of blood, right? We’ve all learned from a hundred disaster movies that the minute a drop of blood hits the water, the sharks close in. Real life has something to say about it, of course: Executives prefer to go after a healthy corporation to a hemorrhaging one, and the gutter press doesn’t wait until the movie star is down to start kicking. I suppose it’s unrealistic to expect such similes to hew too faithfully to their referents.

Back to the literal use at last. When we use the term “feeding frenzy,” we always seem to have aquatic animals in mind, for some reason. Sharks, mainly, occasionally some kind of fish. Why? Rats, coyotes, and other land animals feed in voracious packs, but we don’t use the term in that context. Maybe sharks are just more evocative, or maybe “Jaws” was the most influential film ever, but this continues to seem strange to me.
