
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: advertising


risk-averse

(1980’s | academese (economics) | “cautious”)

This expression carries a couple of odd dichotomies considering how straightforward it appears. The most obvious pertains to that which it modifies; either persons or corporate bodies — whatever the Supreme Court says, they’re not the same — may be risk-averse, though presumably the risk-aversion of a corporation is ultimately traceable to individuals, whether executives or independent shareholders. More interesting is the fact that risk-averseness may proceed from two entirely different kinds of experience. A conservative corporate board avoids sudden shifts and grand initiatives because they feel prosperous; there’s no incentive to rock the boat. Yet it is a tenet of pop psychology that those who have lived through times of deprivation are suspicious of all but the safest investments, and, in extreme cases, may refuse even to keep their money in banks. (Both sides have in common assets to protect; if you have nothing to lose, there’s no point in being risk-averse.) But then there’s an absent dichotomy that one might naively expect to find in an expression beloved of bankers: the distinction between sensible risk likely to pay off and a crazy scheme. The risk-averse will stay away from both, desiring only the steadiest and safest.

The expression comes out of the discipline of economics and was most used originally in finance, starting in the sixties and becoming commonplace by the eighties. Soon it came to be used often of politicians and lawyers. Among corporations, insurance companies attract it the most; their risk-aversity comes from a visceral understanding of actuarial tables. Yet any stodgy company merits the term. Slowly but surely over time, it has spread into other kinds of prose, with movie reviewers and even the odd sportswriter resorting to it nowadays. More kinds of writers use it to describe more kinds of people — it’s not just for stockholders any more. The point of the compound seems to be neutrality; it strives to avoid any imputation of prudence or cowardice, and largely does, as far as I can tell.

In a previous post I remarked on the curse of capitalism — if one guy works harder, everyone has to work harder — and risk-aversitude bears the seeds of a different manifestation of it. In competitive markets, each company watches the innovations of others like a hawk. When they succeed, the other competitors follow; when they fail, everyone else drops plans to do something similar. Television works this way, though maybe less so now, when there are so many networks (an obsolete word, I know). Any change — introducing a new character into a popular series, or a new show about a controversial subject — carries with it a chance that your audience will flee in terror. But if it pays off, your competitors take note and resolve to do the same damn thing, backed up by shareholders who noticed that it made big profits for the other guy. Within a season or two, everyone is sick of the no-longer new gambit, and most of the imitators have made no headway. Whereupon they lose advertisers, another risk-averse group famously shy of causing offense, taking the money and running at the first sign of any immoral or objectionable acts that might result in lost market share. (Bill O’Reilly is only the latest in a very long line of such embarrassments.) Sometimes, what looks safe turns out to be dangerous. Risk avoidance, like any other strategy, is subject to misuse born of misunderstanding or bad timing, whether by the humblest investor or the loftiest board of directors.


on demand

(1980’s | businese (finance) | “on request,” “when you want it”)

When did “by request” become “on demand”? The expression in financial circles is quite old; a note or loan might be payable “on demand” (all at once when the lender calls for it) rather than on a fixed schedule over time. But somewhere in there, it took on a much wider range of use. The campaign for abortion rights certainly played a role; by 1970 it was not unusual to hear talk of abortion on demand, which became a rallying cry as laws banning abortion came under attack. That trend has been going the other way for the last two decades, too late to stop the expansion of “on demand,” which now applies to nearly everything that can be ordered over the internet, from groceries to streamed movies to academic courses. All you have to do is snap your fingers, or tap your phone. (Doesn’t sound right, does it? even though it’s a literal description. But the old meaning of tapping a phone continues to get in the way.) You may have to wait longer than you did when you left the house to supply this need or that, but we are beguiled by the ease of letting a credit card and a delivery service do all the work, making the new “order” seem all the more attractive.

So a staid and venerable financial term has sprawled all over the place like lava flow from an angry volcano, aided first by medical and cultural trends (not just abortion — drug treatment and medical care more generally glommed onto the phrase in the seventies and eighties) and then by the rise of the personal computer, which even before the internet infiltrated our lives occasioned much talk of providing computational or word-processing services on demand. The phrase has become a hyphenated adjective as well. “On-demand economy,” based on people spending money from their smartphones, is a phrase you will hear more and more.

There seems to be an implicit democratization at work, too. If you have enough money, just about anything is available on demand, and that’s been true for centuries, making allowances for the fact that the number of things we want, or think we need, has grown over time. Now you don’t need much money to acquire goods or entertainment on demand. If money can’t buy it, it’s not so easy. We may forget that not everything desirable can be had at the click of a mouse.

I’ve suspected for a long time that the internet has completed our transformation into a nation of three-year-olds, a trend initiated by the Sears Roebuck catalogue and the rise of advertising in the late nineteenth century. The consumer economy requires people to keep coming up with new stuff to want and must continually devise quicker and more reliable ways to get it to them. eBay, for example, consummates a huge number of “buy it now” transactions every day. Is that much different from “Want it NOW” or “gimme NOW”? When it comes to tangible items, it’s not even instant gratification — that CD or toaster won’t fall into your lap the minute you click “confirm and pay” on Paypal. But we’ve learned to treat it as instant gratification; making the purchase is as good as holding the object of desire in our hands. Amazon wants to use drones to deliver packages faster than ever; next year it will be something else. We have created an economic monster that requires our appetites, and the means to sate them, to continue growing indefinitely. How long can we keep it up?


wow factor

(1980’s | journalese (film)? advertese? enginese? | “appeal,” “oomph,” “oohs and ahs,” “brilliance”)

Inasmuch as “wow” and “factor” both have relatively long and complicated histories, perhaps we should begin there before considering their union. “Wow” appears to go back to a Scots interjection, which could be laudatory or derogatory, and our modern understanding of the word emerged even before the beginning of the twentieth century; by 1925 it was going strong as an interjection and available as a noun or verb. The interjection is far more common than the other two today and probably always has been. “Factor” is an even older word that early in the twentieth century meant “gene,” basically (allowing for evolution in our understanding of genetics); now it is defined much more generally as “element or constituent, esp. one which contributes to or influences a process or result” (OED), especially if it’s important and its action is not well understood. “Factor” preceded by another term to denote a particular substance or catalyst is quite common in medicine, “Rh factor” being a longstanding example. “Risk factor” no doubt started life as a medical term but now flourishes in other fields. “Factor” became popular in Hollywood during the seventies, when it followed “Delta,” “Neptune,” “love,” and “human” (twice) in film titles (they all had to do with science fiction or espionage). And, to complete the picture — or the confusion — “wow factor” was used occasionally among stereophiles before 1980 to talk about irregularities in playback speed of tape decks and turntables, as in the phrase “wow and flutter.” So it seems the stage was well set.

By the mid-1980’s, the phrase started turning up in writing about entertainment (particularly films and television), computer software, merchandise more generally, and even service industries like banking. One early exponent was marketer Ken Hakuda, who used “wow factor” in 1987 to talk about his success in selling toys which he freely admitted were not useful or valuable except as a source of mindless fun. He used the term to refer to a highly visible feature of a product or its packaging that makes a strong, immediate impression, causing shoppers to whip out their wallets. That quality of impressiveness constitutes a common denominator among objects blessed with the wow factor. I’m not willing to take a firm position on the origin of this particular meaning. If I had to guess, I would say Hollywood, but advertese seems like an equally logical breeding ground, and I can’t say it didn’t start there. Because the phrase goes frequently with technological advance (especially when you’re talking about cinematic special effects), it is possible to argue that its source is enginese. While two of the earliest citations found in LexisNexis are due to Steven Spielberg and Harrison Ford, the very first (1984) was in the title of Miss Manners’s column, of all places. Did she supply the headline, or do we owe it to a forever anonymous editor? By the mid-1990’s, the expression was no longer extraordinary and had shed quotation marks, exclamation points, capital letters, and such tricks of the trade.

If you looked even cursorily at the pre-1980 equivalents listed at the top of this entry, you may have surmised, correctly, that I struggled to find convincing synonyms from the old days. That is because we used to say the same thing with adjectives — e.g., dazzling, eye-catching, awe-inspiring, cool — or verb phrases: knock your socks off, set something apart, jump off the shelves. Many new expressions have ensconced familiar ideas in new parts of speech, which usually is a net gain for the language. More ways to say the same thing reduce monotony and open up room for small but significant variations in connotation. I’m inclined to consider the popularity of “wow factor” deserved. It’s short and to the point. And the ground meaning is quite clear, though it can imply two slightly different things, just as in the sixties, “wow” conveyed two different levels of excitement. One was the old gee-whillikers gob-smacked elation at seeing anything unexpected and pleasing. The other was quieter, more meditative, as in the pothead grokking the universe as he exhales. No squealing or jumping up and down, but the profound sense of something worthier than oneself that must be absorbed and appreciated with a drawn-out “wow.” “Wow factor” has always leaned more heavily in the direction of the former sense, but it can shade toward the latter sense as well, and seems to do so more often as time goes by. Not that the two meanings are all that far apart.

It has occurred to me to wonder if we should hear this expression with a whiff of the tawdry or meretricious. Given its early use and likely origins, it’s not hard at all for an old snob like myself to inflect it this way. But that would demand an ironic edge that I rarely or never hear when the phrase is used. A “wow factor” is a good thing that will impress the audience, sell the product, or make something stand out. The idea that there must be something cheap or slutty about it never seems to have taken root.


hold that thought

(1990’s | journalese? | “keep that in mind,” “we’ll come back to that,” “hang on”)

This expression is a bit of a dark horse. It slipped into the language without fanfare somewhere between 1970 and 1990 and did not get fully established in print until at least the latter year. An early adopter, sportswriter Thomas Boswell, used it a couple of times before 1990; Ross Perot said it in 1992 (I don’t associate it with him particularly, unlike some other characteristic phrases). One thinks of “hold back” (as in a dam or fence), “hold to,” or even “hold with” (affirm, believe, approve of), but none of those seems like a proper ancestor. “We hold these thoughts to be self-evident” doesn’t have the same ring as Jefferson’s canonical phrase, and it’s not the right meaning anyway. “Hold” in this case simply takes the place of “hang onto” or “suspend.” “Hold on,” “put on hold,” or “hold everything” are much more like it.

“Hold that thought” has always had a bit of contradiction built into it, or at least the potential for one. As the phrase is normally used, it asks the hearer to set something aside but also keep it in the forefront of the mind, prepared to reintroduce it at the first opportunity. Take it away, but don’t let it get away. So you rein in the idea on the tip of your tongue, knowing a more opportune moment will soon arrive. In the early days, the phrase could also carry a more unreserved meaning, closer to “stick with it” or “keep the faith,” but I am not conscious of seeing or hearing it used that way now. There is another distinctive feature of “hold that thought”: writers often use it to begin or end a paragraph, or even as a paragraph unto itself. That gives it an air of portentousness, an injunction to the reader to keep an eye on the notion in question. My sense is that in conversation its use tends to be more casual, but even there it may take on the same minatory tinge. One more point, for the sake of completeness: you may see “hold that thought” used in the indicative sometimes, but in that mood it lacks any particular interest; we are discussing the imperative.

My best guess is that this expression arose on television, particularly in news programs or talk shows, where interviews make up most of the entertainment. “Hold that thought” enshrines a necessity imposed by commercial television, which dictates regular breaks in programming, often of two minutes or even more, well beyond the retention span of most of our fellow citizens. Let’s say an expert guest finally gets going just before the host cuts to a commercial. In such cases, the interviewer needs a polite, encouraging way to ask the speaker to take a break and pick up where she left off, and also to enjoin viewers to keep track of the topic through a volley of detergent ads. “Hold that thought” plays that role admirably, I think. The New York Times (April 26, 1987) put it like this: “Television is not always a great place to explore ideas that are complex, subtle or slippery. Things get in the way: a smart-aleck host, the scarcity of time, ‘hold that thought, here comes a station break.'” Sometimes “hold that thought” appears when there is no pause, as in cases where it means “wait while we introduce a related concept” (this usage is available in prose as well as speech). But most often it portends an interruption or delay. That’s why the alternate sense of this expression — “cling to an idea” — didn’t stay in the running. “Hold that thought” was needed for other things.


play one on TV

(1990’s | advertese | “know something about it,” “fake it”)

The “one” in this fixed phrase refers most often to doctors or lawyers, although it has any number of possible antecedents. It is used mainly as a disclaimer by ordinary people, normally in the negative, as in “I’m not a _______ and I don’t play one on TV.” That sentence signals that I lack genuine expertise, however well-informed I might be in general. Sometimes it is used to accuse another person of faking or pretending; in this sense the speaker appeals to the deception inherent in acting, taking a puritanical view. An early example comes from comedian David Steinberg, who described Ronald Reagan as not a president but someone who played a president on television. And sometimes the expression is used, specifically by actors, to claim authority to talk about the topic at hand. In this sense, it emphasizes the preparation and commitment required to play a dramatic role convincingly. The idea follows, in a backhanded way, from the disclaimer cited above; if being unable even to impersonate an M.D. counts as further proof that you are no expert, then impersonating one ought to confer some expertise, however evanescent in practice. Less often, the phrase is used to describe someone who really looks the part, when used with “could” (a successor of the old expression, “sent from central casting”). The phrase is used quite literally a surprising percentage of the time, of or by television actors, but it has a strong ironic tradition as well. The hipness that came so naturally in its early days has persisted.

LexisNexis and Google Books (and TV Tropes) agree that “play one on TV” didn’t exist before 1984, when a commercial pitched Vicks Adult Formula cough syrup with actor Chris Robinson (Dr. Rick Webber on General Hospital) narrating. A couple of years later, he was replaced by Peter Bergman (Dr. Cliff Warner on All My Children) — the Bergman version can be found here. The noteworthy point about the commercial is that it made explicit use of the premise that you should not trust an actor playing a doctor for medical advice. Like many ad agency products, this one packs a lot of aporia into thirty seconds, so I will take the liberty of summarizing it. The narrator notes that if your child were coughing, you would go to the doctor and get the best medicine, but when the harried mother has a cough, she rummages through the medicine cabinet (“playing doctor at home,” putting your unqualified self in place of the doctor, just like the narrator, get it? and with a bonus reference to titillating children’s games) and grabs the kids’ cough suppressant rather than one specially formulated for harried mothers. It’s a dizzying ride: first the actor suggests that you shouldn’t take his word for it — he just plays a doctor on a soap opera, after all — and that taking the wrong cough medicine is as dumb as listening to an actor spout medical advice. But by the end of the commercial, by gum he’s given you medical advice: you are supposed to rush out and buy Vicks Adult Formula. Don’t listen to me; do listen to me. The disclaimer carefully planted at the beginning softens up your defenses and primes you to trust the fake doctor at the end.

When critics in the eighties talked about this commercial, they tended to miss most of the ambiguities but did latch on to the idea that it takes a lot of chutzpah to trot some guy out there who doesn’t know anything about medicine to tell you which cough syrup to buy. That reaction still erupts when an actor claims special competence derived solely from playing a certain role. We are smart enough to know that we shouldn’t trust advertisers, but most of us are not smart enough to figure out all the different ways we are being manipulated, and sooner or later we succumb.

One oddity courtesy of LexisNexis: This phrase comes up in the search results almost exclusively in U.S. publications, very rarely in Australian, Canadian, or British sources. Such a pattern is very unusual in the kind of expressions I look into. Most new expressions circle the globe quickly and turn up in English-speaking sources in Asia, Europe, and North America, but not this one. I’m not sure why that should be, and it may change over time. It may suggest no more than the truism that American culture is steeped and pickled in television to a degree not seen elsewhere, or it may just have to do with our standards of truth in advertising.


how I roll

(2000’s | journalese? | “my way or thing,” “what I like,” “how I do things”)

I learned this expression from my girlfriend’s daughter. It had its day in our household last year and has since receded, although it will undoubtedly rear its head again. The kids didn’t invent it, though. The earliest large-scale media event I found that employed the phrase was a Pepsi commercial during the 2005 Super Bowl. I came across some older examples, but it seems safe to say that the expression gained a lot of ground after that. By the end of 2009, one writer dismissed the phrase as out of date, but that was probably true only among the avant-garde; most of us were just getting started. The pronoun varies; any combination of persons and numbers is possible, but I, we, and they seem to predominate. Oddly, one finds relatively few examples of the third-person singular, but the others all make their presences felt. It can also be used in the negative to decry an action that one does not condone.

“How I roll” or “the way I roll” has an invariable meaning. It follows the statement of a habit, preference, or wish that the speaker thinks might raise eyebrows, and pre-empts any doubts or objections. The phrase is not defensive; in fact, it implies pride in the behavior or belief, underlain by a healthy dose of “whether you like it or not.” Raise all the eyebrows you want; I don’t care. It’s supposed to feel insouciant or devil-may-care rather than emphatic or truculent, and as far as I can tell it usually does.

I don’t know which of the many meanings of “roll” deserves to be honored as the true ancestor of this expression. Dice? Dough? Drums? Cigarettes? Eyes? Tape? Bandages? Along? Over? Out? Up? On the river? With the punches? Rock and? Does it go back to driving somehow? I like the idea of a defiant French student defending her pronunciation of the letter r with a swift “That’s how I roll!” Or maybe a mugger explaining his technique for relieving drunken sailors of their money. Some of these possibilities are sillier than others, but none of them seems absurd on its face.

roll with it

(1990’s | athletese? | “take it as it comes,” “go with the flow,” “make the best of it”)

A phrase betokening resignation but not despair, suggesting the will to carry on amid adversity. It indicates relaxation rather than passivity. The origin of the expression is not as clear-cut as I thought. It seems most likely to descend from the old boxing exhortation, “roll with the punches”; another possible parent is martial arts, rather than the sweet science. But it could also come from sailing (as in rolling with the waves, but that’s not as idiomatic), or even something more cosmic (as the earth rolls around the sun, we have no choice but to roll with it). I still think the first is most likely, mostly because the phrase goes invariably with unpleasant or frustrating circumstances. Nobody ever rolls with winning the lottery; it has to be something that makes your life more difficult. And it usually is a change in conditions imposed from the outside, like bad weather, a legal verdict, or other people’s mistakes. The phrase may be used in response to a change in oneself, as in the diagnosis of an incurable disease, but only when the obstacle is presumed to be beyond our control. If you can’t make it better, you roll with it; if you can improve by applying yourself, it is assumed in our self-help culture that you will.

The expression is popular among athletes and has been for a long time, but I found examples from therapy, education, music, and popular culture as far back as 1970. That’s why I’m skeptical of a tidy origin myth for this term. “Roll with it” can be read as a distillation of the first part of the Serenity Prayer, which is closely associated with Alcoholics Anonymous: “the serenity to accept the things I cannot change.” Rolling with it means not getting wrought up about things you can’t do anything about. Just deal with it and keep moving, because resistance makes it worse. We need the stock phrase, because it’s something we have to remind ourselves to do — it feels counterintuitive, like steering into a skid. And yet it’s certainly a handy rule for a species as adaptable as ours.



hype

(1980’s | journalese? advertese? | “manufactured excitement,” “hyperbole,” “puffery”; “tout,” “boost,” “oversell”)

Not eligible for the blog, strictly speaking, this word was pretty well established before 1980, but my girlfriend suggested it and being a sensible feller, I tend to do what she tells me. (It’s not the first time I’ve bent the rules; “community service” and “unintended consequences” are both terms I’ve covered even though they came along before 1980.) “Hype” was used without gloss several times even before 1960 in Billboard magazine: “What constitutes a hype? It is the launching of tunes and records with great fanfare and thousands of free promotional records to disk jockeys and juke box operators . . .” (October 29, 1955). Note that “hype” could take an indefinite article then, which is no longer possible; our ancestors used hype to mean an act or instance of promoting, rather than to refer to aggregated examples of any old promotional efforts as we do today. Grammatical distinctions aside, hype involves ginning up interest by means of exaggeration; it doesn’t have to rise to the level of conscious dishonesty, but that implication is there the majority of the time (see below). The ubiquity of the catchphrase “Don’t believe the hype,” which owes its cachet to Public Enemy, indicates as much. Here are a couple of examples from the early days:

“You can’t hype kids into buying things” (attributed to John Roberts, an organizer of Woodstock, 1969)

“He was also known to be a hype artist by nature. He never just liked something. He loved it. And when he loved, everybody knew it.” (Arnold Shaw, 1974).

The proportions vary, but in both quotations the combination of creating a false impression and overstating the worth of a product or an idea is present.

“Hype” actually became less disreputable with this new meaning. Before 1970, it generally meant “junkie” or “needle” (as in “hypodermic”), or denoted a certain trick for swindling checkout clerks in underworld slang. Both of these meanings go back to the 1920’s, according to Lighter’s slang dictionary, so the word has had a distinct air of dishonesty, not to say sleaze, for a long time now. Both senses were pretty well gone by my boyhood and by now have been terminally supplanted. It seems unlikely that they are direct ancestors, anyway; “hype” presumably is short for “hyperbole,” unless someone has a better explanation.

I noted above that “hype” generally involves lightweight deception, or at least a tinge of dishonesty. To my ear, this seems especially true when it is used as a noun. Used as a verb, “hype” refers as readily to the efforts of professional ad men and flacks, kept more or less within ethical bounds, as to those of hucksters and mountebanks. Not that the verb never suggests lying, but it’s less inevitable. The past participle may also mean “excited and alert” (as in “psyched” or “pumped”): picture an athlete exclaiming, “We were hyped up!” after a close game. In that sense it may simply be short for “hyper,” although it means something a little different. (Mercifully, “hyper” has avoided the sense of “one who hypes.”)

What is the relation between “hype” and “buzz”? Sometimes they are treated as synonyms, but they aren’t. Hype, executed properly, creates buzz. Hyping isn’t merely exaggerating, innocently or otherwise; it’s about generating publicity. Get the town, or the blogosphere, buzzing about your product. If your hype doesn’t create buzz, you’re pretty hypeless.



tween

(1990’s | advertese | “pre-teen,” “person at that awkward age,” “kid”)

A word we owe to advertisers. It bubbled up in the late 1980’s, mainly in marketing publications, although it appeared in the mainstream press now and then, most notably in a USA Today series inaugurated in September 1989, “The Terrible Tweens.” (Royal Caribbean seems to have been an early adopter, offering both “Teen” and “Kid/Tween” programs on their cruises by the end of the 1980’s.) It took a few years to mature, but the word was solidly established within ten years and has become widely recognized and understood.

The origin of the term appears uncomplicated. The resemblance to “teen” is obvious (it’s why we don’t call them “twixts”), and the reference to the time be”tween” young child and teenager is catchy. It was defined as “those between 8 and 12 years old” in the Washington Post (January 24, 1988), which is, I suspect, about how the term would be generally understood now. Maybe 9, maybe 13, but since tweenhood may be a state of mind that need not correspond with precise ages, we should expect a little fuzziness. Some definitions showed more variation in the beginning; for example, a report on McDonald’s advertising strategy (November 9, 1988) explored its practice of marketing to subgroups including “‘tweens’ (9-to-16 year olds),” while an article in Adweek less than six months earlier gave a range of “10-15.” U.S. News (April 1989) confidently gave “9 to 15.” You could get pretty much any endpoints you wanted, but the core of prepubescents and beginner pubescents remained constant. The traditional preference for 12 or 13 as the beginning of the teenage years seems to have reasserted itself, and there’s much less tendency to incorporate full-blown teenagers into tweendom nowadays. Sometimes the word was spelled with an initial apostrophe in the beginning; sometimes you saw “tweenage” or “tweenager.” It’s a good thing the variant didn’t catch on, or we would all be heartily sick of hearing about Justin Bieber, tweenage idol.

We may see this term simply as the product of the advertiser’s restless, relentless pursuit of the bottom dollar. Whenever defenseless spending money is discovered in a sub-group of the population, the sharks of commerce circle, seeking to engross a healthy chunk of it for themselves. Somebody found out that pre-teens — some of them, anyway — had a certain amount of money, so they had to be defined, categorized, converted to data, and appealed to. Just another demographic in an ever more precisely demarcated consumer universe. Pre-teens’ embrace of social media has lately given the youngsters a new kind of consumer power (and new ways to get into trouble).

The word soon elbowed its way into the parents’ lexicon, adding one more milepost to a track stretching from colic and the terrible twos to empty nests and fledglings returning to fill them. It’s one more group to worry about, one more place the wheels can come off the cart — according to a world view in which childhood and youth are recognized as a succession of traumas. If we hope to understand our children, we must learn about the special characteristics of tweens, their developmental stages and kinks, their symptoms and syndromes, and how not to ruin them utterly (hint: anything you say or do may doom them to a bitter, ineffectual adulthood). The same urge to dissect ever more finely, to understand ever more minutely, is at work among parents as it is among advertisers.

In 1988, Polaroid (Polaroid!) offered its Cool Cam to the youth market (PR Newswire, February 19, 1988), “designed especially for trendy ‘tweens'” (defined here as “the latest demographic label for the 9- to 14-year-old set”). The “tween,” understood as another subgroup of the youth population, was very new then. Nowadays cascades of carefully orchestrated opportunities to spend money confront tweens at every turn, including a fashion designer for tweens who is herself a tween (she promises “blood, sweat, and glitter”). They have money, they have Twitter, and they know how to use them. The rest of us had better stand back.


erectile dysfunction

(1990’s | doctorese? therapese? | “impotence”)

In 1998, Viagra was introduced. Around the same time, pharmaceutical companies sharply increased direct-to-consumer advertising of prescription drugs following changes in guidelines issued by the FDA. Anyone else think the fix was in? That was, to say the least, a windfall for Pfizer. Many different medications could expect to do well by attracting the attention of the actual users (not the prescribers, as in the good old days), but those commercials featuring Bob Dole as pitchman sent American men dashing to the doctor to ask about treatment for erectile dysfunction, which had not been a favorite topic among American men before. (The implied commentary on Dole’s performance as a presidential candidate went charitably unreported.) Viagra made impotence respectable.

“Erectile dysfunction” is another euphemism, intended to palliate the misery of the older word, to make us less ashamed, less trapped in a debilitating medical condition. That’s the promise of Viagra: it’s not in your mind, it’s just a little problem with the blood vessels. Pop a pill, wait an hour, and your troubles will be over. Researchers started to find in the 1980’s that impotence is most often caused by an underlying physical condition, not some deep, dark psychological problem or even old age; the success of Viagra and its cronies seems to bear that out. How has it changed us? Too early to tell, I guess. The prospect of eighty-year-old geezers bragging about their prowess does not appeal, and we won’t know until 2026, when the first baby boomers hit eighty. Previous generations had a certain delicacy about such things, but those days are long gone; even as euphemisms proliferate, the talk gets more frank.

There may have been published uses of the term “erectile dysfunction” before 1970, but it seems to have emerged during the following twenty-five years. It remained the property of doctors and therapists well into the 1990’s; most of us continued to struggle along in our benighted way with “impotence” or “can’t get it up.” One urologist, Dr. Fernando Borges, preferred the term because “impotence suggests powerlessness, weakness” (St. Petersburg Times, 1987). The phrase seems to be a short step from “sexual dysfunction,” a general term already in wide use in the 1970’s, and is almost certainly a descendant. I’m not sure whether a doctor or a therapist used it first (New York Magazine cited it as psychologists’ jargon in 1979), but they were the only ones who used it for a long time. In the last half of the 1990’s, the phrase became legion, and you heard it everywhere. Which still seems to be true. Especially on all those blankety-blank commercials.

“Erectile dysfunction” is commonly abbreviated “ED,” another innovation we probably owe to widespread advertising. Oddly, “ED” also stands for “eating disorder,” which also entered therapese in the 1970’s (the Clinic for Eating Disorders at the University of Cincinnati opened in 1974), although it went mainstream by the mid-1980’s — much quicker than “erectile dysfunction” — perhaps because there really was no earlier word for it. There is little danger of confusing the two meanings, since the number of people who suffer from both conditions must be the null set, or pretty darn close.


been there, done that

(1990’s | “(I’ve) been there before,” “that’s old hat,” “seen the elephant”)

Definitely a nineties expression. I heard it first early in the decade, and boy, did it have legs. It took over our aural landscape for a few years, then became slightly less omnipresent, although it has definitively entered the language and may leap from almost anyone’s lips. The ease with which it is alluded to, adapted, and parodied reflects its status as a more or less instant cliché — a phrase already slightly painful to hear and repeat after only a few years’ hard use, but so much a part of our vocabulary that we have no choice. (Here is a definition of “instant cliché.” Another one here. I wish the expression were original with me.)

There are a few cites before 1990, and an emerging on-line consensus holds that the idiom originated in Australia. Mountain Dew used the phrase in an ad campaign in 1994, which helped it take off, but it was around at least a few years before then. It has been used in several song titles — the earliest, as far as I know, in 1990 (John Cale and Brian Eno). It may come with numerous elaborations, the most common of which is probably the addition of “got the t-shirt.” A number of web sites and blogs now have variations on the phrase as part of a title or URL, notably a video game site for combat veterans and an organization devoted to helping prostitutes and victims of sex trafficking (we used to call them “white slaves,” although “white slavery” was the more common phrase). “BTDT” has settled into its inevitable role as texting shorthand.

The meaning is simple and hasn’t changed much: I’ve been through this and don’t want to do it again. Whether the tone is world-weary, disdainful, dismissive, or matter-of-fact, the ground meaning has stayed pretty stable. There are several more than reasonable treatments of this phrase on-line already. Safire gave it its very own column in 1996; Phrase Finder and Wisegeek also cover it very well.

Why did it catch on and take root so fast? We could talk about the compactness or simplicity of the phrase, or how it breezily blends sarcasm and dismissal, how flatly it puts you in your place. All those factors contribute, but the greatest driving force is our restless hankering after the new. Just as many people must have the latest gadgets, many people like to latch onto new expressions. You hear it on television, in a movie, from your neighbor: hey, there’s a new one! I like it! “Been there, done that” has the benefit of expressing a sentiment most of us feel from time to time. New AND useful is hard to beat, and such a phrase may spread incontinently. Even if its ear-numbing frequency tails off after a few years, it has bored into the language. New words and phrases catch on because enough people want them to, for better and worse: it’s a democratic process.
