Tag Archives: marketing
man cave
(2000’s | advertese | “den”)
The evidence strongly suggests that man-caves are the creation of marketers, despite visible traces of the expression before the mid-aughts, which is when it starts turning up in bulk in LexisNexis. The phrasing likely owes a debt to the author of “Men are from Mars, Women are from Venus” (1992), John Gray. While he did not, as far as I can tell, ever use “man cave” himself, he used the two words in close proximity, notably in the apothegms “Never go into a man’s cave or you will be burned by the dragon!” and “Much unnecessary conflict has resulted from a woman following a man into his cave.” In other words, let the old grouch suck his thumb and fiddle with his TV or his train set for a while. He’ll come out and make nice eventually. And if he doesn’t, it’ll be your fault. Gray’s biases aside, he was influential, and today’s more compact phrasing may claim him as an ancestor. Actually, the first use I found in LexisNexis is not due to Gray but to a Canadian columnist writing about house floorplans; she proposed that the basement be renamed “man cave,” because that is where men go to get away from their women. (She had in mind a damp, cobwebbed basement, not a home entertainment center. “Cave” is the French word for basement, so the use of “cave” is more intuitive in Canada than here.) Was author Joanne Lovering an early adopter or ahead of the curve? (Or ahead of the cave!)
But when “man cave” started showing up in quantity, it was purveyed by Maytag, of all corporations, which marketed a product called SkyBox, a vending machine for soda or beer that you could install right in your very own home. Fred Lowery, the director of Maytag’s “strategic initiatives group,” noted that “every guy would like to carve out his own little place in his home. Internally, we call it the man cave. And lots of guys, at some point, would like a vending machine in their man cave” (January 29, 2004). There you have it. Very soon, real estate agents began touting the things, sports promoters jumped on board, and it became a proper fad. No man cave was complete without a big-screen television and a sofa, with video game consoles and sports-related items also popular, and if not your very own vending machine, at least a dorm refrigerator, maybe even a full bar. What you won’t find is a workbench. The man’s retreat in my youth was likely to involve tools and at least the possibility of repair or construction. A few men still favor that, but these days it’s more about swilling beer while endless hours of sports unroll before your glazed eyes. Well, not really; what it’s really about is male bonding or just having a place to get away from your woman. The corresponding “woman cave” has not made much headway, a few sightings in the press notwithstanding, but all the ladies have to do is wait; sooner or later some savvy marketer will rake in huge sums convincing women they need their own gender-specific refuges.
“Cave” is an interesting word to use here; to my mind it calls up two different associations. First, of course, the caveman: brutal and self-reliant (actually, cavemen were much less self-reliant than we are). Primitive, crude, and therefore manly, the caveman lords it over his woman and slays giant beasts. Just what we all want to be, right? The second association of “cave” is with a dangerous, unpleasant place where no sensible woman would set foot to begin with: caves are dark and treacherous, lairs of wild animals, drifters, or lunatics. Of course, that’s what he wants you to think, ladies. He has a giant-screen TV in there — how dangerous can it be? Just don’t get burned.
Why has “man” become such a common prefix in compound nouns since the dawn of the new millennium? Nobody says “man about town” or “man alive!” any more, but you can’t get away from “man-hug,” “man-bun,” “man-boobs.” “Man cave” predates some of these, though “man-boobs” dates back to 2003, according to Urban Dictionary. Is it a simple matter of dumbing down, the word “male” having become too complicated for us cavemen? Is it a wistful attempt to recover a lost sense of masculinity by reverting to the simpler (and therefore more primitive) term? Is it an attempt to express solidarity? “Man-splaining” and “man-spreading” go the other way, of course, used by women in solidarity, not men.
wow factor
(1980’s | journalese (film)? advertese? enginese? | “appeal,” “oomph,” “oohs and ahs,” “brilliance”)
Inasmuch as “wow” and “factor” both have relatively long and complicated histories, perhaps we should begin there before considering their union. “Wow” appears to go back to a Scots interjection, which could be laudatory or derogatory, and our modern understanding of the word emerged even before the beginning of the twentieth century; by 1925 it was going strong as an interjection and available as a noun or verb. The interjection is far more common than the other two today and probably always has been. “Factor” is an even older word that early in the twentieth century meant “gene,” basically (allowing for evolution in our understanding of genetics); now it is defined much more generally as “element or constituent, esp. one which contributes to or influences a process or result” (OED), especially if it’s important and its action is not well understood. “Factor” preceded by another term to denote a particular substance or catalyst is quite common in medicine, “Rh factor” being a longstanding example. “Risk factor” no doubt started life as a medical term but now flourishes in other fields. “Factor” became popular in Hollywood during the seventies, when it followed “Delta,” “Neptune,” “love,” and “human” (twice) in film titles (they all had to do with science fiction or espionage). And, to complete the picture — or the confusion — “wow factor” was used occasionally among stereophiles before 1980 to talk about irregularities in playback speed of tape decks and turntables, as in the phrase “wow and flutter.” So it seems the stage was well set.
By the mid-1980’s, the phrase started turning up in writing about entertainment (particularly films and television), computer software, merchandise more generally, and even service industries like banking. One early exponent was marketer Ken Hakuda, who used “wow factor” in 1987 to talk about his success in selling toys which he freely admitted were not useful or valuable except as a source of mindless fun. He used the term to refer to a highly visible feature of a product or its packaging that makes a strong, immediate impression, causing shoppers to whip out their wallets. That quality of impressiveness constitutes a common denominator among objects blessed with the wow factor. I’m not willing to take a firm position on the origin of this particular meaning. If I had to guess, I would say Hollywood, but advertese seems like an equally logical breeding ground, and I can’t say it didn’t start there. Because the phrase goes frequently with technological advance (especially when you’re talking about cinematic special effects), it is possible to argue that its source is enginese. While two of the earliest citations found in LexisNexis are due to Steven Spielberg and Harrison Ford, the very first (1984) was in the title of Miss Manners’s column, of all places. Did she supply the headline, or do we owe it to a forever anonymous editor? By the mid-1990’s, the expression was no longer extraordinary and had shed quotation marks, exclamation points, capital letters, and such tricks of the trade.
If you looked even cursorily at the pre-1980 equivalents listed at the top of this entry, you may have surmised, correctly, that I struggled to find convincing synonyms from the old days. That is because we used to say the same thing with adjectives — e.g., dazzling, eye-catching, awe-inspiring, cool — or verb phrases: knock your socks off, set something apart, jump off the shelves. Many new expressions have ensconced familiar ideas in new parts of speech, which usually is a net gain for the language. More ways to say the same thing reduce monotony and open up room for small but significant variations in connotation. I’m inclined to consider the popularity of “wow factor” deserved. It’s short and to the point. And the ground meaning is quite clear, though it can imply two slightly different things, just as in the sixties, “wow” conveyed two different levels of excitement. One was the old gee-whillikers gob-smacked elation at seeing anything unexpected and pleasing. The other was quieter, more meditative, as in the pothead grokking the universe as he exhales. No squealing or jumping up and down, but the profound sense of something worthier than oneself that must be absorbed and appreciated with a drawn-out “wow.” “Wow factor” has always leaned more heavily in the direction of the former sense, but it can shade toward the latter sense as well, and seems to do so more often as time goes by. Not that the two meanings are all that far apart.
It has occurred to me to wonder if we should hear this expression with a whiff of the tawdry or meretricious. Given its early use and likely origins, it’s not hard at all for an old snob like myself to inflect it this way. But that would demand an ironic edge that I rarely or never hear when the phrase is used. A “wow factor” is a good thing that will impress the audience, sell the product, or make something stand out. The idea that there must be something cheap or slutty about it never seems to have taken root.
bells and whistles
(1980’s | advertese? | “additional features,” “doodads,” “frills”)
There’s plenty of on-line speculation about the origin of this expression. It is a puzzler because there are several possibilities, none of which has anything obvious to do with “bells and whistles” as used since 1970 or so: features of a product — a car, computer, camera, etc. — not needed to make it work but which add power, capability, or luxury, and cost. In today’s language, it doesn’t even have to be tangible; a web site, business plan, or legislation might have bells and whistles. What I saw in Google Books makes me think that the main conduit into everyday language was computerese, although some on-line authorities say car dealers used it first. Generally bells and whistles are thought to be a good thing, but there is a persistent undercurrent dogging the phrase. Sometimes bells and whistles are considered distracting, superfluous, or excuses to drive prices up without delivering better performance. An investment manager “aims to provide simple, yet solid guidelines that work for any investment plan in the long run, devoid of any quick fixes or bells and whistles.” A writer deplores over-elaborate restaurant desserts: “There were tuiles, there were chocolate towers, there were flowers made of spun sugar. Good luck finding the actual dessert amid all the bells and whistles.”
All right, you ask, why use “bells and whistles,” devices not normally associated with ease or comfort, to refer to such things? World Wide Words, WiseGeek, and Phrase Finder have all taken a swing at this, and the consensus seems to be that it has to do with fairground or movie-house organs, which incorporated many sound effects, including bells and whistles, the better to hold the crowd’s attention. Or it may derive from model railroading, as in “This train set is so true to life it has all the bells and whistles.” There are other possibilities as well — factory time signals, parties and celebrations, buoys, alarm systems — but they seem less plausible. The fact is, no one knows for sure how this popular expression crept, seeped, or slithered into our vocabulary between 1960 and 1990. That’s not a dig or swipe; that’s a respectful acknowledgment of the mysteries of language.
Actually, I found one reference as far back as 1977 to a writer who explained the origin of the phrase, in a short-lived magazine called “ROM.” Unfortunately, Google Books’ snippet view, which I have complained about before, didn’t show me the answer, only the set-up. The nearest library that has copies of this periodical is in Rochester. But if any of my faithful readers can track this one down, I will award you a free subscription without hesitation. I’ll bet the proposed derivation has something to do with circus organs or locomotives. But even if it doesn’t, it won’t be definitive.
What “bells and whistles” ought to denote is “means of getting your attention.” It’s not that they’re cool or make your machine better, it’s that they make you sit up and take notice, like a loud noise going off in your ear. It should be what marketers do, not what designers and engineers do (arguably, the engineers create the features and the marketers turn them into bells and whistles). That’s why I suggest that we thank advertese for pushing a new, improved meaning for this expression into the forefront.
data harvest
(1990’s | enginese | “collection”)
There is a little complex of phrases here: “harvest [v.] data,” “harvest [n.] of data,” and “data harvest [n.].” None was in common use before 1980; LexisNexis and Google Books both suggest that they hadn’t made much headway as late as 1990. “Harvest [n.] data” originally referred to quantities of crops reaped or game hunted, and often still does. The first citations in today’s sense, which appeared sporadically in the eighties, mostly seemed to come from the space program, often as “harvest of data” from a telescope or spacecraft. The implication was abundance; when scientists uttered it, they were usually boasting about the capabilities, or hoped-for capabilities, of a new piece of equipment that was going to provide us with all kinds of new observations. That’s positively innocent when set alongside the more sinister sense the phrase has acquired in the internet age.
Somewhere in the mid-1990’s, computer industry executives began talking about harvesting data about what people were doing on-line, which was simply an expansion of a longstanding practice — market research — into new fields. That was when the term came to mean corporate, computer-driven aggregation and storage of personal information, which we now take for granted. I did encounter one anomalous use in a Washington Post article in 1994 about internet access service offered by the state of Maryland that permitted the user to “harvest data” about the state. That heartening notion of empowered consumers using the web to collect information has not persisted, and now we think of puny proletarians plucked clean of every potentially pertinent preference, practice, or pattern, permanently pinned in the pitiless panopticon produced by predatory purveyors.
“Data harvest” reminds me of “organ harvest,” also a relatively new expression with unsavory implications. In both cases, the purposes are legitimate, perhaps even commendable, but the way they are carried out leaves a bad taste. The connotations of “harvest” are changing from comforting and wholesome to devious and greedy. For thousands of years, a successful harvest was cause for thanksgiving, a time to rejoice and look ahead to better days. Even a poor harvest marked the end of an annual cycle and might spark hope for the future, in the manner of Dodgers’ fans crying “Wait till next year!” But now the harvest feeds only a select few; most of us sow but do not reap.
upsell
(2000’s? | businese | “sell,” “unload more product”)
Confession: As far as I can remember, I never heard this word before last Sunday morning, on the radio, in a talk given by Angie Hicks of Angie’s List, warning of businesspeople (landscapers, in this case) trying to sell you more services than you want. This points to the crucial question about upselling: Is it a favor to customers, offering what they want, only more and better, or is it second cousin to a scam — a way to boost profits on the backs of unwary or overly obliging consumers? Depends on whom you ask. The word was and still is most commonly used among salesmen and marketers, and they take pains to treat upselling as a great boon to the consumer, merely an unselfish attempt to alert us to the advantages of buying more (and, coincidentally, spending more). A word to the wise, etc. Consumer advocates take a much dimmer view of upselling, regarding it as a pushy or sneaky way to extract more money; many consumers consider it a turn-off.
The term means two different things in common use. One is getting a client to buy a more expensive kind of whatever he’s buying, like a fancier bottle of wine (or, failing that, a case of the cheap stuff) or the next car model up the scale. The other thing it means is convincing the customer to buy things that go with whatever he happens to be buying, like French fries with a hamburger or extra batteries with a watch. The latter is also called “cross-selling,” and while some people insist on the difference, most regard “Do you want fries with that?” as a fundamental example of upselling. It is close to “upgrade” but not quite the same; as a noun, it grazes in the same paddock as “pitch.”
The word has never been all that precise grammatically. It is used indiscriminately as a noun or verb; as a verb, it may be transitive or intransitive. When it takes an object, the object may be a product or a person. And it can take on prepositions: “Upsell to” means basically the same thing as “upsell,” but it has a different significance from “sell to,” because when you “upsell to,” the object of the preposition will be not the customer, but the higher-end product. As with a lot of modern words, those who use it prefer to elide, nay, jettison tiresome grammatical distinctions. For the legal eagles among you, the Code of Federal Regulations defines the term in the context of telephone sales (2004 revision): “soliciting the purchase of goods or services following an initial transaction during a single telephone call. The upsell is a separate telemarketing transaction, not a continuation of the initial transaction.” And it can even do spot duty as an adjective, as in “upsell items” (items you can pitch to the customer as an upgrade).
How modern is it? LexisNexis shows that it wasn’t used in the mainstream press before 1990, although it did turn up in magazines with “marketing” or “advertising” in the title. Sales lingo it was and ever shall be, but it seems to be growing more common every decade, and I found cites in major U.S. newspapers (with explanations) by the mid-1990’s. Oddly enough, the first uses on LexisNexis date from the late 1970’s, in the Canadian press.
I’m not sure what the old word for this was, maybe because “upselling” is what we used to call “selling.” Salesmen have always tried to get you to buy more or better, or to unload items with a higher profit margin. It was what salesmen did, and we didn’t need a special word for it. But the science (ulp!) of marketing demands its own vocabulary, its own fine (or blurred) distinctions, and it shall have them.
beta version
(1990’s | computerese | “dry run,” “preliminary version,” “demo”)
Ineligible for the blog on both historical and semantic grounds, but it seems to be creeping into contexts other than testing computer software. Even now, the association with computers remains strong if not quite inevitable: a stage of software development in which the product is not ready for sale but is ready to be tested on a large scale by real live users, not just the developer’s own engineers. (The earlier stage, testing by the developer’s own engineers, is known as “alpha testing” when it is referred to at all.) The cynical way to look at this is that the developer gets lots of work done for nothing, although most will offer discounts to beta testers on the finished product. The phrase required some explaining at first, but now everyone knows what it means: On April Fool’s Day, Google invited users to test the beta version of Google Nose, which enables users to search by odor (Google, of course, is famous for keeping some of its products in the beta stage for years).
According to Wikipedia and at least one other source, “alpha” and “beta” in this sense were broken in by IBM computer jocks long ago, well before the personal computer was thought of. By the late eighties, if not before, it was widely used in talking about both hardware and software, but it didn’t turn up much in the mainstream until some time in the nineties. (I believe that’s when I became familiar with the concept.) Beta versions are for the expert or adventurous, or for employees of large corporations who act as guinea pigs. Ordinary shlubs like me avoid beta versions; let somebody else step on all the landmines. (Hey, it’s not like most software doesn’t have plenty of bugs and glitches even after rollout.)
One descendant, “beta reader,” has cropped up recently in the realm of fan fiction, where it means something like “editor.” Here again, it refers to a reader at a certain stage of the development of the text, not in its roughest form but not finished, either; the idea is that the beta reader will help the author improve the story and get it ready for publication, posting, or whatever comes next. In this sense it obviously derives from the old computer-industry use but may point the way to a new set of meanings. Watch this space.
early adopter
(1990’s | academese? advertese? | “pioneer,” “early bird”)
The interesting thing about “early adopter” is that its meaning has never varied in the slightest, and while its range of use has broadened a bit, neither its denotation nor connotation has changed to speak of. Someone who snaps up a new practice or product as it becomes available — someone interested in the latest technologies, brands, or services. From the beginning, the expression has had a strong connection with technological advance, and it still does, although nowadays it may freely be used in talking about customs, rules, or attitudes. That was not true in the 1980’s.
The earliest uses recorded in LexisNexis date from the early 1980’s, concentrated in the banking press. It was not long before “early adopter” was taken up in computer circles, and the term quickly became common in talking about (i.e., promoting) new personal computers, network technology, operating systems, etc. The term likely was coined by the economist Everett Rogers, who invented a field by publishing a book called “Diffusion of Innovations” in 1962, in which he classifies people according to how quickly they adopt new things. One of the classes was “early adopters,” who aren’t the very first to pick up the latest thing (those are the “innovators”) but who come right after and presage wide consumption or use. Most of us are content to follow our own English Pope:
Be not the first by whom the new are tried,
Nor yet the last to lay the old aside.
Marketers were probably the first to use the term regularly, and it was rarely seen outside the business or computer press until at least the mid-1990’s; it was rendered in quotation marks as late as 1999 in US News and World Report. But that wasn’t really typical. Mostly the phrase is used without any particular notice or explanation, and that has been true for a long time. (Rogers dubbed the last to lay the old aside “laggards” — those who take up innovations slowly (not until they are already obsolete) or not at all. I’m a laggard.)
The phrase has long had a strong correlation with favorable terms like “forward-thinking” or “progressive.” An early adopter typically is not seen as an uncritical, superficial customer who will walk out with anything that is sold as the dernier cri, but as a discerning shopper who is quick to see the advantages of the latest technology. Early adopters are usually thought to be knowledgeable and well-off — people you want to know and emulate. There’s no reason for this that I can see except that the people who use the phrase are also the people who have a strong interest in inducing early adopters to buy whatever it is they happen to be selling. So they need to flatter the adventurous ones willing to endure bugs and kinks, because success with that group portends general success. You don’t go describing your client base as gullible, hysterical, or lacking wisdom. With that goes a tendency to denigrate the laggards as stuck in the mud, out of the loop, and selfishly standing in the way of progress. So all those of us who didn’t spend money on Beta videocassettes, New Coke, the DeLorean, or WebTV are losers. Time to repent. Go thou forth and bankrupt thyself on every crummy two-bit novelty that comes down the pike.