spoiler alert
(2000’s | internese? journalese? | “stop reading now,” “I don’t want to give away the ending, but . . .”)
The earliest use I found in LexisNexis, and it’s a toothy one, dates from 1994 in the Washington Post. The reporter noted, “on movie buffs’ discussion lists, for instance, there is wide use of the term ‘spoiler alert,’ which is a warning inserted before any comment that would give away a film’s ending.” Early internet slang, that would make it. The phrase wasn’t fully part of the language until 2000 or so, I would say. 2003 is the year it blooms on LexisNexis. That’s when it started popping up regularly among squares, though even at that late date, quotation marks and glosses were not unusual.
“Spoiler alert” seems to have ushered in the use of “spoiler” to mean that which divulges a significant or startling plot point. No less an authority than American Heritage first recorded it in the fourth edition (2000): “The third print edition of the American Heritage Dictionary, published in 1992, gives four common definitions for ‘spoiler.’ The fourth and latest edition, which came out in 2000, adds this notable fifth definition: ‘a post to a newsgroup that divulges information intended to be a surprise, such as a plot twist in a movie’” (Milwaukee Journal-Sentinel, November 5, 2003). I learned the word in childhood in the context of late-season baseball games in which one team was in playoff contention and the other was not. If the latter defeated the former and kept it out of the playoffs, it was a spoiler. This definition applies as well in politics and other varieties of competitive sport.
I’m glad to say that “spoiler alert” has always had solid ironic potential, regularly used to signal a particularly predictable or hackneyed development as well as the genuinely surprising. The phrase got a big boost from the rise of home video recording, followed by TiVo and the streaming of previously broadcast television shows. You saw the show last night, but your water-cooler buddy didn’t — so you may have to clam up. Not like the good old days, when everybody watched it at the same time.
Another, even newer, word “spoiler alert” reminds me of is “reveal,” now hearable as a noun. The reveal is what the spoiler alert (I typed “spoilert” just then — why not?) warns you away from. It’s nothing but a sloppy way to say “revelation,” as far as I can tell. What the hell. Verbs become nouns; nouns verb themselves. It all slops around in a grammatical slumgullion, and our once-proud linguistic distinctions and differentiators disappear at an ever greater rate. In a century, we’ll have about a thousand words left coupled with a rich repertoire of grunts, that being all that is left of our once-proud language. Of those thousand words, I’d guess fifty will have come into existence after 2014.
flatline
(1990’s | doctorese | “give out,” “die,” “flatten out”)
Perhaps I should list this in its nominal form, since “flatliner” was the first to burst into general consciousness thanks to Joel Schumacher’s 1990 film, a title that introduced the concept to most of us. Nevertheless, “flatline” (verb) turned up as early as 1980 in Safire’s language column (I have only LexisNexis’s word that it was there; I didn’t find it searching the Times archives), not to mention a year earlier in Stephen King’s The Dead Zone (in the phrase “gone flatline,” either noun or adjective). But the word lay largely dormant for another ten years before showing signs of life. It didn’t vault into ubiquity as some words do, but by 1990, we had all seen enough hospital shows to grasp the idea readily and fold it into our vocabulary, I guess. There’s no doubt it’s a compelling image. A line punctuated by little jagged peaks goes horizontal as the heart and wind fail. Time for emergency treatment, or last rites. A couple of frantic first responders always add drama to a scene, of course.
I shouldn’t give short shrift to the question of part of speech. “Flatline” is common as both a verb and an adjective, slightly less so as a noun. You may even see it used transitively as a verb, as in this snippet from an article on Billy Squier: “the music video ["Rock Me Tonite"] that would flatline his career” (New York Post, November 17, 2013). I think of it as one word; the two-word and hyphenated versions, common twenty years ago, have all but disappeared. The term has an ambiguity built in. In the emergency room, it is the lowest point, beyond which medicine can do no more for you (except try to restore the jagged peaks). But when profits flatline, that just means you aren’t making money this quarter, not necessarily that you’re defunct. The word has lost some urgency as it has worked its way into common use. Now it’s associated particularly with financial discussions and sportswriting. The expression has taken on a few metaphorical possibilities but has never strayed far from its medical roots, always conveying passivity, stasis, inanition. If not death, then an absence of movement and progress. In color terms, monotone. In emotional terms, unresponsive. In physical terms, what effortless might mean but doesn’t.
The term has other meanings, including one that appears also to date from after 1980: in bureaucratese, a “flatline budget” calls for no spending increases (unless mandated by law). Chronologically, I believe this usage comes after the medical one, so it was probably a descendant of some kind. If you view a government budget as a living thing, always straining to grow and gobble up more and more, then the idea of eliminating all discretionary spending is roughly analogous to a cessation of vital signs, although you can’t get rid of a government that easily, as any libertarian will tell you. Sports scribes use “flatliner” to refer to an undemonstrative athlete, the kind of guy who has ice water in his veins, as we used to say. Just go out and play the game; never mind the shouting and chest-beating. Urban Dictionary defines the term more harshly: “Someone severely lacking in personality; a person who kills any festive mood with their utter dullness.”
24/7
(1990’s | teenagese (African-American)? | “around the clock,” “always on or up,” “constantly,” “non-stop”)
I had assumed that this expression came out of stodgy corporatese, shortly after (perhaps even before) the ubiquity of ATM’s and all-night shopping or possibly the tech support call center, but now I think it’s more likely that it arose in African-American youth culture, especially rap. The earliest instance I found in LexisNexis, from 1993, came up in a glossary of rap terms printed in the Straits Times of Singapore. I didn’t do an exhaustive search, and it may have appeared earlier, but African-Americans do seem to have been early adopters. Actually, the first allusion may date from 1986: an all-black band named 24-7 Spyz. They were “known for mixing soul, funk, reggae, and R&B with heavy metal and hardcore punk” (Wikipedia), so they weren’t rappers. It’s not clear to me that “24-7” meant the same thing in the band name as it does now, but if it did it was ahead of its time. Hardly anyone uses the hyphen any more; the virgule has become standard, as if it were a fraction, but it isn’t. (I was bedeviled as a child by this brain-teaser: Where do you commonly see the fraction 24/31? The answer? On a calendar.) Fact is, the hyphen makes more sense, but history will not be denied.
Before our use of the expression crept into the language in the early nineties, you found numerous examples of this sequence of numbers, particularly in football scores and stock quotes, representing America’s favorite spectator sports. That made it more difficult than usual to figure out when this term really began to appear. By 2000, “24/7” was widely understood in hipper circles, and it didn’t generally require a gloss by then. Whether out of African-American culture or not, this is a phrase that bubbled up from below, definitely not forced down our throats by corporate headquarters or celebrity central. Early uses of the expression were typically ordinary people talking about their lives, not executives bragging about all-night grocery stores. Now the expression may be used metaphorically to indicate something closer to “full-time” than “available at any moment, day or night, including holidays” (as in “24/7/365”). Like many new terms, it has become less rigorous over time. Exclusively an adverb at first, it has drifted so that now it can serve almost as easily as an adjective.
There’s no doubt that commercial forces have latched onto “24/7,” which sounds like a descendant of 7-Eleven (the name dates back to 1946). America’s favorite convenience store was named for its sixteen-hour day, and the first 7-Eleven stayed open all night in 1963. Before then, the only places open all night were hospitals, cheap restaurants, and a few factories that employed multiple shifts. The idea that you should be able to order a hamburger or buy milk at any hour barely existed outside of major cities. You closed the store at a decent hour and went home to your family. Nobody worked on national holidays. And hardly anyone was expected to be reachable at any time. The pager and the cell phone made us subject to summonses from the office at all hours; the funny thing was, hardly anyone seemed to mind.
It’s tempting for everyone on the political spectrum to see such changes as due to declines in some moral value or another, but I’m more inclined to blame this one on the curse of capitalism. In its purest form, the curse of capitalism says, “If one guy works harder, everyone has to work harder. If one guy stays open late, everyone has to stay open late.” Etc. We always look at competition from the point of view of the consumer — and to be sure, competition benefits customers, at least up to a point (having too many choices becomes confusing and onerous). But from the other point of view, competition places every producer at the mercy of every other. Thousands of eyes on the main chance, endlessly scheming, out to make a buck and the rest of ’em be damned. Every time one person or firm comes up with a profitable innovation — of any kind — everyone else has to match it, if not surpass it (this is particularly true if stockholders are involved). The exception would be an innovation that reduces the expenditure of time or capital, but even a true labor-saving device just opens up more time for work of other kinds. It doesn’t matter who first had the idea to staff a 24-hour hotline to help you fix your computer. If you want to start or stay in business, you have to offer it now.
raise the bar
(1990’s | businese (faux-athletese) | “raise standards,” “one-up,” “outdo (oneself),” “make progress”)
I was unsure whether this expression would qualify as post-1980, but its emergence traces a distinct path across the last few decades: first unmistakable sighting in LexisNexis, 1985 (Governor Thomas Kean of New Jersey), gradual rise for the next ten years, then boom! That path is not in the least atypical. Picked up first by politicians and executives, it got a boost around 1990 from the newly influential personal computer industry. By 2000, it had raised the roof. I’m not sure when I first encountered it, but I remember using it as if old hat shortly after 2000. Before 1995, the phrase was normally followed by a few words of amplification. So an executive might bloviate about raising the bar of excellence. Or a manager might blather on about raising the bar for (or “of,” or “on”) customer service, for example. But within a few years it could be uttered as easily by itself. Like most new expressions, it didn’t settle right away into its most compact form. It sounds much more natural than it did in 1995 to hear it without appendages, but it still takes them readily.
It’s suspiciously obvious, but I’m still inclined to think the expression descends from track and field, as in the high jump or pole vault. In competition, athletes raise the bar to test themselves and make it harder for everyone else; “raising the bar” means outstripping the competition, and that is its general implication today when used as a set phrase. It can be used neutrally to mean “improve one’s performance,” but it’s far more likely to come at someone else’s expense. Even when educators talk about improving test scores (or, heaven forbid, learning) and they urge students and teachers to raise the bar — meaning everyone should do better rather than some getting ahead by pushing others down — the competition is students in other countries. (For reasons unclear to me, educators are oddly fond of this phrase. One expects businessmen and public officials to resort early and often to athletic expressions, but not the educational profession.)
The phrase also has a legal meaning, something like “activate or invoke a prohibition,” as in “raise the bar of estoppel” (don’t ask me to explain what that means). In order to take the Fifth, you have to demonstrate that you have likely engaged in some kind of criminal behavior; you can’t just pretend that any old embarrassing answer is incriminating. But once you’ve demonstrated that, you can “raise the bar” against testifying against yourself. Fascinating, but probably not the origin of today’s expression. Another sport, weightlifting, yields a more likely influence: “raise the bar” just means “lift the barbell.” The higher you raise the bar, the better your score. So that could have been a factor in the genesis of the phrase.
From recent headlines: Amazon.com asks certain employees to act as “bar raisers,” who help screen job candidates and work with management to determine who should be hired. “Bar raiser” turns up sporadically in LexisNexis, with Amazon providing a recent boost. It may start cropping up more often, in which case we’ll hear “that’s a real bar-raiser” about as often as “that really raises the bar.” Amish people in the audience will be excused for being confused. A campaign to pressure Hershey to use fair-trade chocolate is called “Raise the Bar.” Lots of organizations use the name, actually. “Raise the bar” became natural very quickly, back there in the nineties, and we’ve adopted it with little fuss or hand-wringing, even those of us who notice new expressions as a matter of crabbed habit.
wannabe
(1980’s | journalese | “fanatic,” “admirer,” “follower”; “would-be,” “aspiring,” “obsessive”)
To the best of my recollection, 1986 was kind of a crummy year. Nonetheless, it is the year “wannabe” stormed into the language. Before that, there were very few references in LexisNexis or Google Books. Merriam-Webster Online gives the first citation as 1981, but the first example I found came from New York Magazine (July 26, 1976): A story about fledgling gangsters described one as a “Jimmy Cagney wannabe.” The first occurrence in LexisNexis dates from 1981, an article in Newsweek, which yielded a lovely example from surfer culture. All hell broke loose in 1986, thanks to Madonna’s teen fans; Spike Lee’s 1988 film School Daze, which pitted African-American assimilationists (the Wannabees) against African-American traditionalists (the Jigaboos), kept the word rolling. After 1986, it caught on quickly and never looked back. In the eighties, the spelling “wannabee” was roughly as common as “wannabe”; by now the simpler spelling has prevailed. It was still possible then to use the word as a slangy verb phrase, and an adjective form was already available (but see below). The verb phrase has disappeared definitively.
There seem to be two subcultures in which this word bubbled up first. One was surfers, the other Native Americans. Both used it scornfully to name people who yearned to be part of the group but were incapable for some reason (it even sounded vaguely like the name of a tribe, so “Wannabe Indians” was a natural). There may be some cross-pollination between those two groups; I don’t know. Maybe neither was the original source, that is, maybe it first arose somewhere else or maybe it’s impossible to establish that it came into regular use first here or there. But what pre-1986 history there was seems to have centered in those two camps. My money is on the surfers.
And along came 1986, the watershed year. Not only did “wannabe” start to pop up everywhere, it acquired another meaning — there was always an instability built into the Madonna/Spike Lee dichotomy. “Madonna wannabe” simply denoted a person who went to a lot of trouble to pretend to be her. Dressing like Madonna, wearing the same makeup and accessories, and massing at her public appearances and screaming (later the term was applied to pop artists who modeled their act or career on Madonna’s, says Wikipedia). There was some question about whether they wanted to be Madonna or merely wanted to be like Madonna, but either way, they wanted to be someone different. Teenage girls screamed over the Beatles, too, but they didn’t want to be Paul or Ringo. They weren’t wannabes — more like worshipers. But in the eighties, worship took on the added dimension of copying — in effect, going out over and over again in the same Hallowe’en costume. The wannabe, like the stalker, expresses an obsession with an unattainable person, and it’s no accident the two concepts burst into the national consciousness within a few years of each other.
In Spike Lee’s terms, the word was about wanting to be someTHING other than you were. Wannabes aspired to be white, to join the ruling race on its own terms and give up at least part of their black identity. You still have to look the part, but it’s not a matter of focusing on a particular person. It’s about gaining acceptance within a group, as in the cases of surfers and Native Americans noted above. In this sense, it reminds me of the older African-American word, “striver,” although that word carried more respect than “wannabe.” This sense persists, used either on its own or appended to a job, status, or some other category of felicitous human existence. Both senses share the implication of falling short, failing to measure up. The word always had a tinge of contempt; a wannabe was in some measure pathetic, unable to do what it takes to become an initiate but unable to give up. No matter how ardent your devotion, you were never going to become Madonna; no matter how hard you ached to ride the waves, you just weren’t going to fit in. Now and then you will see the word used neutrally or even as a compliment, but a contemptuous tone usually is in there somewhere.
Grammar question: is “wannabe” ever really an adjective? It comes just as easily before a noun as after, at least nowadays, but is a wannabe Madonna the same as a Madonna wannabe? Maybe it’s always a noun, but sometimes it comes before the adjective. Maybe the substantive thing in the phrase is always “wannabe” (because the person doing the wanting should properly be regarded as the subject), never Madonna, or whoever (the object). My best guess is it’s a compound noun.
artisanal
(late 1990’s | journalese | “handcrafted,” “small-scale,” “designer”)
In terms of straight denotation, this word means pretty much what it always has, not that it’s been around very long. The OED dates “artisan” to the sixteenth century, but the first citation of “artisanal” comes from 1939. (The first citation for “artisanal” as we use it today doesn’t appear until 1983.) The main change lies not in the meaning so much as what the term is applied to. The older meaning (again quoting the OED): “Of, relating to, or characteristic of an artisan or skilled craftsperson; involving or utilizing traditional, small-scale, or non-mechanized methods or techniques.” Essentially the adjective form of “artisan.” The key difference now is that it applies mostly to products. Actually, that’s very nineties; now the word may apply to the place where the product is made (artisanal bakery vs. artisanal bread) or even raw material, such as artisanal wheat. By and large, though, we expect to see the word modifying a variety of food or drink that requires some processing and preparation. That was not particularly true in 1980. To illustrate the shift, we turn to the trusty New York Times. An article on handmade chocolates (December 17, 1980) did not refer to “artisanal chocolate” but a “painstaking, artisanal tradition.” By 2000 “artisanal” was commonly applied to chocolate, cheese, bread, and wine. Here’s a partial list of what LexisNexis fished up between December 1, 2013 and February 1, 2014: pizza, toast, jeans, ice cream, jewelry, bacon, popsicles, doughnuts, porridge, pasta chips, cigars. (Artisanal toast comes with “smallbatch” almond butter. How common will that word be in ten years?) The Economist headed an ecstatic article about Etsy “Artisanal capitalism.”
Naturally, now that the word has become risibly common and artisanalism has become “our national consumer religion,” as Details magazine put it in 2010, a backlash has begun. Writers regularly bewail the fact that its meaning has stretched beyond any reasonable bound — citing ad campaigns for Dunkin’ Donuts, Tostitos, or Progresso — or that it is grossly overused. Come to think of it, those two phenomena go together. A word formerly used only by the starry-eyed has become an easier and easier target of scorn. But the carpers are not going to get their way. Like it or not, “artisanal” has carved out its own niche, in the same wall as “organic,” “sustainable,” or “fair-trade.” Its surge of popularity coincides with the mania for the pursuit of exotic, unheralded, or ethically gratifying food sources, partly as a matter of responsibility to one’s fellow human beings and partly as a matter of proving that one is better than everyone else.
“Artisanal” has always promised small-scale production of handmade or partly handmade items, distinctive (even exquisite) and exclusive. It has always been the opposite of “industrial” or “mass-produced,” even in a term like “artisanal mining,” which has nothing to do with technique, only with the scale of the operation. As the term has developed since 1980, two things have happened. One is the shift, noted above, toward using “artisanal” to modify comestibles. The association is not exclusive, but it remains strong, even as current trends suggest that the term will glom onto more and more different kinds of nouns as time goes on. The other change has more to do with politics and society: the yoking of artisanal production with various left-wing watchwords born, or promoted, as a reaction to industrial farming and food preparation. The shift has changed the connotation of the word to some extent by mitigating the taint of luxury and elitism. Consuming artisanal products now is no longer decadent but virtuous, a way to help save the planet. (There is a fly built into this ointment: When too many consumers revel in the self-satisfaction wrought by the same artisanal product, it cannot remain artisanal.) As our economy evolved through mercantilism and industrialism, two complementary trends emerged and fed off each other like yin and yang: first you turn luxuries into necessities (sugar, coffee, tobacco, television sets), then once everyone can afford them, you turn them back into luxuries. Artisanal — that is, unique and expensive — goods help keep that cycle going by attracting both wealthy left-wingers, who used to disapprove of luxury, and wealthy right-wingers, happy to have more ways to flaunt their money. (I owe most of the foregoing analysis, as well as the nomination of “artisanal” itself, to lovely Liz from Queens.)
The artisan is a craftsman, of course. Nowadays we think of “art” as creativity and inspiration and craftsmanship as the technical skill to carry it out. “Art” hasn’t always been used that way — anybody take industrial arts in high school? — as its survival in words like “artisan” reminds us.