(1990’s | journalese (politics) | “dirty trick”)
Push polls originated in political campaigns. They are presented as impartial, conducted by an organization not technically connected with a candidate (although if a respondent pushes back hard enough, the caller may be forced to reveal his true employer). The point is not neutral assessment of public opinion, but rather influencing it directly, often by making a dubious, if not outright false, imputation about an opponent — sometimes the mechanism relies on laudatory comments about oneself, but the intent is the same. It usually takes the form, “If you knew so-and-so about Candidate X, would it make you more or less likely to vote for her?” The phrase established itself in the mid-1990’s; the first hit in LexisNexis comes from David Broder in October 1994. By the 1996 election, the expression had common currency in political reporting, and many commentators no longer bothered to define it, which had been the rule only a few months earlier.
There’s nothing new about libeling political opponents, but the problem is the means. To be effective over the long term, opinion polls have to be fair, designed to avoid favoring one group over another. A survey has the presumption of fairness; therefore, it’s worse to use a poll to perpetuate slander than to use other means. The main point, as stated by Matthew Reichbach in the New Mexico Independent (September 22, 2009): “a push poll is not a poll at all. It’s a fraud, an attempt to disseminate information under the guise of a legitimate survey.” It is a fraud in that it presents itself in a misleading way, but the “information” conveyed may be fraudulent, too. Reputable pollsters hate them and are forever calling for an end to push polls.
In 1996, the derivation of “push poll” was generally explained as a simple elaboration on the act of pushing voters away from a particular candidate. That’s folk etymology, but there is probably some underlying truth. The term comes out of pollsters’ jargon, by evolution or corruption. “Push poll” is actually a descendant of “push question,” described in 1982 by William Safire as “designed to squeeze respondents to come up with the answer the sponsors want.” It’s a variety of loaded question native to the survey business, and not necessarily unethical, although it does lend itself to unethical use.
A push poll may contain more than one push question, heaven knows. In 2001, I was on the receiving end of a Bloomberg-for-mayor push poll which consisted almost entirely of pro-Bloomberg statements masquerading as questions. I finally said something like “Aw, c’mon,” and the (apparently young and definitely inexperienced) questioner agreed that the bias was pretty obvious. I kept going, answering each question gamely in the most anti-Bloomberg manner I could muster, but the whole thing was a farce. Did Bloomberg pay for it? Who knows? Yes or no, the whole process showed nothing but contempt for our intelligence.
And now, the scope of push polls has broadened, and you don’t have to be a politician any more. Anyone trying to influence public opinion — a corporation (Walmart seeking to open a store in Chicago), a social movement (a group trying to promote, or scuttle, gun-control legislation), etc. — can initiate them. I’m pleased to note that the phrase still carries strong opprobrium, and it is thrown around in grim accusation or indignant denial, never in approbation. It may be true that one man’s push poll is another man’s opposition research, and political professionals once defended push questions as legitimate, if they raised a verifiable point about the rival candidate. No one defends them any more, at least not in public.
(2000’s | journalese)
I’m not real up on popular culture, despite rubbing elbows with teenagers with modest regularity, so I will go ahead and explain this term for those who share my plight. “Bucket list” was popularized by a movie of the same title released in 2007. It refers to the goals you aim to achieve before you die — that is, kick the bucket. (“Kick the bucket” goes back to the eighteenth century, and its origins remain uncertain.) The film received plenty of publicity, starring as it did Morgan Freeman and Jack Nicholson, and the phrase soon took root in the popular lexicon. According to one history of the term, it was first used in today’s sense in 2004, but the citation isn’t entirely convincing.
Not that the phrase was invented then; it had a previous life in engineering and computer science. My father, an electrical engineer, remembers using the phrase to denote instructions for placing integrated circuits on circuit boards to build this or that device. In computer science, a bucket is a storage place or buffer for disparate pieces of data that have a common feature important enough to warrant grouping them together. When you record the names and locations of the different buckets, presto! a bucket list. The term was also used by archeologists to mean an inventory of artifacts removed from a dig (also “basket list”), a surprisingly intuitive usage. “Bucket list” could also refer to a group of items set aside in a negotiation, for example — points to be considered later, or in a different phase of the discussion. None of these definitions was ordinary or widely understood. Now they never will be. The first three were strictly technical terms, and the fourth never took hold. Because it was unfamiliar to most of us, when the hype for the movie ramped up in 2006, reporters felt compelled to explain the phrase and its derivation. When the movie was released at Christmastime in 2007, reviewers followed suit. By 2008, most of us knew what it meant and why.
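The computer-science sense described above — data items binned by a shared feature, with a record of what each bucket holds — can be sketched in a few lines of Python. This is only an illustration of the general idea; the hashing scheme and all names here are invented for the example, not drawn from any particular system.

```python
# Sketch of "buckets" in the computing sense: items that share a feature
# (here, the same hash value modulo the bucket count) are stored together,
# and the "bucket list" records the contents of every bucket.

NUM_BUCKETS = 4

def bucket_index(key: str) -> int:
    # Toy hash: keys whose character-code sums agree modulo NUM_BUCKETS
    # land in the same bucket.
    return sum(ord(c) for c in key) % NUM_BUCKETS

def build_bucket_list(keys):
    buckets = [[] for _ in range(NUM_BUCKETS)]
    for k in keys:
        buckets[bucket_index(k)].append(k)
    # The "bucket list": each bucket's index paired with its contents.
    return list(enumerate(buckets))

for i, contents in build_bucket_list(["alpha", "beta", "gamma", "delta"]):
    print(i, contents)
```

Real hash tables use far better hash functions, but the grouping-plus-inventory structure is the same, and the inventory is what the old technical sense of “bucket list” named.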
What we didn’t know was how. My research was not exhaustive, but I couldn’t find any sign that any of the two million or so reporters who covered the film ever thought to ask any of its creators how they arrived at “bucket list.” The story and screenplay were the work of Justin Zackham, and Rob Reiner directed. There’s no reason to think Zackham doesn’t deserve credit for the coinage (or repurposing, as the kids today say), but did anyone ever ask him to expound on his language-changing idea? Here’s a new word that everyone is using all of a sudden, and it has an unusually unconvoluted path into our vocabulary. Not only that, only one or two people had a plausible claim to originating it. How come no one asked them about it? Gee, Mr. Zackham, where did you come up with “bucket list”? For at least a few years before 2007, you can find citations of “life list” used to mean exactly the same thing, a term from birdwatching that used to denote a catalogue of every variety of bird one has sighted (I confess I don’t recall ever hearing it). “Wish list,” though much more common, is much less specific, lacking the urgency lent by death’s door. Other than that, I don’t know of another word for the phenomenon, old or new. So whoever came up with it did us all a favor.
There’s a tendency to believe that many new expressions come from movies or television, but in my experience it’s rare to find one that is both invented (or at least given what appears to be an entirely new definition) and popularized by a single film. Most new or newish expressions popularized by movies were definitely in use before the film came out. “Bucket list” may have been, but the evidence is very sparse and unconvincing. Other examples of film-borne expressions: “wingman,” “don’t go there,” “you’re toast,” “meltdown,” “perfect storm.” All had been sighted before appearing in the film that made them famous.
Thanks to my sister for nominating “bucket list” this week, and to Dad for pitching in with some old IEEE lore. The family that blogs together slogs together.
(1990’s | militarese | “chickens coming home to roost,” “fallout,” “consequences”)
So you think you know what this word means? Actually, it means several different things, so chances are you’re right.
A century ago, “blowback” had mainly to do with guns and ammunition. It still does, although the original meaning doesn’t show up much any more — fire or explosion caused by the breech of an artillery gun opening at the wrong moment, allowing flames and explosive gases to go out the back end of the gun rather than the front end. (That’s the best I can do, even though my uncle was a gunsmith.) I found several descriptions and definitions of the phenomenon, which are broadly similar but differ in significant details, and I don’t know enough about how guns work to make sense of it all. Anyway, this sort of blowback is dangerous and can cause death, among other things. Nowadays when “blowback” is used of a firearm, it almost always refers to a means of loading cartridges in a semi-automatic pistol or rifle. The idea seems to be that some of the gases under pressure generated in firing a bullet are directed toward pushing the next shell into the chamber, causing the weapon to reload automatically, until it jams.
Both of these gun-related usages are found in Webster’s Third and were available before the foreign-policy/CIA-type use of the word we are more likely to think of today. But even in this narrow field of definition, “blowback” has undergone a decided change. Around the time of the Church Committee hearings on CIA misdeeds in the mid-1970’s, this term began to creep into the press, meaning disinformation. More specifically, “blowback” was defined as a fake news story prepared by one of our intelligence agencies for dissemination abroad that later was reported as fact by the U.S. press. I think it’s still notionally true that the CIA is not supposed to do its dirty work within the borders of the U.S., although of course it does and always has. But politicians still considered it worthwhile in those days to object to Americans being subjected to lies intended for foreigners. (They had no comparable objections to lies intended expressly for domestic consumption.) Christopher Simpson’s book “Blowback: America’s Recruitment of Nazis and its Effects on the Cold War” (1988) had to do with “unexpected and undesired domestic effects of foreign covert actions,” according to one reviewer. Such a definition is vague enough to encompass the meaning limned above and the more specific meaning in use by the early 1990’s: attacks on U.S. people or facilities inspired by previous U.S. operations, covert or overt. (Or maybe simply in response to an unofficial provocation, like encouraging people to draw cartoons of the Prophet Mohammed.) Here is a definition-by-example offered by Charles G. Cogan, the former C.I.A. operations chief for the Near East and South Asia, in the wake of the first World Trade Center bombing in 1993: “The hypothesis that the mujahedeen would come to the United States and commit terrorist actions did not enter into our universe of thinking [during the covert war against the Soviets in Afghanistan].” Another book titled “Blowback,” published in 2000 by Chalmers Johnson, probably gave this meaning of the term a boost, though it was already current.
Cogan went on to use the expression “unintended consequence,” a favorite of bureaucrats caught with their pants down. “Blowback,” while it has a fairly clear definition, is also a bit slippery and seems more comfortable as part of a constellation than standing bravely on its own. Let’s not even consider the cases in which “blowback” substitutes for “pushback,” a usage that has become common. “Blowback” now is often used to mean “adverse reaction from almost anyone,” not just aggrieved foreigners. But let’s think instead about some related expressions. An obvious one is “blow up in one’s face,” not a precise synonym but clearly in the neighborhood. “Payback.” Definitely an element of vengeance in “blowback” as we use it today. “Karma,” which may apply to corporate bodies (though, as of 2015, not generally the government), also comes to mind. There’s a little jungle of overlapping expressions here, none of which means the same thing as any other but all of which call the others quickly to mind.
I can’t resist closing with an instance of “blowback” from U.S. Navy regulations promulgated in 1913: “The danger of a broken firing pin point or on the fusing of metal on the face of the breech-block, due to a primer blowback, shall be constantly borne in mind and guarded against.” Isn’t that great? They don’t write ’em like that any more.
(1980’s | academese | “leader of the pack,” “take-charge guy,” “macho man,” “dominant male”)
This expression takes advantage of the fact that we are animals and there is something very satisfying about showing direct analogies between human and animal behavior. “Alpha male,” along with “alpha female,” goes back at least to the sixties (the thirties, says William Safire), used first to talk about pack animals, especially wolves and primates. Explanations of social organization generally centered on the top dog (or whatever), who made all the decisions, got the females he wanted, and scared his inferiors into submission. The typical alpha male had won his place by defeating, perhaps even killing, the previous alpha male; in those days, it was understood purely as a matter of physical domination. The phrase seems to have been applied to humans first in the eighties, generally meaning some combination of “leader,” “the one who gives orders,” and “the one who gets his way.” Sometimes brawn and aggressiveness alone defined the human alpha male, but more often it was a matter of wielding power over others through sexual attractiveness, overweening wealth, political clout. Not infrequently the phrase is used as a straight synonym for a man who has a lot of sex with a lot of women. In the nineties, the phrase was used sometimes of Bill Clinton, apparently reflecting both his executive primacy and his prowess. In 1999, Al Gore hired Naomi Wolf as an advisor, whose role was widely reported at the time as teaching Gore to be an “alpha male,” though Wolf denied that’s what she was actually doing. Anyway, use of the phrase went up sharply in 1999, according to LexisNexis, and that increase appears to be permanent.
All these meanings remain in play today. I even found a nice new one, courtesy of a senior editor at Harlequin Romances: “Werewolf and vampire heroes are examples of the alpha male, strong and protective.” I assure you that in the old days, no one ever called an alpha male “protective.” But the term has also acquired a negative tinge, or at least the possibility of one. Two examples from 2009: sportswriter Francis X. Clines of the New York Times referred to obnoxious football fans as “alpha male bellowers.” Professor Robert Sapolsky alluded to “‘totally insane son of a bitch’” types, the sort of alpha males “who respond to stress by lashing out.” These are not just admissions that alpha male behavior might alienate people now and then; they are twists on the term that provide a new field of connotation. The idea that an alpha male exerted anything less than total authority in his field, or had anything to apologize for, was almost unknown as late as 2000 — it was nearly always a term of admiration or envy. Urban Dictionary offers several examples of sardonic or derogatory definitions of the term, though in fairness, most of them have not been treated well by voters. “Alpha male” may be developing the same double life as “type-A personality” (or “control freak”), which might be used as a compliment but generally is not. As beta males conspire to get their slow revenge on the alphas, more such heretical definitions may creep into the language. Among humans as among animals, a group of lesser men acting in concert can bring down the most potent head man. Julius Caesar went from “he doth bestride the narrow world like a Colossus” to “Then fall, Caesar!” in two short acts.
If the expression continues to take on darker meanings, it will mirror the decline in primatology and other disciplines of the whole notion of alpha males lording it over their enclaves. Frans de Waal and L. David Mech, among others, have moved away from descriptions of social organization dependent on such rigid hierarchies. The very concept of the “alpha male” has little to do with the politics of group behavior among animals and crudely oversimplifies the ways they organize themselves. The idea probably was born more of the predilections of mid-century researchers, and a general urge to find easy explanations of complicated phenomena, than actual observations of wild animals. (In fact, many early studies used captive animals, who behave much differently from their counterparts in the wild.) It may well prove that the alpha male today, like the social Darwinist a century earlier, is no more than a pseudo-natural mandate for the most selfish and sociopathic among us to justify their promiscuous, arrogant, or exploitative desires. For now, “alpha male” still retains much of its old shine, but that may change in the next ten or twenty years.
(1990’s | journalese (arts))
It may surprise you to learn that there are some who don’t like hipsters. The concept seems too familiar to require summary, but I encourage everyone to spend an hour Googling “hipster definition” or something similar. Once you see through the fog of animus, you encounter amazingly precise definitions of the term, with detailed and occasionally exquisite catalogues of preferences in fashion, the arts, diet, transportation, grooming, and who knows what all else. Oh, and then there’s the attitude, thoroughly annoying to upstanding citizens everywhere. Pretentious. Hypocritical. Self-righteous. Suckers for fads. Even card-carrying hipsters deny membership in the group; that too is an oft-cited trait of hipsterism, one of the odder ones, it seems to me. There must be some hipster out there who will own up, for Pete’s sake. Diogenes, get busy and start searching for an honest hipster. (In our day, that would be a reality show, and it probably wouldn’t last a season.)
Now “hipster” goes back at least to the 1940’s, when it was a straightforward variation on “hip” or “hip (hep) cat.” Back then, “hipster” was a compliment, used mostly within a particular, and fairly small, subculture. The word was applied to devotees of the latest jazz, or more generally the language, habits, and attitude that went with it. By the mid-1950’s, it was a synonym for “beatnik”; Norman Mailer used it to talk about white people who wanted to be black (which was thought to be the same as being hip). The emphasis fell on expert knowledge and awareness of your cultural surroundings, but anyone considered to be in the know or up to date rated the term. And with that went “cool” and other affectations: avoidance of strong emotion or expression, lack of interest in the world outside the club, etc. Does any of that sound familiar? The connoisseurship, the detachment, the lassitude, the obsessions? “Hipster” was overtaken by “hippie” in the 1960’s, which drove every other derivative of “hip” out of the language for twenty years. (It lives on today as an insult, which is what it was in the first place. “Hippie” is another example of a derogatory term adopted and embraced by its target.) When “hipster” jostled its way back into common speech, it brought quite a bit of its former meaning with it.
At least up until the mid-1980’s, one encountered “hipster” generally in articles about jazz musicians of a previous generation. It’s not clear to me when the changeover happened, but as early as the late 1980’s, I found some citations that made me suspect that today’s meaning was in play by then. But nothing really unambiguous until the early 1990’s. By 1995 the word was used as we use it now, though not universally. It went along with the rise of luxury coffee and Quentin Tarantino. And in those halcyon days the word often had a nostalgic tinge, a sense of rediscovering a hipper past. That shading seems to be gone. Another change: “Hipster” now almost always carries opprobrium, which was not true when it was an in-group term sixty, or even thirty, years ago. Is it just because hipsters are more obnoxious than they were back then? Maybe they’re just more ubiquitous; so much ink has been spilled over the phenomenon that everyone got tired of it (including the hipsters themselves), and ennui became the only possible response. Urban Dictionary affords over 500 definitions, and the web abounds with takedowns of hipsterism. One clever deconstruction from Adbusters (2008) suffers only slightly from the rather feverish suggestion that the hipsters will be the ones to bring down Western civilization once and for all.
One thing generally associated with hipsters all along is youth, or at least the ability to fake it. When do you cross the boundary and get too old to be a hipster? You wake up one morning and the wrinkles are just a little too deep. OMG! We have to move to the suburbs! Briefcases and bow ties for everyone! You probably have to stop wearing tight jeans and cycling when you turn into a hipster emeritus, but let us hope the poor dears can hang onto their obscure bands and Pabst Blue Ribbon beer. The sense of superiority will be the last thing to go.
(1980’s | businese (sales) | “peddling”)
Wonderful thing about this expression — it hasn’t really changed since 1980, when it completed its shift from personal sales visit to approach by telephone, a shift already accomplished elsewhere in the culture. (It’s the difference between “paying a call” and “making a call.”) In 1978, Jonathan Kwitny defined “cold call” as “blind telephone solicitation” (Wall Street Journal, April 10). The phrase seems to have arisen in the sixties among salesmen, long after the practice of attempting to sell to people who had not been warned of your coming was well established. Many items were sold that way door-to-door for most of the twentieth century, and “cold call” originally denoted such encounters. Actually, the phrase more often conjured up a salesman cooling his heels in an executive’s anteroom than the outdoor work of the traveling drummer. By the early eighties the phrase could also apply not just to selling a product or service, but selling oneself, as a job applicant telephoning potential employers without any previous contact. Today, “cold call” still means both these things and is still used almost invariably in sales-related contexts — though it can be used of any telephone call made without prior notice, even if it has nothing to do with selling. Beyond that it hasn’t broadened, or accreted any metaphorical uses.
Telemarketers and stockbrokers did more to popularize this expression — and give it a bad name — than any other breeds of salesman. In the hands of either, cold calls are rife with misleading promises, not to mention just plain getting on lots of people’s nerves. The federal Do Not Call Registry is partly intended to make it difficult for commercial enterprises to cold-call (the expression graduated to verbhood somewhere around 2000, as far as I can tell) those who would rather be left alone. But the registry makes exceptions for organizations that have done business with you before. That raises the question of whether “cold call” covers pitches for an unrelated product from a company that you’ve bought from before. I would say yes.
Cold calling is widely understood to be ethically dubious, and the phrase, like “upsell,” has a taint it can’t quite shake no matter how hard you try to make it reputable. The prejudice is an old one — traveling salesmen made lively objects of suspicion for decades, and not just because they knocked up unlocked-up daughters. Snake-oil vendors abounded, and they liked nothing better than to show up on the doorsteps of people who hadn’t requested their presence. Dealing with a stranger trying to sell you something on the phone is, if anything, more intrusive, especially if it’s a robocall. Salespeople, start-ups looking for clients, and hustlers all continue to make cold calls because they work, at least if you make enough of them. There are people out there who go for the sales pitch, and by the law of averages, you will find some of them. Many recipients express their displeasure by hanging up within a minute or two, so a failed cold call (the vast majority) doesn’t eat up much time. Never mind that everyone hates them, including the drudges who have to make them for a living.
Have you noticed that there isn’t a word for “person on the receiving end of a telephone call”? It seems like there ought to be an everyday term for it. You can use “recipient,” as I did above, but it doesn’t sound idiomatic. Nobody says “callee,” the ostensible opposite of “caller.” Why isn’t there a word for the one who lifts the handset (or receiver, as we said in the old days), or flips open the cell phone, when it’s such a common occurrence? “Phonee”? “Quarry”? “Callcatcher”? “Picker-upper”? Let’s get to work, America!
(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)
The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.
“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.
The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.