Lex maniac

Investigating changes in American English vocabulary over the last 40 years


fraudster

(1990’s | businese (finance) | “con man,” “crook,” “trickster”)

As far as I can tell, we owe this one to the Brits, or maybe their ex-colonials. There are very few LexisNexis results from U.S. sources before 1995; nearly all come from Great Britain and the recent colonies. We’re certainly not above borrowing Briticisms and their cousins (“at the end of the day,” “over the moon,” “selfie”) in these parts, and the venerable “-ster” suffix (see below) rolls off the tongue easily in America, the land of gangsters and mobsters.

This word probably is not necessary in either hemisphere, but it does have the advantage of incorporating the very act its embodiments practice. The old equivalents (see above) do not. “Con[fidence] man” is itself a bit deceptive; it really refers to someone you should not have confidence in, and he has to convince you to do so. “Crook” is more general than “fraudster,” but the two words line up pretty well, so it’s rarely jarring to substitute one for the other. A fraudster sees an opening and uses it dishonestly for personal advantage. The word seems indifferent to distinctions like preying on individuals vs. cheating corporate bodies, or large-scale vs. small-scale crime. As the OED (first citation: 1975) notes, “fraudster” denotes in particular one who lies or cheats in the course of a business transaction. But it is not restricted to such use and has already spread out.

I cannot help but wonder if the rise of this expression since the seventies does not result from a simple (or rather geometric) increase in chicanery. It may be that an ever more complicated and less regulated financial system, coupled with increased criminal activity enabled by widespread use of computers, has made it ever easier to pull scams, causing a new expression to erupt, a boon to harassed writers if no one else. It’s such a relief to have a new, yet easily grasped, synonym to haul out once in a while.

The “-ster” suffix repays study. My feeling is that it has had a bit of an underbelly for centuries, but any negative connotation probably became more pronounced in the twentieth century, at least in the U.S. “Gangster” and “mobster” both date back to somewhere around 1900, according to Random House; I suspect that “gangster” relies on “teamster,” an eighteenth-century expression that did not, as far as I know, develop a dark side until the twentieth, when the Teamsters’ Union for a time became synonymous with corruption. (Tapster, tipster, and trickster, all dubious trades, are much older words. Speaking of deplorable occupations, where do “barrister” and “monster” fit into all this?) The suffix doesn’t always have a negative connotation, even today; when connected to a name, it may be affectionate. For example, Tom Bergeron used to call Whoopi Goldberg “Whoopster” on Hollywood Squares. Then again, after fifty years as a compliment, “hipster” finally became a dirty word somewhere around 2000. I’m not enough of a linguist to offer a proper history of the suffix, but “baxter” (female baker) and “brewster” (female brewer) are very old. According to Chambers Etymological Dictionary (thanks, Liz!), “-ster” comes from Anglo-Saxon, where it denoted specifically a female practitioner, but well before the Elizabethan era the gender distinction had disappeared. Chambers also notes that the suffix, originally attached to verbs (bake, brew), as befits an equivalent of the “-er” suffix, now hooks more readily to nouns. It has gone on yoking itself to new words for centuries now, and it usually seems to have something shady or untrustworthy about it. “Fraudster” thus takes its place in a long, rich tradition.



push poll

(1990’s | journalese (politics) | “dirty trick”)

Push polls originated in political campaigns. They are presented as impartial, conducted by an organization not technically connected with a candidate (although if a respondent pushes back hard enough, the caller may be forced to reveal his true employer). The point is not neutral assessment of public opinion, but rather influencing it directly, often by making a dubious, if not outright false, imputation about an opponent — sometimes the mechanism relies on laudatory comments about oneself, but the intent is the same. It usually takes the form, “If you knew so-and-so about Candidate X, would it make you more or less likely to vote for her?” The phrase established itself in the mid-1990’s, with the first hit in LexisNexis due to David Broder in October 1994. By the 1996 election, the expression had common currency in political reporting, and many commentators no longer bothered to define it, which had been the rule only a few months earlier.

There’s nothing new about libeling political opponents, but the problem is the means. To be effective over the long term, opinion polls have to be fair, designed to avoid favoring one group over another. A survey has the presumption of fairness; therefore, it’s worse to use a poll to perpetrate slander than to use other means. Matthew Reichbach made the main point in the New Mexico Independent (September 22, 2009): “a push poll is not a poll at all. Its a fraud, an attempt to disseminate information under the guise of a legitimate survey.” It is a fraud in that it presents itself in a misleading way, but the “information” conveyed may be fraudulent, too. Reputable pollsters hate push polls and are forever calling for an end to them.

In 1996, the derivation of “push poll” was generally explained as a simple elaboration on the act of pushing voters away from a particular candidate. That’s folk etymology, but there is probably some underlying truth. The term comes out of pollsters’ jargon, by evolution or corruption. “Push poll” is actually a descendant of “push question,” described in 1982 by William Safire as “designed to squeeze respondents to come up with the answer the sponsors want.” It’s a variety of loaded question native to the survey business, and not necessarily unethical, although it does lend itself to unethical use.

A push poll may contain more than one push question, heaven knows. In 2001, I was on the receiving end of a Bloomberg-for-mayor push poll which consisted almost entirely of pro-Bloomberg statements masquerading as questions. I finally said something like “Aw, c’mon,” and the (apparently young and definitely inexperienced) questioner agreed that the bias was pretty obvious. I kept going, answering each question gamely in the most anti-Bloomberg manner I could muster, but the whole thing was a farce. Did Bloomberg pay for it? Who knows? Yes or no, the whole process showed nothing but contempt for our intelligence.

And now, the scope of push polls has broadened, and you don’t have to be a politician any more. Anyone trying to influence public opinion — a corporation (Walmart seeking to open a store in Chicago), a social movement (a group trying to promote, or scuttle, gun-control legislation), etc. — can initiate them. I’m pleased to note that the phrase still carries strong opprobrium, and it is thrown around in grim accusation or indignant denial, never in approbation. It may be true that one man’s push poll is another man’s opposition research, and political professionals once defended push questions as legitimate, if they raised a verifiable point about the rival candidate. No one defends them any more, at least not in public.



bucket list

(2000’s | journalese)

I’m not real up on popular culture, despite rubbing elbows with teenagers with modest regularity, so I will go ahead and explain this term for those who share my plight. “Bucket list” was popularized by a movie of the same title released in 2007. It refers to the goals you aim to achieve before you die — that is, kick the bucket. (“Kick the bucket” goes back to the eighteenth century, and its origins remain uncertain.) The film received plenty of publicity, starring as it did Morgan Freeman and Jack Nicholson, and the phrase soon took root in the popular lexicon. According to one history of the term, it was first used in today’s sense in 2004, but the citation isn’t entirely convincing.

Not that the phrase was invented then; it had a previous life in engineering and computer science. My father, an electrical engineer, remembers using the phrase to denote instructions for placing integrated circuits on circuit boards to build this or that device. In computer science, a bucket is a storage place or buffer for disparate pieces of data that have a common feature important enough to warrant grouping them together. When you record the names and locations of the different buckets, presto! a bucket list. The term was also used by archeologists to mean an inventory of artifacts removed from a dig (also “basket list”), a surprisingly intuitive usage. “Bucket list” could also refer to a group of items set aside in a negotiation, for example — points to be considered later, or in a different phase of the discussion. None of these definitions was ordinary or widely understood. Now they never will be. The first three were strictly technical terms, and the fourth never took hold. Because it was unfamiliar to most of us, when the hype for the movie ramped up in 2006, reporters felt compelled to explain the phrase and its derivation. When the movie was released at Christmastime in 2007, reviewers followed suit. By 2008, most of us knew what it meant and why.
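For the technically curious, here is a minimal sketch of that computer-science sense in Python. The grouping feature and all the names are my own illustration, not drawn from any particular system: each bucket collects data sharing a common feature, and the bucket list records the buckets’ names and contents.

    # Group words into "buckets" by a shared feature (here, the first letter),
    # then build a "bucket list" recording each bucket's name and size.
    from collections import defaultdict

    words = ["baker", "brewer", "gangster", "mobster", "tipster"]

    buckets = defaultdict(list)
    for word in words:
        buckets[word[0]].append(word)  # the common feature: first letter

    # The bucket list proper: one entry per bucket, its name and how much it holds
    bucket_list = [(name, len(items)) for name, items in sorted(buckets.items())]
    print(bucket_list)  # [('b', 2), ('g', 1), ('m', 1), ('t', 1)]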

What we didn’t know was how. My research was not exhaustive, but I couldn’t find any sign that any of the two million or so reporters who covered the film ever thought to ask any of its creators how they arrived at “bucket list.” The story and screenplay were due to Justin Zackham, and Rob Reiner directed. There’s no reason to think Zackham doesn’t deserve credit for the coinage (or repurposing, as the kids today say), but did anyone ever ask him to expound on his language-changing idea? Here’s a new word that everyone is using all of a sudden, and it has an unusually unconvoluted path into our vocabulary. Not only that, there were only one or two people who had a plausible claim to originating it. How come no one asked them about it? Gee, Mr. Zackham, where did you come up with “bucket list”? For at least a few years before 2007, you can find citations of “life list” used to mean exactly the same thing; the term comes from birdwatching, where it denotes a catalogue of every variety of bird one has sighted (I confess I don’t recall ever hearing it). “Wish list,” though much more common, is much less specific, lacking the urgency lent by death’s door. Other than that, I don’t know of another word for the phenomenon, old or new. So whoever came up with it did us all a favor.

There’s a tendency to believe that many new expressions come from movies or television, but in my experience it’s rare to find one that is both invented (or at least given what appears to be an entirely new definition) and popularized by a single film. Most new or newish expressions popularized by movies were definitely in use before the film came out. “Bucket list” may have been, but the evidence is very sparse and unconvincing. Other examples of film-borne expressions: “wingman,” “don’t go there,” “you’re toast,” “meltdown,” “perfect storm.” All had been sighted before appearing in the film that made them famous.

Thanks to my sister for nominating “bucket list” this week, and to Dad for pitching in with some old IEEE lore. The family that blogs together slogs together.



blowback

(1990’s | militarese | “chickens coming home to roost,” “fallout,” “consequences”)

So you think you know what this word means? Actually, it means several different things, so chances are you’re right.

A century ago, “blowback” had mainly to do with guns and ammunition. It still does, although the original meaning doesn’t show up much any more — fire or explosion caused by the breech of an artillery gun opening at the wrong moment, allowing flames and explosive gases to go out the back end of the gun rather than the front end. (That’s the best I can do, even though my uncle was a gunsmith.) I found several descriptions and definitions of the phenomenon, which are broadly similar but differ in significant details, and I don’t know enough about how guns work to make sense of it all. Anyway, this sort of blowback is dangerous and can cause death, among other things. Nowadays when “blowback” is used of a firearm, it almost always refers to a means of loading cartridges in a semi-automatic pistol or rifle. The idea seems to be that some of the gases under pressure generated in firing a bullet are directed toward pushing the next shell into the chamber, causing the weapon to reload automatically, until it jams.

Both of these gun-related usages are found in Webster’s Third and were available before the foreign-policy/CIA-type use of the word we are more likely to think of today. But even in this narrow field of definition, “blowback” has undergone a decided change. Around the time of the Church Committee hearings on CIA misdeeds in the mid-1970’s, this term began to creep into the press, meaning disinformation. More specifically, “blowback” was defined as a fake news story prepared by one of our intelligence agencies for dissemination abroad that later was reported as fact by the U.S. press. I think it’s still notionally true that the CIA is not supposed to do its dirty work within the borders of the U.S., although of course it does and always has. But politicians still considered it worthwhile in those days to object to Americans being subjected to lies intended for foreigners. (They had no comparable objections to lies intended expressly for domestic consumption.) Christopher Simpson’s book “Blowback: America’s Recruitment of Nazis and its Effects on the Cold War” (1988) had to do with “unexpected and undesired domestic effects of foreign covert actions,” according to one reviewer. Such a definition is vague enough to encompass the meaning limned above and the more specific meaning in use by the early 1990’s: attacks on U.S. people or facilities inspired by previous U.S. operations, covert or overt. (Or maybe simply in response to an unofficial provocation, like encouraging people to draw cartoons of the Prophet Mohammed.) Here is a definition-by-example offered by Charles G. Cogan, the former C.I.A. operations chief for the Near East and South Asia, in the wake of the first World Trade Center bombing in 1993: “The hypothesis that the mujahedeen would come to the United States and commit terrorist actions did not enter into our universe of thinking [during the covert war against the Soviets in Afghanistan].” Another book titled “Blowback,” published in 2000 by Chalmers Johnson, probably gave this meaning of the term a boost, though it was already current.

Cogan went on to use the expression “unintended consequence,” a favorite of bureaucrats caught with their pants down. “Blowback,” while it has a fairly clear definition, is also a bit slippery and seems more comfortable as part of a constellation rather than standing bravely on its own. Let’s not even consider the cases in which “blowback” substitutes for “pushback,” a usage that has become common. “Blowback” now is often used to mean “adverse reaction from almost anyone,” not just aggrieved foreigners. But let’s think instead about some related expressions. An obvious one is “blow up in one’s face,” not a precise synonym but clearly in the neighborhood. “Payback.” Definitely an element of vengeance in “blowback” as we use it today. “Karma,” which may apply to corporate bodies (though, as of 2015, not generally the government), also comes to mind. There’s a little jungle of overlapping expressions here, none of which means the same thing as any other but all of which call the others quickly to mind.

I can’t resist closing with an instance of “blowback” from U.S. Navy regulations promulgated in 1913: “The danger of a broken firing pin point or on the fusing of metal on the face of the breech-block, due to a primer blowback, shall be constantly borne in mind and guarded against.” Isn’t that great? They don’t write ’em like that any more.



alpha male

(1980’s | academese | “leader of the pack,” “take-charge guy,” “macho man,” “dominant male”)

This expression takes advantage of the fact that we are animals and there is something very satisfying about showing direct analogies between human and animal behavior. “Alpha male,” along with “alpha female,” goes back at least to the sixties (the thirties, says William Safire), used first to talk about pack animals, especially wolves and primates. Explanations of social organization generally centered on the top dog (or whatever), who made all the decisions, got the females he wanted, and scared his inferiors into submission. The typical alpha male had won his place by defeating, perhaps even killing, the previous alpha male; in those days, it was understood purely as a matter of physical domination. The phrase seems to have been applied to humans first in the eighties, generally meaning some combination of “leader,” “the one who gives orders,” and “the one who gets his way.” Sometimes brawn and aggressiveness alone defined the human alpha male, but more often it was a matter of wielding power over others through sexual attractiveness, overweening wealth, political clout. Not infrequently the phrase is used as a straight synonym for a man who has a lot of sex with a lot of women. In the nineties, the phrase was used sometimes of Bill Clinton, apparently reflecting both his executive primacy and his prowess. In 1999, Al Gore hired Naomi Wolf as an advisor, whose role was widely reported at the time as teaching Gore to be an “alpha male,” though Wolf denied that’s what she was actually doing. Anyway, use of the phrase went up sharply in 1999, according to LexisNexis, and that increase appears to be permanent.

All these meanings remain in play today. I even found a nice new one, courtesy of a senior editor at Harlequin Romances: “Werewolf and vampire heroes are examples of the alpha male, strong and protective.” I assure you that in the old days, no one ever called an alpha male “protective.” But the term has also acquired a negative tinge, or at least the possibility of one. Two examples from 2009: sportswriter Francis X. Clines of the New York Times referred to obnoxious football fans as “alpha male bellowers.” Professor Robert Sapolsky alluded to “‘totally insane son of a bitch’” types, the sort of alpha males “who respond to stress by lashing out.” These are not just admissions that alpha male behavior might alienate people now and then; they are twists on the term that provide a new field of connotation. The idea that an alpha male exerted anything less than total authority in his field, or had anything to apologize for, was almost unknown as late as 2000 — it was nearly always a term of admiration or envy. Urban Dictionary offers several examples of sardonic or derogatory definitions of the term, though in fairness, most of them have not been treated well by voters. “Alpha male” may be developing the same double life as “type-A personality” (or “control freak”), which might be used as a compliment but generally is not. As beta males conspire to get their slow revenge on the alphas, more such heretical definitions may creep into the language. Among humans as among animals, a group of lesser men acting in concert can bring down the most potent head man. Julius Caesar went from “he doth bestride the narrow world like a Colossus” to “Then fall, Caesar!” in two short acts.

If the expression continues to take on darker meanings, it will mirror the decline, in primatology and other disciplines, of the whole notion of alpha males lording it over their enclaves. Frans de Waal and L. David Mech, among others, have moved away from descriptions of social organization dependent on such rigid hierarchies. The very concept of the “alpha male” has little to do with the politics of group behavior among animals and crudely oversimplifies the ways they organize themselves. The idea probably was born more of the predilections of mid-century researchers, and a general urge to find easy explanations of complicated phenomena, than of actual observations of wild animals. (In fact, many early studies used captive animals, who behave much differently from their counterparts in the wild.) It may well prove that the alpha male today, like the social Darwinist a century earlier, is no more than a pseudo-natural mandate for the most selfish and sociopathic among us to justify their promiscuous, arrogant, or exploitative desires. For now, “alpha male” still retains much of its old shine, but that may change in the next ten or twenty years.


designated driver

(1980’s | bureaucratese?)

A new expression that has stayed put, sober and responsible. “Designated driver” first poked its head out in 1982, says LexisNexis, and its sense has never changed. Metaphorical uses are uncommon, and literal uses not much less so. Oh, a race car pilot may be “designated driver” of a particular car for a particular race, though it’s not clear which part of speech “designated” is in such a case. Now and then a paid driver (bus, ambulette, taxi) winds up being referred to as a designated driver. But the set phrase that we grasp as second nature today is pure eighties. It grew slowly but steadily with the rise of the movement against drunk driving.

Mothers Against Drunk Drivers (soon changed to “Drunk Driving”) was founded in 1980 by actual mothers whose children had been killed in accidents caused by drunk drivers. It has been enormously successful, an example of a do-gooder public-service organization that has won respect (or deference, which is more important) across the political spectrum and changed a nation’s behavior. Plenty of people still drink and drive, but they do it much more cautiously than they did two generations ago. Attitudes have changed, and a multidisciplinary structure has been built to make driving under the influence shameful and criminal. Part of that structure is the designated driver, born (in the U.S., at least) near the beginning of the eighties, worming its way into beer commercials by the end of the decade, by which time all us reprobates had learned the expression. Actually, Congress declared “National Drunk and Drugged Driving Awareness Week” as early as December 1982, and the phrase was part of the proclamation. The first use recorded in LexisNexis (October 27, 1982) is due to St. Louis Cardinals catcher Darrell Porter, who had just won the World Series MVP Award and was known at the time as a player who had completed drug rehabilitation successfully (“drugs” included alcohol, as Porter was careful to point out). “‘I didn’t even drink in high school,’ he said with a smile. ‘I was what you’d call the designated driver.'” I myself was in college during those crucial early years when the new expression was struggling to make its way, and I don’t remember hearing it then, but may have. I do remember “Friends don’t let friends drive drunk.”

Porter’s use of the expression is significant, not just as a matter of historical precedence, but in heralding a radical change in the group behavior of young men. Simply put, non-drinkers became extremely popular when the designated driver took its place in the arsenal of defenses against drunk driving. For decades, centuries, teetotalers were objects of scorn and generally avoided (ironically, the old insult “wet,” meaning something like “lame” as we use it today with an extra touch of wimpiness, fit teetotalers nicely). But when you need a designated driver, that’s exactly the guy you want to bring along — he was gonna drink soda all night anyway. (Wise friends repay the designated driver occasionally, perhaps by providing wingman services.)

“Designated” is a bureaucrat’s word, generally used to refer to something named or assigned by legal authority. It was thus a rather odd choice for the new line-up spot created in 1973 by the American League. (The player was assigned to the Designated Hitter position by the manager, so it wasn’t unreasonable. At first, one heard “Designated Pinch Hitter,” but that disappeared quickly, just as well, since it was confusing.) The designated hitter is the most likely — actually the only — forerunner I can think of. The “designated driver” is not named by authority, generally. Someone within the group has to volunteer, or members of the group take turns. More like a nominated driver, at least if being nominated consists of drawing the short straw.



hipster

(1990’s | journalese (arts))

It may surprise you to learn that there are some who don’t like hipsters. The concept seems too familiar to require summary, but I encourage everyone to spend an hour Googling “hipster definition” or something similar. Once you see through the fog of animus, you encounter amazingly precise definitions of the term, with detailed and occasionally exquisite catalogues of preferences in fashion, the arts, diet, transportation, grooming, and who knows what all else. Oh, and then there’s the attitude, thoroughly annoying to upstanding citizens everywhere. Pretentious. Hypocritical. Self-righteous. Suckers for fads. Even card-carrying hipsters deny membership in the group; that too is an oft-cited trait of hipsterism, one of the odder ones, it seems to me. There must be some hipster out there who will own up, for Pete’s sake. Diogenes, get busy and start searching for an honest hipster. (In our day, that would be a reality show, and it probably wouldn’t last a season.)

Now “hipster” goes back at least to the 1940’s, when it was a straightforward variation on “hip” or “hip (hep) cat.” Back then, “hipster” was a compliment, used mostly within a particular, and fairly small, subculture. The word was applied to devotees of the latest jazz, or more generally the language, habits, and attitude that went with it. By the mid-1950’s, it was a synonym for “beatnik”; Norman Mailer used it to talk about white people who wanted to be black (which was thought to be the same as being hip). The emphasis fell on expert knowledge and awareness of your cultural surroundings, but anyone considered to be in the know or up to date rated the term. And with that went “cool” and other affectations: avoidance of strong emotion or expression, lack of interest in the world outside the club, etc. Does any of that sound familiar? The connoisseurship, the detachment, the lassitude, the obsessions? “Hipster” was overtaken by “hippie” in the 1960’s, which drove every other derivative of “hip” out of the language for twenty years. (It lives on today as an insult, which is what it was in the first place. “Hippie” is another example of a derogatory term adopted and embraced by its target.) When “hipster” jostled its way back into common speech, it brought quite a bit of its former meaning with it.

At least up until the mid-1980’s, one encountered “hipster” generally in articles about jazz musicians of a previous generation. It’s not clear to me when the changeover happened, but as early as the late 1980’s, I found some citations that made me suspect that today’s meaning was in play by then. But nothing really unambiguous until the early 1990’s. By 1995 the word was used as we use it now, though not universally. It went along with the rise of luxury coffee and Quentin Tarantino. And in those halcyon days the word often had a nostalgic tinge, a sense of rediscovering a hipper past. That shading seems to be gone. Another change: “Hipster” now almost always carries opprobrium, which was not true when it was an in-group term sixty, or even thirty, years ago. Is it just because hipsters are more obnoxious than they were back then? Maybe they’re just more ubiquitous; so much ink has been spilled over the phenomenon that everyone got tired of it (including the hipsters themselves), and ennui became the only possible response. Urban Dictionary affords over 500 definitions, and the web abounds with takedowns of hipsterism. One clever deconstruction from Adbusters (2008) suffers only slightly from the rather feverish suggestion that the hipsters will be the ones to bring down Western civilization once and for all.

One thing generally associated with hipsters all along is youth, or at least the ability to fake it. When do you cross the boundary and get too old to be a hipster? You wake up one morning and the wrinkles are just a little too deep. OMG! We have to move to the suburbs! Briefcases and bow ties for everyone! You probably have to stop wearing tight jeans and cycling when you turn into a hipster emeritus, but let us hope the poor dears can hang onto their obscure bands and Pabst Blue Ribbon beer. The sense of superiority will be the last thing to go.



flyover country

(1990’s | journalese (film) | “the heartland,” “middle America,” “America’s breadbasket”)

While this term ought to be restricted to areas between the east and west coasts of the United States, it can be used for any region that feels marginalized in national politics or popular entertainment. At its largest, flyover country includes everything except the metropolitan areas of Los Angeles, San Francisco, Washington, D.C., and New York. In other words, it’s where the elites don’t live and never go. As you might expect, when people refer to their own town or state as flyover country, they usually direct a healthy dose of resentment at the powers that be. It is generally attributed to such arrogant, out-of-touch snobs by residents of the Midwest or where have you, and has been from the beginning.

The phrase begins to appear in LexisNexis after 1985 — generally attributed at the time to Hollywood jet-setters — and seems to have become firmly established in the nineties. The McVeigh bombing in 1995 may have helped push it into prominence; terrorism wasn’t supposed to happen so far from the power centers, and the term got more of a workout than usual. By 2000, it shed quotation marks and explanations and became easily understood shorthand for average Americans unrepresented in political and media centers. The distinction between a small cadre of officials in the capital who make the laws and the great majority of the population intentionally excluded from such deliberations is very old; it has underlain our entire political history as a nation and has animated both left-wing and right-wing activism. Without flyover country there would be no populism, although the term would have made no sense until the fifties, long after populism was invented. There are some old expressions for the salt of the earth and the great mass of land they call home, but few capture quite the depth of elite contempt (“great unwashed” referred to a different group of people, for example).

Whether the elite actually feel the disdain assigned to them is an open question. Federal officials can’t possibly craft policies that serve every locality; no matter what you do, someone somewhere will be unhappy. What looks like disregard for local necessities and wisdom may just result from balancing of a complicated set of interests. Hollywood types may well scorn the vast majority of their audience, but their scorn extends beyond flyover country to include most of their local compatriots, many of whom are just plain folks. But the expression is far more often used by its supposed targets than by bureaucrats yanking away another right or film directors offending more sensibilities. Movie executives or powerful officials are rarely quoted using the expression, and if the term ever flourished among the wielders of power, it was quickly co-opted by its targets to attack the aristocrats. Like other terms defiantly adopted by oppressed minorities, it often has a defensive cast.

The adoption of the expression by the slighted majority marks an unusually quick, smooth instance of linguistic jiu-jitsu. “Flyover country” apparently never had a chance to establish itself among the elites before the hoi polloi laid claim to it and turned it into a badge of honor. It’s a nice trick: a quick, widely grasped phrase turned on its head to attack the adversary. The biter bit! The process is not unusual, but the speed and seamlessness are. “Queer” really was a put-down for decades before the gay community co-opted it (and that process took some time). Same with “redneck” or “Chicano.” But no sooner does “flyover country” appear in the lexicon than its targets snap it up and make it theirs.


cold call

(1980’s | businese (sales) | “peddling”)

Wonderful thing about this expression — it hasn’t really changed since 1980, when it completed its shift from personal sales visit to approach by telephone, a shift already accomplished elsewhere in the culture. (It’s the difference between “paying a call” and “making a call.”) In 1978, Jonathan Kwitny defined “cold call” as “blind telephone solicitation” (Wall Street Journal, April 10). The phrase seems to have arisen in the sixties among salesmen, long after the practice of attempting to sell to people who had not been warned of your coming was well established. Many items were sold that way door-to-door for most of the twentieth century, and “cold call” originally denoted such encounters. Actually, the phrase more often conjured up a salesman cooling his heels in an executive’s anteroom than the outdoor work of the traveling drummer. By the early eighties the phrase could also apply not just to selling a product or service, but to selling oneself, as a job applicant telephoning potential employers without any previous contact. Today, “cold call” still means both these things and is still used almost invariably in sales-related contexts — though it can be used of any telephone call made without prior notice, even if it has nothing to do with selling. Beyond that it hasn’t broadened, or accreted any metaphorical uses.

Telemarketers and stockbrokers did more to popularize this expression — and give it a bad name — than any other breeds of salesman. In the hands of either, cold calls are rife with misleading promises, not to mention just plain getting on lots of people’s nerves. The federal Do Not Call Registry is partly intended to make it difficult for commercial enterprises to cold-call (the expression graduated to verbhood somewhere around 2000, as far as I can tell) those who would rather be left alone. But the registry makes exceptions for organizations that have done business with you before. That raises the question of whether “cold call” covers pitches for an unrelated product from a company that you’ve bought from before. I would say yes.

Cold calling is widely understood to be ethically dubious, and the phrase, like “upsell,” has a taint it can’t quite shake no matter how hard you try to make it reputable. The prejudice is an old one — traveling salesmen made lively objects of suspicion for decades, and not just because they knocked up unlocked-up daughters. Snake-oil vendors abounded, and they liked nothing better than to show up on the doorsteps of people who hadn’t requested their presence. Dealing with a stranger trying to sell you something on the phone is, if anything, more intrusive, especially if it’s a robocall. Salespeople, start-ups looking for clients, and hustlers all continue to make cold calls because they work, at least if you make enough of them. There are people out there who go for the sales pitch, and by the law of averages, you will find some of them. Many recipients express their displeasure by hanging up within a minute or two, so a failed cold call (the vast majority) doesn’t eat up much time. Never mind that everyone hates them, including the drudges who have to make them for a living.

Have you noticed that there isn’t a word for “person on the receiving end of a telephone call”? It seems like there ought to be an everyday term for it. You can use “recipient,” as I did above, but it doesn’t sound idiomatic. Nobody says “callee,” the ostensible opposite of “caller.” Why isn’t there a word for the one who lifts the handset (or receiver, as we said in the old days), or flips open the cell phone, when it’s such a common occurrence? “Phonee”? “Quarry”? “Callcatcher”? “Picker-upper”? Let’s get to work, America!



standalone

(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)

The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.

“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.

The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.

