
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

special needs

(1980’s | therapese | “handicapped,” “disabled”)

Presumably descended from the already widespread phrases, “special education” and “Special Olympics.” The crucial change in recent years has to do with part of speech; “special needs” has gone from noun-adjective phrase to unhyphenated compound adjective — not that the old formulation has disappeared. The compound adjective started getting tossed around in the eighties. Before it was applied wholesale to students, it went with orphans and foster children. As one commentator put it in 1984, the old word for “special needs” was “unadoptable.” (Another was “problem,” as in “problem child.”) Now it can apply even to pets. “Special needs” come in many forms, from familiar physical handicaps to mental or emotional instabilities of various kinds, or maybe your kid is just slow (excuse me, has a developmental disability). It has become standard to talk about special needs kids, or the institutions that serve them — classes, programs, transportation — or the group that they are part of; “special needs community” is a common expression now, and it wasn’t twenty years ago. When you’re talking about children, “special needs” refers to disorders of individuals; when it is used to talk about the elderly or anyone else, it normally encompasses conditions common to most members of the group.

That distinction is interesting, and to see why we’ll have to go back to the noun-adjective construction, which has been available for a long time. Kids generally do not claim special-needs status for themselves; there are plenty of people anxious to claim it for them. But other kinds of special needs are advertised by the group they belong to. Take a phrase like “special needs of the oil industry.” In 1975, this phrase could easily have been used (in fact, it was) not to emphasize the burdens fossil-fuel barons labored under, but the privileges that their circumstances entitled them to. It was the sort of thing a lobbyist or legislator might remark upon just before pushing through a big tax break. You didn’t have to be underprivileged (does anyone use that word any more? — it was all the rage back in the seventies) to have special needs. And you don’t now. But we are much more inclined to hear it that way thanks to the last thirty years’ worth of education policy. Before 1985 or so, “special needs” meant “I’m better than you” rather than “I’m worse off than you.”

What does “special” mean, anyway? When it doesn’t mean “specific or distinct” (as it did in the Middle Ages and the Renaissance) or “extraordinary,” as it did then and still does, it means “unique,” a much more recent definition dinned into us by pop psychology. When I was a kid, this use of “special” was common, but it had grown up only in the previous couple of decades. “I am special” came to mean “I am unique,” with the corollary that uniqueness entitled you to respect. It was a word used by eager kindergarten teachers to reassure children that they were valued. “Special needs” doesn’t rely on that definition, though there is a clear echo in parents’ insistence that each special needs child is unique (and adorable, and so forth). But lots of kids may have the same, or very similar, maladies, so that they can be grouped together for purposes of education or therapy. “Special needs” doesn’t have to refer to extreme or bizarre conditions; almost any kid with a problem may qualify if their parents are persistent enough, and some of ’em are, because special needs is where the money is.

The phrase seems more like a euphemism than anything else, a way of coating disabilities — mild or severe — with kindergarten cheer. Language so used is ripe for parody, and “special,” which for centuries had a generally favorable connotation, has become an insult. Uttered with a smirk, it means “substandard,” and every kid knows it, just as they understand that students with special needs have something wrong with them. Yet the expression has hung onto a palliative quality in spite of all the currents running the other way.


fashion statement

(1980’s | journalese | “statement”)

A bit of a cheat by the standards of my avowed chronological limits, but only a bit: “fashion statement” arose in the 1970’s and became available for use outside the industry in the 1980’s. Google Books shows effectively no instances before 1970, when it started to creep into fashion journalism. By 1985 it could turn up anywhere in the entertainment press, from sports to theater reviews, and even in political reporting. “Fashion model” and “fashion sense” are much older, “fashion plate” is older still; any of them might have provided a model for the new coinage. “Fashion police” and “fashionista” came along later.

The phrase can mean a lot of things. As of 2015, it applies loosely to any wearing of clothes or accessories to get any sort of attention. Like a muumuu, the vagueness conceals many meanings. Let’s try a few on:

- announcement of a new line or even a trend, normally at a major show, but only by means of the clothes themselves (a designer’s description of her new line at a press conference would not be considered a fashion statement)

- declaration of allegiance to a particular designer or trend

In these two cases, the statement is delivered by the clothes themselves, and it centers on a designer or trend. But fashion statements may say more about the person making them:

- using clothes and accessories to show that you are independent of the current mode, or have an interesting variation on it

or, more broadly,

- any expression of one’s character, preferences, passions, etc., etc. through the medium of apparel. I’m not sure if wearing an Aeropostale shirt counts as a fashion statement. More loosely still, the phrase means

- doing or wearing something because it’s chic

- drawing attention to oneself by means of what one is wearing

But references to the world beyond the runway are possible, too. Fashion statements may take aim at a social or political issue (as in students wearing Confederate flag t-shirts, or showing solidarity with gay peers by wearing denim).

It doesn’t even have to be something wearable: I came across an article in the Oberlin Review (April 3, 2015) about “decorative beards,” which are adorned with flowers, miniature Christmas lights, and who knows what. More than one student used “fashion statement” to talk about the new phenomenon. True, the donning of a three-dimensional object is still required to trigger use of the expression, but one wonders how long before beards or tattoos become potent fashion statements in themselves.

What you really have to watch with this expression is who (or what) makes the statement. It may be a designer, stunning this year’s audience with sheer audacity. It may be you or I, or it may just be the clothes. Who you are and what you wear may blend seamlessly, with your garments reflecting, nay, expounding your inner self through your carefully chosen wardrobe. Or you make your wardrobe as discordant or opinionated as possible in order to provoke reactions from bystanders. The gregarious looseness of this expression — abetted by the word “statement,” more general than declaration, announcement, or testimony — lets it cover such a broad range.

Fashion is often derided as superficial and trivial, but fashion statements, even light-hearted ones, are rarely dismissed out of hand. They are influential, or at least have the potential to be, and the power of a designer to inspire imitation through bold novelties remains considerable. Frankly, I would have expected the phrase to have taken on a negative cast over time, like “hipster” or “comfort zone.” No such derogatory usage has ever become the norm.


be careful out there

(1980’s | journalese (television) | “watch yourself,” “stay alert,” “pay attention”)

Now here is a phrase brought to us by television — or at least propelled into our vocabulary by television. In its full form, “Let’s roll, and hey, let’s be careful out there,” it was delivered near the beginning of each episode of Hill Street Blues by Sgt. Esterhaus, played by Michael Conrad. The drama debuted in 1981, and it seems to have been one of those rare instances of an offering of the popular arts that survived on sheer critical acclaim for a long time before it found a loyal audience. There was a lag of a year or two before the phrase began to appear regularly in the press, but its upward progress was swift. By the time President Reagan used it in May 1983, reporters cited Hill Street Blues knowingly, and there was no doubt about what had made it a household word. Of course the phrase, at least in its condensed form, is not catchy, an utterance utterly ordinary semantically and syntactically and a poor candidate for a cliché, yet it has gone from tag line to stock phrase. People who use the expression today may not know they are quoting Hill Street Blues, but they know they are quoting something. According to the New York Times (June 8, 1986), the writers of Hill Street Blues probably adapted the expression from The Police Tapes, a series of cinema vérité documentaries on police work shot in the South Bronx, where the sergeant ended roll call with a similar injunction.

The expression is as self-explanatory as any, I suppose, but it has one distinctive feature: as far as I can tell it is rarely used jocularly. Telling a person or group to be careful out there is not to be taken trivially. You say it when there is genuine danger, whether physical or financial. We have become more preoccupied with safety and security in the last thirty years, which may account partly for the spread of the expression.

There was a time when cop shows were a fertile source of catch phrases. “Who loves ya, baby?” (Kojak), “Book ’em, Danno” (Hawaii Five-O), “I pity the fool” (The A-Team). (The granddaddy of them all, “Just the facts, ma’am” from Dragnet, never appeared on the show in any of its incarnations, according to multiple on-line sources, although Friday did say “All we want are the facts, ma’am” once.) That era appears to have ended in the mid-eighties. Maybe I’ve missed something, but cop shows of the last thirty years or so don’t seem to have spawned any linguistic fads. Did Homicide or The Wire lend any expressions to the language? Miami Vice? NYPD Blue? Cops? I wouldn’t know, but I did come across several lists of cop-show catch phrases on the web, and none of them had anything later than Hill Street Blues or The A-Team. If this is so, can anyone explain why? Faithful readers?

I don’t think I ever watched Hill Street Blues back when it was new, but I watched an episode (o.k., half an episode) on Hulu to hear Sgt. Esterhaus for myself. More or less at random, I chose the first episode from the third season (1983), in which a nun has been raped and murdered and a man has gotten his head wedged immovably between a filthy toilet bowl and the wall. (That’s only two of the story lines, and the others were also pretty lurid.) The intent seemed to be to extort the most raw and violent emotional response from viewers by assaulting them at every turn. If we do not respond viscerally, the producers have failed the advertisers. I had the same sensations about ten years ago, when I watched Desperate Housewives and Grey’s Anatomy back-to-back one fraught night. In each program, all of the half-dozen or so plot lines were grotesque, or nauseating, or perverse to the point of absurdity, almost as if the writers were challenging each other to make each plot twist more appalling than the last. I watch almost no television drama, and maybe I just have a knack for tuning in on the most excessive evenings. But if this sort of bombardment, or anything like it, is the norm, I don’t see how regular viewers can be anything other than numb. After weeks and years of this, how can pity and terror make themselves known? How can stories pushed far beyond anything like everyday experience — even the everyday experience of cops and emergency room doctors — tell us anything?



startup

(1990’s | businese, computerese | “new business, firm, etc.”)

Hyphenated or not, this expression was well established as both a noun and an adjective by 1975, particularly in the business press, and it doesn’t appear to be much older than that. Google’s n-gram viewer finds almost no instances before 1950, and the curve doesn’t start to rise sharply until 1970, so it was fairly new then, but easy to use and soon absorbed. When it was a noun, it meant “commencement of operations,” or more colloquially, “opening” or “launch.” Normally it went with heavy industry, so it was common to talk of the startup of a plant or pipeline, for example. But businessmen love to scoff at grammar distinctions — there’s no denying startups invariably entail startup costs, a startup period, or, heaven forbid, startup problems — so they converted it effortlessly into an adjective. “Startup” may also clock in as a verb, but in that part of speech it is usually two words, even today.

By 1990, the concept of a “start[-]up company” had emerged, and occasionally the noun disappeared, leaving “startup” on its own. That wasn’t normal then, but today it is the rule. Back in the eighties, the shift from galumphing old factories to nimble new firms that didn’t make anything three-dimensional was driven by a hostile takeover of American life by the personal computer, a fait accompli by 1995. So many new companies concerned themselves with computer hardware and software that “startup” became common in computer publications by the late eighties. The word is older, but the way we use it today was probably driven by increasing computer sales, and computerese became the funnel for a businese expression — no surprise there. Michael Dell (of Dell Computers) was quoted recently on the “startup ecosystem” in India, and he even spoke of “meeting” (without “with”) several startups, not a use of “meet” I’ve encountered before. Since I haven’t actually offered a definition, here’s one I encountered on a German web site that does the job pretty well: “Startups sind Jungunternehmen mit besonderen Ideen – sehr oft im digitalen Bereich.” (Roughly, “Startups are new enterprises with unusual ideas, most often in the computer sector.”)

My sense is that “startup” had primarily a favorable connotation when it was getting established between 1985 and 1995. Such budding concerns were generally pegged as plucky or scrappy, determined pioneers taking on long odds with heads held high and a sound business plan. In that respect, it was more or less the opposite of “upstart,” which was always uncomplimentary. But as the term has lost novelty, it may have lost its sheen. Anyway, I don’t have the sense any more that it is complimentary. It seems more neutral than anything else.

The key related concept is the entrepreneur, always a figure celebrated in American mythology. Entrepreneurs breed startups, or shed them, or bring them forth from their heads, like Zeus giving birth to Athena. The crashes and recessions that have become frequent since the Nixon years may have dampened the spirits of some of these go-getters who start their own companies, but their flame burns bright as ever in our official worship of business. Entrepreneurs take the initiative, do their homework, embody healthy risk-taking, create jobs and prosperity, and otherwise exemplify the American way. Entrepreneurs are lauded especially on the right, because entrepreneurism is all about me rather than all about us. (That’s an oversimplification, but I’ll stick with it.)

According to my calculations, this is the 300th expression I have written about, at greater or lesser length. (I have become more loquacious over time, not less. Brevity is the soul of wit, indeed.) I chose “startup” as anniversary fodder partly because no operation was ever more shoestring or quixotic than this blog. I say thank you to my readers, to the ones who landed here once off of Google and never came back as well as the ones who read every post and comment faithfully. (You know who you are, and there ain’t very many of you.) I don’t do enough to encourage comments and feedback, but at least here I will say, if you ever feel an impulse to fire off a reply to one of my posts, or to send me an e-mail at usagemaven at verizon dot net, do it. Even if I don’t answer, I am grateful that you took the time, and I will profit from your wise words.



binge-watching

(2010’s | “overindulging,” “spending too much time in front of the TV”)

A binge has always had something disreputable about it, and the mixture of pride and shame with which binge-watchers confess their latest debauchery proves that it still does — it’s been but a year since the Washington Post declared binge-watching socially acceptable. A word that goes back to the nineteenth century, “binge” means the same thing as “spree.” A prolonged drunk, spending too much money in a short period of time, that sort of thing. It always meant excess. People started talking about “binge eating” and “binge drinking” in the seventies and eighties, probably the first time “binge” was used as an adjective in any widespread way. There was a rough equivalent to binge-watching in my youth, but we named the actor rather than the activity: couch potato (still in use, though it need not have anything to do with television any more). Couch potatoes’ preferred verb was “view,” anyway. Some people do say “binge-viewing,” though it is less common, at least in the States.

What is this thing called “binge-watching”? One psychologist notes that all it really means is “spending a longer time than normal watching television. . . . Netflix conducted a survey in 2014 where viewers defined binge watching as viewing between two to six episodes of a show in one sitting.” The phrase does conjure up red-eyed, addled viewers losing entire weekends to the new season of their favorite Netflix series, but does prolonged viewing become “binge-watching” only when it is obviously harmful? According to my limited research, the consensus answer is no; “binge-watching” may just denote a harmless way to spend a few stray hours. But the dubious heritage of the word “binge” will make that innocuousness hard to keep up.

The earliest unmistakable instance of “binge watching” in LexisNexis comes from Australia in 2006, and it trickled into American English shortly thereafter. Before the advent of home video recording, such a thing wasn’t really possible, and it didn’t become feasible until the practice of issuing entire seasons of television programs on DVD became prevalent — archaic as that seems in the days of Netflix and Hulu and lots of hipper streaming services I’ve never heard of. In my younger days, a complete retrospective of a certain director’s films, say, might have been called a marathon, or a festival, or maybe just a complete retrospective. (You come across expressions like “Game of Thrones marathon” even today.) In the nineties, it was possible to buy complete runs of at least a few television series on VHS, but the term did not arise then. So maybe this is a millennial thing: the idea that watching hours and hours of your favorite show, and dropping everything to do it, is a worthy activity. Not that you have to be a millennial. And now, new series must be written with an eye to the preferences of binge-watchers.

When I was in college, “Wheel of Fortune” turned the 1968 song “I’m a Girl Watcher” into an advertisement for itself. Then “Baywatch” was all the rage. The act of watching seems to have become linked ever more suffocatingly with television in the seventy years we have been groveling before the tube — I guess we have to call it “the screen” now, since there’s no tube any more, unless your television set is as old as mine. After “binge-watching” settled into our vocabulary, “hate-watching” arrived as well, meaning simply “binge-watching a show you hate,” with the implication that it’s the sort of show you love to hate, at least according to one writer. Perhaps inevitably, “purge watching” has sprung up, meaning “hate-watching” with less passion, more out of a desire to get the offending show over with than to enjoy noting how awful it is. Who knows what other “watch”-words will come?



fraudster

(1990’s | businese (finance) | “con man,” “crook,” “trickster”)

As far as I can tell, we owe this one to the Brits, or maybe their ex-colonials. There are very few LexisNexis results from U.S. sources before 1995; nearly all come from Great Britain and the recent colonies. We’re certainly not above borrowing Briticisms and their cousins (“at the end of the day,” “over the moon,” “selfie”) in these parts, and the venerable “-ster” suffix (see below) rolls off the tongue easily in America, the land of gangsters and mobsters.

This word probably is not necessary in either hemisphere, but it does have the advantage of incorporating the very act its embodiments practice. The old equivalents (see above) do not. “Con[fidence] man” is itself a bit deceptive; it really refers to someone you should not have confidence in, and he has to convince you to do so. “Crook” is more general than “fraudster,” but the two words line up pretty well, so it’s rarely jarring to substitute one for the other. A fraudster sees an opening and uses it dishonestly for personal advantage. The word seems indifferent to distinctions like preying on individuals vs. cheating corporate bodies, or large-scale vs. small-scale crime. As the OED (first citation: 1975) notes, “fraudster” denotes in particular one who lies or cheats in the course of a business transaction. But it is not restricted to such use and has already spread out.

I cannot help but wonder if the rise of this expression since the seventies does not result from a simple (or rather geometric) increase in chicanery. It may be that an ever more complicated and less regulated financial system, coupled with increased criminal activity enabled by widespread use of computers, has made it ever easier to pull scams, causing a new expression to erupt, a boon to harassed writers if no one else. It’s such a relief to have a new, yet easily grasped, synonym to haul out once in a while.

The “-ster” suffix repays study. My feeling is that it has had a bit of an underbelly for centuries, but any negative connotation probably became more pronounced in the twentieth century, at least in the U.S. “Gangster” and “mobster” both date back to somewhere around 1900, according to Random House; I suspect that “gangster” relies on “teamster,” an eighteenth-century expression that did not, as far as I know, develop a dark side until the twentieth, when the Teamsters’ Union for a time became synonymous with corruption. (Tapster, tipster, and trickster, all dubious trades, are much older words. Speaking of deplorable occupations, where do “barrister” and “monster” fit into all this?) The suffix doesn’t always have a negative connotation, even today; when connected to a name, it may be affectionate. For example, Tom Bergeron used to call Whoopi Goldberg “Whoopster” on Hollywood Squares. Then again, after fifty years as a compliment, “hipster” finally became a dirty word somewhere around 2000. I’m not enough of a linguist to offer a proper history of the suffix, but “baxter” (female baker) and “brewster” (female brewer) are very old. According to Chambers Etymological Dictionary (thanks, Liz!), “-ster” comes from Anglo-Saxon, where it denoted specifically a female practitioner, but well before the Elizabethan era the gender distinction had disappeared. Chambers also notes that the suffix, originally attached to verbs (bake, brew), as befits an equivalent of the “-er” suffix, now hooks more readily to nouns. It has gone on yoking itself to new words for centuries now, and it usually seems to have something shady or untrustworthy about it. “Fraudster” thus takes its place in a long, rich tradition.


push poll

(1990’s | journalese (politics) | “dirty trick”)

Push polls originated in political campaigns. They are presented as impartial, conducted by an organization not technically connected with a candidate (although if a respondent pushes back hard enough, the caller may be forced to reveal his true employer). The point is not neutral assessment of public opinion, but rather influencing it directly, often by making a dubious, if not outright false, imputation about an opponent — sometimes the mechanism relies on laudatory comments about oneself, but the intent is the same. It usually takes the form, “If you knew so-and-so about Candidate X, would it make you more or less likely to vote for her?” The phrase established itself in the mid-1990’s, with the first hit in LexisNexis due to David Broder in October 1994. By the 1996 election, the expression had common currency in political reporting, and many commentators no longer bothered to define it, which had been the rule only a few months earlier.

There’s nothing new about libeling political opponents, but the problem is the means. To be effective over the long term, opinion polls have to be fair, designed to avoid favoring one group over another. A survey has the presumption of fairness; therefore, it’s worse to use a poll to perpetrate slander than to use other means. The main point, as stated by Matthew Reichbach in the New Mexico Independent (September 22, 2009): “a push poll is not a poll at all. Its a fraud, an attempt to disseminate information under the guise of a legitimate survey.” It is a fraud in that it presents itself in a misleading way, but the “information” conveyed may be fraudulent, too. Reputable pollsters hate them and are forever calling for an end to push polls.

In 1996, the derivation of “push poll” was generally explained as a simple elaboration on the act of pushing voters away from a particular candidate. That’s folk etymology, but there is probably some underlying truth. The term comes out of pollsters’ jargon, by evolution or corruption. “Push poll” is actually a descendant of “push question,” described in 1982 by William Safire as “designed to squeeze respondents to come up with the answer the sponsors want.” It’s a variety of loaded question native to the survey business, and not necessarily unethical, although it does lend itself to unethical use.

A push poll may contain more than one push question, heaven knows. In 2001, I was on the receiving end of a Bloomberg-for-mayor push poll which consisted almost entirely of pro-Bloomberg statements masquerading as questions. I finally said something like “Aw, c’mon,” and the (apparently young and definitely inexperienced) questioner agreed that the bias was pretty obvious. I kept going, answering each question gamely in the most anti-Bloomberg manner I could muster, but the whole thing was a farce. Did Bloomberg pay for it? Who knows? Yes or no, the whole process showed nothing but contempt for our intelligence.

And now, the scope of push polls has broadened, and you don’t have to be a politician any more. Anyone trying to influence public opinion — a corporation (Walmart seeking to open a store in Chicago), a social movement (a group trying to promote, or scuttle, gun-control legislation), etc. — can initiate them. I’m pleased to note that the phrase still carries strong opprobrium, and it is thrown around in grim accusation or indignant denial, never in approbation. It may be true that one man’s push poll is another man’s opposition research, and political professionals once defended push questions as legitimate, if they raised a verifiable point about the rival candidate. No one defends them any more, at least not in public.


bucket list

(2000’s | journalese)

I’m not real up on popular culture, despite rubbing elbows with teenagers with modest regularity, so I will go ahead and explain this term for those who share my plight. “Bucket list” was popularized by a movie of the same title released in 2007. It refers to the goals you aim to achieve before you die — that is, kick the bucket. (“Kick the bucket” goes back to the eighteenth century, and its origins remain uncertain.) The film received plenty of publicity, starring as it did Morgan Freeman and Jack Nicholson, and the phrase soon took root in the popular lexicon. According to one history of the term, it was first used in today’s sense in 2004, but the citation isn’t entirely convincing.

Not that the phrase was invented then; it had a previous life in engineering and computer science. My father, an electrical engineer, remembers using the phrase to denote instructions for placing integrated circuits on circuit boards to build this or that device. In computer science, a bucket is a storage place or buffer for disparate pieces of data that have a common feature important enough to warrant grouping them together. When you record the names and locations of the different buckets, presto! a bucket list. The term was also used by archeologists to mean an inventory of artifacts removed from a dig (also “basket list”), a surprisingly intuitive usage. “Bucket list” could also refer to a group of items set aside in a negotiation, for example — points to be considered later, or in a different phase of the discussion. None of these definitions was ordinary or widely understood. Now they never will be. The first three were strictly technical terms, and the fourth never took hold. Because it was unfamiliar to most of us, when the hype for the movie ramped up in 2006, reporters felt compelled to explain the phrase and its derivation. When the movie was released at Christmastime in 2007, reviewers followed suit. By 2008, most of us knew what it meant and why.
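For readers who want the computer-science sense made concrete, here is a minimal sketch in Python. It is purely illustrative and not drawn from any of the sources above; the records, field names, and grouping key are invented, and a real system would record where each bucket lives rather than what it contains.

    # A toy illustration (hypothetical data): group records into "buckets"
    # by a shared feature, then build the "bucket list," an inventory of
    # each bucket and what it holds.
    from collections import defaultdict

    records = [
        {"word": "startup", "decade": "1990s"},
        {"word": "fraudster", "decade": "1990s"},
        {"word": "bucket list", "decade": "2000s"},
        {"word": "binge-watching", "decade": "2010s"},
    ]

    buckets = defaultdict(list)
    for record in records:
        # Records sharing the same feature land in the same bucket.
        buckets[record["decade"]].append(record["word"])

    # The "bucket list": a record of every bucket and its contents.
    bucket_list = sorted(buckets.items())
    for decade, words in bucket_list:
        print(decade, "->", ", ".join(words))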

What we didn’t know was how. My research was not exhaustive, but I couldn’t find any sign that any of the two million or so reporters who covered the film ever thought to ask any of its creators how they arrived at “bucket list.” The story and screenplay were due to Justin Zackham, and Rob Reiner directed. There’s no reason to think Zackham doesn’t deserve credit for the coinage (or repurposing, as the kids today say), but did anyone ever ask him to expound on his language-changing idea? Here’s a new word that everyone is using all of a sudden, and it has an unusually unconvoluted path into our vocabulary. Not only that, there were only one or two people that had a plausible claim to originating it. How come no one asked them about it? Gee, Mr. Zackham, where did you come up with “bucket list”? For at least a few years before 2007, you can find citations of “life list” used to mean exactly the same thing, a term from birdwatching that used to denote a catalogue of every variety of bird one has sighted (I confess I don’t recall ever hearing it). “Wish list,” though much more common, is much less specific, lacking the urgency lent by death’s door. Other than that, I don’t know of another word for the phenomenon, old or new. So whoever came up with it did us all a favor.

There’s a tendency to believe that many new expressions come from movies or television, but in my experience it’s rare to find one that is both invented (or at least given what appears to be an entirely new definition) and popularized by a single film. Most new or newish expressions popularized by movies were definitely in use before the film came out. “Bucket list” may have been, but the evidence is very sparse and unconvincing. Other examples of film-borne expressions: “wingman,” “don’t go there,” “you’re toast,” “meltdown,” “perfect storm.” All had been sighted before appearing in the film that made them famous.

Thanks to my sister for nominating “bucket list” this week, and to Dad for pitching in with some old IEEE lore. The family that blogs together slogs together.



blowback

(1990’s | militarese | “chickens coming home to roost,” “fallout,” “consequences”)

So you think you know what this word means? Actually, it means several different things, so chances are you’re right.

A century ago, “blowback” had mainly to do with guns and ammunition. It still does, although the original meaning doesn’t show up much any more — fire or explosion caused by the breech of an artillery gun opening at the wrong moment, allowing flames and explosive gases to go out the back end of the gun rather than the front end. (That’s the best I can do, even though my uncle was a gunsmith.) I found several descriptions and definitions of the phenomenon, which are broadly similar but differ in significant details, and I don’t know enough about how guns work to make sense of it all. Anyway, this sort of blowback is dangerous and can cause death, among other things. Nowadays when “blowback” is used of a firearm, it almost always refers to a means of loading cartridges in a semi-automatic pistol or rifle. (See impassioned technical explanation here.) The idea seems to be that some of the gases under pressure generated in firing a bullet are directed toward pushing the next shell into the chamber, causing the weapon to reload automatically, until it jams.

Both of these gun-related usages are found in Webster’s Third and were available before the foreign-policy/CIA-type use of the word we are more likely to think of today. But even in this narrow field of definition, “blowback” has undergone a decided change. Around the time of the Church Committee hearings on CIA misdeeds in the mid-1970’s, this term began to creep into the press, meaning disinformation. More specifically, “blowback” was defined as a fake news story prepared by one of our intelligence agencies for dissemination abroad that later was reported as fact by the U.S. press. I think it’s still notionally true that the CIA is not supposed to do its dirty work within the borders of the U.S., although of course it does and always has. But politicians still considered it worthwhile in those days to object to Americans being subjected to lies intended for foreigners. (They had no comparable objections to lies intended expressly for domestic consumption.) Christopher Simpson’s book “Blowback: America’s Recruitment of Nazis and its Effects on the Cold War” (1988) had to do with “unexpected and undesired domestic effects of foreign covert actions,” according to one reviewer. Such a definition is vague enough to encompass the meaning limned above and the more specific meaning in use by the early 1990’s: attacks on U.S. people or facilities inspired by previous U.S. operations, covert or overt. (Or maybe simply in response to an unofficial provocation, like encouraging people to draw cartoons of the Prophet Mohammed.) Here is a definition-by-example offered by Charles G. Cogan, the former C.I.A. operations chief for the Near East and South Asia, in the wake of the first World Trade Center bombing in 1993: “The hypothesis that the mujahedeen would come to the United States and commit terrorist actions did not enter into our universe of thinking [during the covert war against the Soviets in Afghanistan].” Another book titled “Blowback,” published in 2000 by Chalmers Johnson, probably gave this meaning of the term a boost, though it was already current.

Cogan went on to use the expression “unintended consequence,” a favorite of bureaucrats caught with their pants down. “Blowback,” while it has a fairly clear definition, is also a bit slippery and seems more comfortable as part of a constellation rather than standing bravely on its own. Let’s not even consider the cases in which “blowback” substitutes for “pushback,” a usage that has become common. “Blowback” now is often used to mean “adverse reaction from almost anyone,” not just aggrieved foreigners. But let’s think instead about some related expressions. An obvious one is “blow up in one’s face,” not a precise synonym but clearly in the neighborhood. “Payback.” Definitely an element of vengeance in “blowback” as we use it today. “Karma,” which may apply to corporate bodies (though, as of 2015, not generally the government), also comes to mind. There’s a little jungle of overlapping expressions here, none of which means the same thing as any other but all of which call the others quickly to mind.

I can’t resist closing with an instance of “blowback” from U.S. Navy regulations promulgated in 1913: “The danger of a broken firing pin point or on the fusing of metal on the face of the breech-block, due to a primer blowback, shall be constantly borne in mind and guarded against.” Isn’t that great? They don’t write ’em like that any more.


alpha male

(1980’s | academese | “leader of the pack,” “take-charge guy,” “macho man,” “dominant male”)

This expression takes advantage of the fact that we are animals and there is something very satisfying about showing direct analogies between human and animal behavior. “Alpha male,” along with “alpha female,” goes back at least to the sixties (the thirties, says William Safire), used first to talk about pack animals, especially wolves and primates. Explanations of social organization generally centered on the top dog (or whatever), who made all the decisions, got the females he wanted, and scared his inferiors into submission. The typical alpha male had won his place by defeating, perhaps even killing, the previous alpha male; in those days, it was understood purely as a matter of physical domination. The phrase seems to have been applied to humans first in the eighties, generally meaning some combination of “leader,” “the one who gives orders,” and “the one who gets his way.” Sometimes brawn and aggressiveness alone defined the human alpha male, but more often it was a matter of wielding power over others through sexual attractiveness, overweening wealth, political clout. Not infrequently the phrase is used as a straight synonym for a man who has a lot of sex with a lot of women. In the nineties, the phrase was used sometimes of Bill Clinton, apparently reflecting both his executive primacy and his prowess. In 1999, Al Gore hired Naomi Wolf as an advisor, whose role was widely reported at the time as teaching Gore to be an “alpha male,” though Wolf denied that’s what she was actually doing. Anyway, use of the phrase went up sharply in 1999, according to LexisNexis, and that increase appears to be permanent.

All these meanings remain in play today. I even found a nice new one, courtesy of a senior editor at Harlequin Romances: “Werewolf and vampire heroes are examples of the alpha male, strong and protective.” I assure you that in the old days, no one ever called an alpha male “protective.” But the term has also acquired a negative tinge, or at least the possibility of one. Two examples from 2009: sportswriter Francis X. Clines of the New York Times referred to obnoxious football fans as “alpha male bellowers.” Professor Robert Sapolsky alluded to “‘totally insane son of a bitch'” types, the sort of alpha males “who respond to stress by lashing out.” These are not just admissions that alpha male behavior might alienate people now and then; they are twists on the term that provide a new field of connotation. The idea that an alpha male exerted anything less than total authority in his field, or had anything to apologize for, was almost unknown as late as 2000 — it was nearly always a term of admiration or envy. Urban Dictionary offers several examples of sardonic or derogatory definitions of the term, though in fairness, most of them have not been treated well by voters. “Alpha male” may be developing the same double life as “type-A personality” (or “control freak“), which might be used as a compliment but generally is not. As beta males conspire to get their slow revenge on the alphas, more such heretical definitions may creep into the language. Among humans as among animals, a group of lesser men acting in concert can bring down the most potent head man. Julius Caesar went from “he doth bestride the narrow world like a Colossus” to “Then fall, Caesar!” in two short acts.

If the expression continues to take on darker meanings, it will mirror the decline in primatology and other disciplines of the whole notion of alpha males lording it over their enclaves. Frans de Waal and L. David Mech, among others, have moved away from descriptions of social organization dependent on such rigid hierarchies. The very concept of the “alpha male” has little to do with the politics of group behavior among animals and crudely oversimplifies the ways they organize themselves. The idea probably was born more of the predilections of mid-century researchers, and a general urge to find easy explanations of complicated phenomena, than actual observations of wild animals. (In fact, many early studies used captive animals, who behave much differently from their counterparts in the wild.) It may well prove that the alpha male today, like the social Darwinist a century earlier, is no more than a pseudo-natural mandate for the most selfish and sociopathic among us to justify their promiscuous, arrogant, or exploitative desires. For now, “alpha male” still retains much of its old shine, but that may change in the next ten or twenty years.


