
Lex maniac

Investigating changes in American English vocabulary over the last 40 years


microaggression

(2010’s | therapese? academese? | “little thing,” “insult,” “slight,” “dig”)

Now that Jim Crow is no longer legal (not that it has disappeared), we are left with microaggressions: words or actions directed at members of a minority group that appeal to negative stereotypes, intentionally or not. They do not violate any law, sometimes not even social convention, and in some cases the oppressed person can’t even explain why he is offended. But they can have a powerful cumulative effect, causing people to feel as degraded as their forebears felt under more immediately threatening conditions. To such victims, the microaggression is only a more subtle means of keeping women, African-Americans, Latinos, gays and lesbians, Jews, the homeless, trans people, et al. in their places. It’s not just white men who commit microaggressions, though we do it more than anyone else, partly because we have the biggest pool of people to commit them against. But pecking orders are observed here as elsewhere, and each group looks for another group to feel superior to. In U.S. culture, everybody gets to pick on African-Americans, but African-Americans get to pick on LGBTQ people. Men lord it over women; the sharp mulct the dull. There must always be a way to define yourself such that there exists a class lower than you. As long as we seek such imbalances of power, we will have fertile fields for microaggressions, among other things.

Many sources attribute the coinage, ca. 1970, to Chester Pierce, an African-American professor of psychiatry at Harvard. The New York Times also pointed to a 2007 article by Professor Derald Sue that pushed the term out of the academic ghetto into wider use. (I certainly don’t recall hearing it before then.) To this day, the word is used far more often at universities than anywhere else. We have a lot of “micro” words now: microfiber, microloan, microblogging. “Microcephaly” has reared its ugly head recently thanks to the Zika virus. Two more examples sometimes seen near “microaggression” are “microinequality” and “microinequity.” I can’t help but hear an echo of the medical term “microabrasion,” which has little semantic connection but a strong phonological one. The word “aggression” does get people riled up, but the reason “microaggression,” despite its technical, academic sound, has some punch and poignancy is that such acts occur only when the aggressor and the aggressee are in direct contact, normally in a public place; they cannot be committed remotely, except by telephone, and even there two people are engaging each other. Personal interaction is required.

Microaggressions have emerged as the latest fodder in an old debate: Are the oppressed overreacting to unexceptionable behavior, or are the oppressors using any means available to remind everyone else who the boss group really is? The more fundamental question — who gets to decide? — may be shunted aside. Straight, well-off white people are quick to suggest that microaggressions are symptoms of hypersensitivity or political correctness, a means to make us feel guilty even after we’ve made the reforms we were asked to make (well, most of us). But SWOW’s likewise dismissed much more brutal and intimidating means of subjection, from segregation of public amenities to lynching. You know, “They don’t have it so bad. Look at all the nice things we do for those people.” Not much comfort when you’re hauled off to jail for sitting in the wrong place or killed for an imagined offense against some white man’s code of honor. That old feeling of domination, whether or not backed up by formal legal sanction, counted for a lot. Treating as equals those you have been discriminating against for generations is a hard pill to swallow, and lots of people are tired of trying. It’s easier to say, “Wait a minute. I’m a victim, too!”

The rise of the microaggression may be taken optimistically: Except in a few extreme cases, physical and economic violence have gone out of the practice of racism, etc., leaving only petty snubs and well-meant gaucheries, which do much less real damage and will in turn become unacceptable in another generation or two. Or pessimistically: There’s no end to it. We get rid of one layer of abuses, and there’s another below that, and another below that. Microaggressions definitely damage some individuals, and that will ultimately hurt the larger society. My two cents: I haven’t thought this through, and it may be untrue, but it seems to me that if a half-concealed sneer can cause significant harm, then small kindnesses may also have an effect greater than their magnitude. It would be awfully nice to think so.



power nap

(1990’s | teenagese? therapese? | “siesta,” “catnap,” “forty winks”)

I grew familiar with this term as businese, through articles about frazzled employees needing a way to get back on track during the workday. That’s probably where you learned it, too, but the phrase more likely saw the light of day elsewhere. It was in use among college students in the late eighties, and still is, but it became much more familiar to the rest of us in the nineties when psychologists started pushing the benefits of resting and recharging at the office. The businese definition has largely won out, yet students even today may assign the phrase a slightly different meaning. Businesspeople use the term to mean a short period of sleep intended to increase alertness, vigor, and therefore productivity. Students use it that way, too, but it can also mean a period of deep sleep without any indication of duration. In 1988, New York Times columnist Richard Bernstein defined it as “deep sleep induced by extreme exhaustion,” and cited it as an example of college slang. That sense has not disappeared completely, though it has been largely eclipsed.

The reason it sounds like businese is that it goes with “power lunch” and “power tie,” which became clichés in the eighties, when the cult of the world-bestriding businessman, brought low for a couple of generations by the Great Depression, ramped up again. Flaunting was in, and executives took pride in asserting their prerogatives. In the early nineties, when psychologists like Dennis Shea, James Maas, and Bill Anthony began writing about the benefits of brief rest periods for white-collar workers, “power nap” made our vocabulary more productive and efficient. (I can’t resist: “Feeling logy at work? There’s a nap for that!”) But powerful people don’t generally sleep on the job if they want to stay that way, and a power nap wasn’t a way to project one’s own muscle (like a power tie) or extend one’s dominion (like a power lunch). The fit isn’t as neat as it sounds, more evidence that “power nap” was not native to businese.

In 1992, the Guardian, reporting on the U.S. military’s methods of keeping soldiers minding sensitive or complex equipment as sharp as possible, noted that those charged with such duties were instructed to rest regularly: “to avoid implications of sissiness, such rests are called ‘power naps.’” Another possible origin story for “power nap,” one I don’t find very convincing. There’s no doubt that our armed forces are a great source of euphemisms (collateral damage, anyone?), and it’s also true that there is a lot of stubborn machismo in the ranks. But even the Army must put aside long-cherished prejudices when science and experience team up to demand it. “Soldier, I order you to take a power nap before your next eighteen-hour shift!” “Yes, sir!”

No matter how many studies demonstrate that short rests during the workday improve employee performance, most bosses still view power naps as proof that workers aren’t serious about their jobs. I’m as prone as anyone to get sleepy after lunch, but I shudder to think of how my boss would react if he caught me in an actual doze. Your average boss just can’t get past that rock-bottom-line calculation: time spent sleeping is time spent not working, and you’re here to work, so sleeping on the job is dereliction, dress it up as you will. American bosses are not, on the whole, a very imaginative or innovative lot. The experts can talk till they’re blue in the face, but the boss knows what he knows. Power naps are for weaklings.



vape

(2010’s | hipsterese? teenagese?)

Primarily a verb, I would say, but available as a noun (short for “vaporizer” or for the practice of “vaping”) and as a modifier in fanciful store names (there’s one on 14th Street called Beyond Vape). One who vapes is a vaper, which may remind antiquarians of “viper,” a very old word for marijuana smoker. “Vape” was not entirely new when we first encountered it between 2005 and 2010 — 2009 is the first time it shows up in mainstream press sources, says LexisNexis — it had seen limited use before that as short for “vaporizer,” but that was before anyone thought of a vaporizer as a way to ingest nicotine or anything else. For that we had to wait until the early 2000’s, when a Chinese pharmacist invented the battery-powered nicotine delivery device, which heats liquid to form vapor rather than leaf to form smoke. It took a few years, but by 2010 electronic cigarettes had become noticeable. They looked suspiciously like cigarettes — and plenty of people were and remain suspicious — but they produced fumes that were far less dangerous, though probably not perfectly safe. A few short years later, vaping need have nothing to do with nicotine, and dispensers need not look like cigarettes, though the ever-popular vape pen retains the slim, cylindrical shape. It’s become an art and science and commerce all its own. Shops have sprung up everywhere, and vaporizers have supplanted hookahs as the hip smoking device. I see people vaping all the time now on the streets of New York. Professional worriers have stopped worrying about hookah lounges and started worrying about kids taking up vaping.

There are a number of associated terms, of course (and a legion of brands to match); if you want a chuckle, check out the alphabetical list of headwords on the right of Urban Dictionary’s “vape” page. I won’t try to go into all of them, but here’s one glossary (here’s another). The medium for the nicotine, flavoring, or whatever you put in your vaporizer is generally called “e-juice” or “e-liquid.” Another term for the device is “PV,” for “personal vaporizer.” Basic tools of the trade have been shortened to “atty” (atomizer), “cart” (cartridge) and “bat” (battery). A souped-up PV is called a “mod” (short for “modified”), which should not conjure up visions of the Mod Squad. A “clone” is a fake, basically, a knock-off or counterfeit. The sensation of a puff of vapor going down is called a “throat hit.” Regular old tobacco cigarettes are known as “analog cigarettes,” though there’s nothing digital about an e-cigarette; the association with e-mail and other computer-spawned e’s is fortuitous.

We are entitled to wonder why vaping became so popular so fast. Much is made of its role as an aid to giving up smoking, with accompanying debates over how safe it really is — debates that continue to rage, though most observers agree that e-cigarettes are less toxic than old-fashioned ones. It seems likely that many vapers took it up for that reason. Vaping is cool rather in the way that smoking used to be: not rebellious exactly, but a bit transgressive, a little dangerous, developing a subculture recognized by the general population. But there’s also the technological factor. Vaping is in because it has produced new gadgets and lots of opportunities to mess around with them. The engineer types like having things to play with, and the techno-buffs revel in the latest improvements. There’s also the rage for anything new that occupies a surprising number of our fellow citizens, which I have cited before as a powerful force behind new concepts and expressions in our discourse.



decompress

(1980’s | athletese | “unwind,” “relax,” “take it easy”)

This word first came to our attention primarily as a result of the Iran hostage crisis, or rather its end in January 1981. The hostages flew first to a U.S. base in Germany and stayed there for several days. The State Department discouraged family members from visiting them, because they needed time to “decompress.” The word had appeared before with a similar meaning, but now it showed up in all the major news outlets and was treated as a novelty. It was also used on occasion to talk about Vietnam veterans who had returned too quickly to civilian life.

Much older in the contexts of medicine, engineering, and particularly diving, “decompression” is extremely important to deep-sea divers, who must avoid the bends by returning to the surface very gradually, resting at certain depths along the way so their bodies can get accustomed to lower pressure. This use seems to be the direct ancestor, and it is definitely echoed in both the cases of ex-hostages and ex-soldiers. Moving from a high-pressure environment to less intense surroundings requires time to adjust; the more time taken, the more likely the transition will be smooth. In engineering and medicine, “decompress” meant simply “relieve pressure,” obviously a related usage, though normally transitive. (Why didn’t Jimi Hendrix do a song called “Manic Decompression”?) In computerese, “compress” was in use by the mid-eighties to denote making computer files more compact, or combining them, without deleting data, and “decompress” was its usual antonym; it can still be used that way, though my ear says that “extract” has become the most common term for restoring the files to their original size and configuration.

Soldiers in Vietnam and the hostages in Iran both went through terrible ordeals, and “decompress” was often used in such contexts in the eighties. Now we are more likely to talk about a vacation from work or a little R&R rather than recovery from prolonged physical and emotional strain. One can find instances of “decompress” even in the seventies referring to respite from much less arduous circumstances. Even so, my own feeling is that the word still bears some weight. If you need to decompress, you’ve been under significant stress — “stress” itself has spawned the verb “de-stress,” a competitor — and probably for some time. Or perhaps the average daily stress level (I propose a new statistic to the Labor Department: ADSL) has gone up in forty years to the point that a garden-variety vacation from the office seems tantamount to a break from captivity or jungle warfare. “Decompress” has been helped into prominence by its association with “stress,” not only by virtue of rhyme but by contiguity of sense as well.



relatable

(2010’s | teenagese | “engaging,” “relevant,” “familiar,” “accessible,” “personable”)

“Relatable” is one of those expressions thrown up by our younger contingent. (Other examples: “take a chill pill,” “peace out,” “sketchy,” “stoked,” and possibly “love handles” and “no pressure.” “Based off of,” “I know, right?,” and the “because + noun” construction have swept the under-18’s decisively in recent years.) Teachers periodically report new words and phrases bubbling up in the classroom; “relatable” had its moment somewhere around 2010 and has become widespread since. I certainly did not know the word in 2010, and probably not for three or four years after that. It’s tempting to blame such eruptions on social media, but consumable popular culture for teens has been omnipresent for decades and did not always require Instagram or Tumblr. Once the kids adopt an expression, it has a strong chance of entering the language, because the rest of us spend so much time talking about what they’re up to and what it bodes for the rest of us (ill, generally). Also because some day those kids are going to take over the world, or at least this corner of it.

The teenagers didn’t invent this one, mind you. “Relatable” was available in the early 1980’s, especially in writing on film and television; it meant roughly “agreeable” or “comfortable” — more accurately, “characteristic of something most Americans can identify with” — doubtless descended from “relate to” as used in the sixties. The new sense of the word has hung around ever since, so the teenagers of 2010 had had many opportunities to learn it. The old meaning, “capable of being told,” has grown rare, and we are left with the inescapable fact that “relatable story” means something much different from what it did fifty years ago.

Every teenage addition to our vocabulary calls forth a phalanx of teachers and professors to bewail it, and “relatable” has been written up in The New Yorker, Slate, and the Chronicle of Higher Education, among other places. (Ben Zimmer provided a non-judgmental history in the New York Times.) The good professors have a number of reasons for objecting to the term, all of them cogent and stoutly defended. Use of the word, they say, proves students self-centered, closed-minded, unwilling to try new things or broaden their horizons. But let’s not forget that the older generation always says as much about the younger, often with justice. It is true that most kids don’t want to do a lot of work to absorb their lessons, and therefore they prefer everyday language, stories, and characters they can understand without effort. But plenty of these same kids will grow up and open out, and it’s no use pretending that this is some unprecedented defect never encountered before millennials stuck a trembling toe into adulthood. Grousing about the rising generation is as old as civilization, at least.

“Relatable” doesn’t always mean likable. When used to talk about everyday situations, it is more likely to connote awkwardness or embarrassment than triumph. You can find collections of mottoes, truisms, and slice-of-life stories all over the web that advertise themselves as relatable. Maybe my sample size isn’t large enough, but I came away with the distinct impression that most of them have to do with unpleasant contretemps that we try to get past without humiliation. We are all supposed to sympathize and see ourselves in others’ tales of woe, or the nuggets of wisdom acquired from them. Any pleasure we take in such misfortunes is rueful. But we are also to take away the unstated conclusion that those who encounter the same predicaments or feel the same way about etiquette as we do make up the only world that matters. Our experience is universal, and everyone else’s? Well, we’ll make room for that around the edges, if we feel like it. “Relatable” is seductive to the extent that it assures us that our group is the center of the universe.

Thanks to that inspirational teacher and observer of the language, Lovely Liz from Queens, for pointing out that this expression needed an airing. I hope I pass.



my work here is done

(2000’s | businese? | “I’ve done what I set out to do,” “I’ve done what I can”)

Little wonder this phrase has become a popular meme in recent years; it can convey exactly the sort of self-satisfaction and superiority that express themselves so often and so malignantly on the internet. Of late it has become popular in right-wing circles as a means of bashing Obama — they never tire of it — as in this cartoon. It doesn’t have to be this way. The phrase can have a benevolent sound, the sort of thing Gandalf or Obi-Wan Kenobi might say, though as far as I know neither of them ever did. But juvenile irony is winning the day; now the expression goes readily with scenes of catastrophe and chaos.

Most on-line sources cite three possible origins for the catchphrase: the Lone Ranger, Mary Poppins, and Blazing Saddles. I’ve been able to verify only the last, not the other two. Some cite Errol Flynn in The Mark of Zorro (1940). “My work is done here” is a variant (used by Leonard Nimoy in the monorail episode of The Simpsons, for example); it means the same thing and has the same weight. This formulation is technically ambiguous, but the alternate meaning (I do my job in this place) does not obtrude. “My work here is done” began to appear sporadically in LexisNexis right after 1990. Then as now, it was a favorite of departing CEO’s who wish to convey the impression that they have completed their stint with honor and can safely hand responsibility to their successor, provided they get their severance package. Perhaps that’s when the phrase picked up its odor of smugness. Despite, or because of, the ironic turn, it still bears a hint of hipness and remains the property of college kids, middle-aged columnists, and corporate consultants alike.

But the question is not “When did this expression originate?,” because the phrase is not fixed, and any normally equipped English speaker could utter it in the course of conversation. It’s an ordinary English sentence, after all, and doesn’t require a mythical origin. True, it is a bit more elaborate than what you might call the ground-level expression, “My work is done,” which has an almost Biblical simplicity. The question is: when did it become the sort of thing people ask about in chat rooms and forums? According to LexisNexis, it started turning up regularly in the press not long after 2000. It might be used to end an article, or, conversely, as a blogger’s headline. As early as 2004, I found an example of the now familiar meme: “Chaos, panic, disorder — my work here is done.” (Google it and despair.) The phrase has always had a bias toward the smug, but now it has a healthy dose of snark as well, as we use it to crow about the mess we (or someone else) have made rather than acknowledge an edifying experience. The expression’s grandiloquence is real but easily subverted. The trend toward using it sarcastically continues and may win entirely in another ten or twenty years.



peace dividend

(1990’s | militarese, bureaucratese | “post-war boom”)

I was surprised to learn that “peace dividend” began to crop up as we were ending the Vietnam War by expanding it into the rest of Southeast Asia. According to the Congressional Quarterly, the phrase was born in 1968, as pressure mounted to end the war, and Nixon won the election partly on the strength of a promise to do so — neither the first nor the last of his brazen lies. It occurred to a number of people that we would save a lot of money if we weren’t garrisoning a huge army and manufacturing and destroying vast arsenals of weapons. If we were to spend that money, or part of it, on education, infrastructure, clean air and water, or other components of the much-maligned general welfare, it would be analogous to dividends from stocks and bonds (in fact, the analogy is very weak, but there’s no rule that says new expressions have to be plausible). It’s worth noting that the phrase does not refer to more general benefits conferred by the cessation of conflict; it nearly always has a purely economic cast. It didn’t really get popular until the Berlin Wall came down in 1989; for five or ten years after that, we heard a lot about the peace dividend that would arise from the end of the Cold War. Anybody seen it?

The phrase rarely takes on metaphorical uses, although soon-to-be former New York City Police commissioner Bill Bratton has used it to talk about improved community relations following a drop in stop-and-frisks and other small-time arrests. Another meaning did emerge briefly, though it never gained much traction, after the Camp David accords of 1978 between Israel and Egypt. One of the means we used to get the parties to agree was to promise lots of military aid to both sides, which inspired some commentators to talk of a “peace dividend” to Israel and Egypt. A kind of bribe, in other words, to give both sides incentive to agree to a rather unpalatable set of conditions. (Arms manufacturers and their shareholders received quite literal dividends as well, but that was not pointed out in the mainstream press.) One can find examples from the late seventies and the early eighties, but that sense of the phrase was never more than a distant runner-up.

The end of the Cold War marked the last time the peace dividend played a significant part in U.S. politics. Maybe that’s just because the various wars and police actions* we’ve undertaken were either on a very small scale (former Yugoslavia) or are still more or less in progress (Iraq, Afghanistan, terrorism). But the fact is our officials have abandoned alternatives to the national security state. No one considers, much less proposes, eliminating policies that seek to impose our will on the rest of the world whether they like it or not. Concomitantly, evidence that our efforts to do so rarely succeed and often backfire is blotted out of public discourse. There is no alternative to meddling and warfare. And there may be no discussion of the fact that there is no alternative. Naked emperors are as embarrassing as ever.

The problem with the peace dividend is that it is more a fiction of accounting than anything else. It makes sense in theory, but economists are always quick to point out that it won’t amount to much in practice. True, the army never gets much smaller and the pace of armament production never slows for long. But even if they did, government budgeting bears so little relation to personal or family budgeting — where reducing spending in one department might well lead directly to increased spending in another — that savings disappear without a trace into a complex web of interests and bureaucracies. The only way to change that would be to reduce the size of the federal government to something closer to what it was in the early days of the republic, when the population was much, much smaller. Despite a lot of big talk, neither the right nor the left wing has any interest in doing so, or any idea of how to go about it.

* Remember that phrase? It goes back to the Korean War, when it still made some tradition-minded Constitutional scholars squeamish to refer to a “war” that Congress had not declared.



boots on the ground

(1990’s | militarese | “infantry,” “combat troops,” “invaders”)

The army has been the butt of jokes for a long time. In my childhood, we all learned the “biscuits in the army” song, and as an organization it has long been associated with inefficiency, rigidity, stupidity, profiteering, etc. For those reasons and others the army is seen as the least appealing branch of the service. The navy is also pretty plebeian but has better uniforms and goes on voyages, the Air Force has its own lofty glamor, and the Marines are our badass fighting force, taking on the jobs no one else will. These rough distinctions form part of the background for this week’s expression.

In the 1990’s, it came to the attention of policymakers that the Cold War was over, and the basis of our military strategy for the previous fifty years had vanished. That didn’t bother anyone too much (somehow the money kept rolling in, despite ominous talk of a “peace dividend”), but debate over the proper response to this revolting development smoldered for a few years. One prong of it boiled down to a conflict between the Air Force, touting the virtues of long-range warfare relying on satellite missile guidance and precision airstrikes, and the Army, for whom there is no substitute for sending lots of soldiers and tanks to slog through the mire to victory. Clinton’s foreign policy predictably developed a strong preference for avoiding U.S. combat deaths, which meant fewer mess tents and more smart bombs. We all know that sending in platoons of grunts means more casualties, more brutality, billions of dollars down the drain, and quagmires. Who wouldn’t prefer a nice, clean missile? But presidents who try to pull troops out of combat zones usually find themselves putting them back in sooner or later. Boots on the ground have never quite gone out of style; missiles and drones have their cachet but can’t do it all by themselves.

“Boots on the ground” began to appear in the mid-1990’s among military officials and their praetorian guards — members of Congress, think-tank warriors, and journalists. By 2000, it was making its way into everyday life; the expression was evocative and easy to understand, and readers and hearers were quick to grasp it. I was surprised to see recent examples in a variety of civilian sources, not just law enforcement, for which the military analogy is obvious, but wherever efficient action is needed to counter a threat: disaster response, political campaigns, trade missions, NASA (I’m serious), even hospitals. Here’s one fresh from a Frontline Technologies press release (June 21, 2016): “Teachers are the ‘boots on the ground’ in your school district. More than anyone, they have their finger on the pulse of the student body, they look at the data and know the student needs in their particular building, and they know the areas where they need to grow as educators.” It conjures images of dedicated people fanning out and getting the job done. The notion of response to a direct threat is fading; sometimes the phrase is little more than a way of saying “taking action” or “doing something about it.” The phrase has been available as a hyphenated adjective for a long time, but I’ve never seen it used as a predicate complement. (As in “That’s very boots-on-the-ground.”) The speed with which the phrase has spread is impressive.

An unfortunate and now forgotten assistant secretary of the Army, Sara Lister, was forced to resign in 1997 after saying, “I think the Army is much more connected to society than the Marines are. The Marine Corps is — you know, they have all these checkerboard fancy uniforms and stuff. But the Army is sort of muddy boots on the ground.” She noted that the Marine Corps was more prone to political extremism than the Army, which may well have been true. The relations between the military and the rest of us were much discussed in the nineties; some commentators were concerned that our men and women in uniform were becoming hostile to American society, because it was too soft, or liberal, or heathen, or whatever. The trend is unlikely to have changed since then. There is plenty of evidence that eliminating conscription has led to a definite rightward shift in the politics of the average soldier, but it is no longer fashionable to point it out.



warfighter

(1990’s | militarese | “combat soldier”)

My libertarian readers will need no reminding that this week’s expression became necessary only after dramatic changes in the functions of U.S. armed forces over the course of the twentieth century. But armies have always had numerous soldiers and hangers-on essential to the functioning of the machine who never see combat — who wants to serve in a battalion where all the cooks got shot? — and “warfighter” merely denotes a combat soldier as opposed to all the other kinds. Right-wingers like to grouse about Our Troops used for the dreaded Nation-Building, and they are correct that we ask our armed forces to perform more, and more varied, duties and take on more roles in the world than we did before World War II. But that fact is but a sidelight as far as this term is concerned.

Even now, I’m not sure the term counts as everyday language, since it still turns up predominantly in military or at least government publications, or journals published by and for military contractors. I ran across it last week in Newsday, which conjured up a few other foggy memories of seeing it in the last few years. The first instance I found in LexisNexis came from the illustrious pen of Sen. Mark Hatfield, but it was uncharacteristic (see below). Today’s meaning of the term started turning up regularly in the nineties, when it made occasional incursions into the mainstream press. Perhaps a few years earlier, military commanders began to talk about “warfighter exercises” designed to simulate combat situations more accurately than the old exercises had. (The use of the word as an adjective, or first half of a compound noun, still appears, but it has not become the norm.) It’s important to remember that “warfighters” is not the same as “boots on the ground”; a drone pilot thousands of miles away is every bit as much a warfighter as a wretched infantryman in Kabul (if we have any wretched infantrymen left in Kabul). It is settled wisdom in the military that the entire infrastructure and bureaucracy is there to serve the warfighter, to give U.S. soldiers the best possible chance in whatever sort of combat they are pursuing at the moment, most often in terms of technology and training. Yet so far the word has not come into use as a means of glorifying soldiers or making them objects of pity (as in “support the troops” or “brave men and women in uniform”).

Occasionally one sees this week’s expression used as the second part of a compound, as in “nuclear warfighter” or “guerrilla warfighter.” (The former appeared in Hatfield’s New York Times op-ed in 1986.) It turns up infrequently, but it’s not an unreasonable broadening of usage, actually. “Warrior” has a definite old-fashioned sound, more suited nowadays to movie franchises and computer games than actual warfare, though it might still be used of elite fighters. I think “warfarer” should be given consideration, but it looks too much like “wayfarer,” I suppose. By the way, there’s an Android game called Galaxy Warfighter; maybe this will be the rising generation’s cue to adopt the expression and push it irreversibly into our vocabulary.

“Warfighter” is an accidental addition to an accidental series formed loosely around the idea of strife, or making it go away. See “conflicted” and “-whisperer.” “Pushback” and “win-win” are other terms in this category. Peace out, y’all.



conflicted

(1980’s | therapese? | “torn,” “ambivalent”)

“Conflict (with)” has been a verb for quite some time now, and “conflicted” was its past participle, so it has long been able to serve as an adjective, but it rarely did before 1970. And when it did start making adjective appearances, it didn’t quite seem to be doing the work of the past participle of “to conflict.” Why don’t we say “conflictful” or even “conflicting” (as in “conflicting schedules”)? When you’re divided within yourself, two parts of you are in disagreement, so it’s not a completed action, and the present participle seems more suitable. (When a conflict is settled, it ceases to exist, after all.) Maybe I’m being too fussy about grammar, but there’s something irregular about the way we use “conflicted” today. Yet it doesn’t sound strange, even to me.

The definition doesn’t require much explanation, but using the word with the right force is important. You don’t use it when you’re trying to decide between chicken soup and a TV dinner; there must be pretty strong currents at work to invoke the term. One is conflicted about major issues or in the face of important decisions. Powerful emotions or principles must be reconciled in order to make one’s course clear.

In 1977, sportswriter Thomas Boswell referred to the New York Yankees as “wealthy, conflicted and almost-too-talented.” But he meant strife between players rather than within one person, more like “fractious” or “confrontational.” The Yankees were famous for having too many players who didn’t like or respect each other, so the word presumably meant they fought all the time. Today, it’s more common to use “conflicted” to describe a single person, but if you view a team as a single organism, the meaning is basically the same as ours. Instead of everyone pushing toward the same goal, too many people are going in different directions, so the team isn’t single-minded. (The weakness of the baseball team as metaphor for the individual may be seen in the Yankees’ three straight pennants while in such a “conflicted” state; people mired in a dither are rarely so successful.)

If “conflicted” can be used to talk about groups or organizations, why not nations? It has become normal to talk about the U.S. as conflicted about this issue or that, or just across the board. Lovely Liz from Queens suggested last week that the U.S. needs a “republic-whisperer” to help calm all of us down and start working together to identify and solve problems, or at least agree that probably not everyone on the other side is guilty of treason. When a single person is conflicted, maybe you can help him sort it all out, but when half of us are unable to agree with the other half about anything, the task seems impossibly daunting. Our house has been divided before and we’ve survived, but as the retirement fund managers like to say, past performance does not guarantee future results.

Why do I place a question mark after “therapese” as the source of this expression? Could there be a clearer example? Yet the early instances of the term I have found don’t come invariably or even consistently from shrinks and counselors; it turns up in social science and other branches of academese as well. One strong indication that therapese was nonetheless the conduit: as “conflicted” was taking on its new usage, it turned up in arts writing, especially book reviews, a lot. Arts journalists, being more neurotic than average, tend to be early adopters of therapese, before editorialists or sportswriters. Arguably, journalists do more than anyone — with the occasional exception of an actor or screenplay writer — to make new expressions common to us all. Many of the expressions I have treated started life in a specific professional or demographic subdivision of vocabulary before seeping or exploding into everyday language. Each type of journalist, unsurprisingly, tends to prefer certain subdivisions. Arts journalists are lucky to draw on such a fertile source of new expressions as therapese, sportswriters mine the rich veins of new vocabulary generated by athletese, and editorial writers enjoy the fruit of our prolific military men and bureaucrats.
