
Lex maniac

Investigating changes in American English vocabulary over the last 50 years

cultural appropriation

(1990’s | academese? | “exploitation,” “cultural imperialism”)

Another Lex Maniac special, wherein The Author chooses an expression of little linguistic import so he can indulge in a couple of paragraphs of shallow-profound political comment. The only point of interest of this phrase is its British origins; it was all over the Canadian press by 1990, but it showed up only occasionally in the U.S. Otherwise . . . no one knows what “culture” is, exactly, but we know it when we see it, and its adjective form is “cultural.” Got that? “Appropriation” is a bit more complicated, because of the divergence of “appropriate” (v.) and “appropriate” (adj.), the latter of which has lost quite a bit of power, gradually pulling away from its origins into a quite different (albeit related) area. Both words are rooted in the Latin word for property, and until at least the sixteenth century they had closely related definitions, expressing the concept of taking something for oneself, meaning no one else could have it, especially the person you were taking it from. In the verb world, that meaning has held up pretty well, but the adjective has devolved to mean fitting, suitable, or applicable. “Cultural appropriation” evokes the verb, of course. But the idea is more that the appropriator is taking the other’s culture for his own use rather than taking it away tout court — the victim may get to keep it, or at least what’s left of it.

What changed was not the thing itself, which has been going on for eons and is essential to the development of our species. (As Lovely Liz from Queens pointed out, cultural appropriation is the same thing as culture.) But a growing sense that co-opting elements from a different culture is morally wrong has taken hold in the academy and perhaps here and there outside it. On the surface, it looks similar to the white supremacist’s position, which decries mixing our customs and theirs, whether black people acting like white people (which was comical or threatening, depending on the context) or vice-versa (which leads to the utter destruction of the white race — which we know when we see). But their motivations are antipodal. The white supremacist fears defilement of an imagined racial purity, while the academic is indignant because some first-world jerk is making a buck, or just acting disrespectfully — and taking advantage of someone in a weaker position.

There was, in my youth, a milder word that sounds like “appropriation,” which was “appreciation.” It didn’t usually go with “cultural” but might have, in the hands of a sufficiently tin-eared educator. Art appreciation had more to do with pleasant acceptance of others’ esthetic conventions, or at least a duffer’s knowledge of art history and genre. Appreciation of others’ cultures went with what we now call diversity and multiculturalism (not to mention political correctness), and cultural appropriation is their perverse product. Take an American teenager with a narrow ken, and expose her to art, or literature, or customs and habits that are new to her. She is struck; they speak to her somehow, and she believes she has learned something valuable and become richer for it. That’s how it’s supposed to work, right? Broaden the kids’ horizons, give them something new to think about, and they might become wiser and more humane. But then say she takes a piece of what she learned and incorporates it into a poem, a dance piece, or a sculpture. Now it’s cultural appropriation — she’s taken something from another culture and corrupted or stolen it by adopting it into her own. It’s fine to admire it, but not to use it. Alas, that’s not how art works. Art propagates itself through small strands as well as broad strokes.

We raid cultures of other times as well as of other places, which results in what I would call “chronological appropriation.” It’s the same phenomenon, but no one gets offended when Americans participate in Renaissance Faires or Civil War re-enactments. (It might be different if we drew on historical practices or events from southern Africa or Japan.) The troubadour, long extinct, no longer has rights we are bound to respect; neither does nineteenth-century cannon fodder. This illustrates again the fundamental tort of cultural appropriation, which is depriving someone else — in a weaker position — of either their way of life, or their chance to make money off of it for themselves. If no one is using the culture, we can do as we please with it.


exit strategy

(1980’s | businese | “way out,” “cutting one’s losses,” “covering all the bases”)

You would think this is a militarese term, but it isn’t, or it wasn’t. Used almost exclusively in the financial press until 1990, it had a straightforward definition: a contingency plan to get out of an unwanted obligation, partnership, or any foreseeable situation in the most advantageous way possible. In case everything goes south, figure out an escape route that will spare you penury or embarrassment. Before 1990 it typically appeared in quotation marks, but that was no longer true by the mid-nineties; by the time William Safire immortalized it in a December 1995 column, it was common currency. The financial usage has not disappeared, but in the public mind it has been overwhelmed by the political. Candidates looking to get out of a losing campaign picked it up before 1990, at least in a couple of cases. After 1990, foreign policy analysts grabbed the expression, and it soon became de rigueur for invasion planning. How do you take out the bad guys, bring your people home, and avoid a quagmire?

You don’t have to be an astute observer of foreign policy to know that the advent of the new expression has not made our military leaders any better at formulating or executing workable plans — we are still stuck in Afghanistan and Iraq. Which raises a significant point: In warfare, only invaders need an exit strategy. If you’re being invaded, you just have to sit there and take it, unless you can force the invader out by causing enough casualties and mayhem. (An official of the invaded country might also need an exit strategy, a way to leave town quickly and quietly if the political winds shift.) When a financial institution needs an exit strategy, it’s usually a matter of extricating itself from an internal decision that isn’t working out, or of getting out of a contract between two more or less equal parties. In the military, you have to commit offensive action; there’s no need for an exit strategy if you never leave the base. When one field borrows an expression from another, naturally the meaning may change, but this is quite a twist.

The odd thing about the expression is that it is not used literally. You don’t hear one nervous moviegoer in a crowded theater ask another, “What’s our exit strategy?” It doesn’t sound right when you’re talking about a building or vehicle. It may, however, be used whimsically to talk about a job, relationship, or some other important sector of our lives. It’s not hard to imagine two brokers discussing the most effective ways to get away from their employers, or two men discussing how to get away from the girlfriend if she loses her appeal. (Mercifully, Paul Simon didn’t call the song “Fifty Exit Strategies for Leaving Your Lover.”)

It’s a little far-fetched, but I hear in this phrase the echo of stage direction. Here’s how it might be used: Imagine a king suffering a reverse and announcing that he intends to lash out blindly and abandon reason from now on. Like Macbeth, for example. When he finishes, the stage directions say, “Exit King. Exit strategy.” Yes, it is a little far-fetched.


lean in

(2010’s | athletese? | “give your all”)

I sense the need for an anatomy of this odd expression, changed forever by Google and Facebook executive Sheryl Sandberg. The first fork in the family tree generates “lean in” and “lean into.” The latter has been used for some time by sportscasters to denote exerting extra force in a certain direction (as a batter leaning into a pitch), or shifting weight on a skateboard or in a car to assist the steering (leaning into a curve). “Lean in” is more complicated. At the simplest level, it denotes a motion or posture understood to express attention, interest, or excitement. That is, it’s another way to say “lean toward.” Sometime after 2000, the phrase became an adjective current among advertisers and entertainment executives, as in “lean-in experience” or “lean-in factor.” The latter was typically used in connection with exciting moments on television, conjuring the image of audience members on the edge of their seats, breathlessly awaiting the next utterance. “Lean in” has another application as well, as an antonym of “lean back” or “back away” — that is, as the opposite of taking it easy or retreating. In such contexts, leaning in is a sign of toughness and resolve. That would seem to be the most direct ancestor of Sandberg’s usage, but I don’t think it’s much older. The earlier athletic usage has a claim as well.

Sheryl Sandberg published her book in 2013, though she was quoted using the phrase before that. She preached ambition and assertiveness for women in the work force, or, as Lovely Liz from Queens summarized: women need to act more like men. Sandberg’s dicta have permeated the culture and spawned a women’s empowerment movement; the Lean In Foundation is a big organization, helping women all over the world learn from each other and move up the ladder. Yet a Washington Post writer declared the Lean In movement dead at the end of last year, after Michelle Obama drove a stake through its heart. More recently, Marissa Orr published a critique of Sandberg called “Lean Out.” Will “lean out” take its place alongside “lean in”? Will Sandberg’s addition to the lexicon lose momentum? Stay tuned . . .

It all starts with “lean,” which is tricky because it may suggest both a casual or relaxed tendency and much more concentrated force, as in the cases of “lean in” and “lean into.” “Lean” strictly speaking denotes any departure from the vertical in a normally upright object, and at least when people and animals do it, we usually have a specific purpose; we lean toward something or someone. “Lean in” has always shared that sense of purposefulness. To reach its present eminence, it had to lose its appendages, a step in the evolution of several expressions, including “give back” (other examples here). “Leaning in” once was invariably followed by “a certain direction,” “favor,” etc. Now it is a set phrase all on its own. In most similar cases, this slimming process results from a distillation of a number of competing longer phrases into a single shorter one. But in this case, the casting off seems to have come with the establishment of a new definition, imbuing the phrase with attributes of superior dedication and willpower. Not boiling down, but striding forth in a new direction.


randomize

(1980’s | academese (science) | “randomly generate”)

A term born of empirical science — experiment design and statistics. Now it is used primarily to talk about clinical trials; an essential part of testing a medication or treatment is “randomizing” the patients — that is, making sure that those getting the treatment and those getting the placebo are sorted by non-human means, to eliminate as much bias in the results as possible. Such processes are easiest to envision in a binary world, where there are only A and B, and the category you belong to is “decided” by mechanical means. Computer programmers picked it up very soon, before most of us knew there was such a thing as computer programming, so by 1980 “randomize” had a number of technical uses, which for the most part it still has. In the eighties and nineties, I found examples from other endeavors as well: poker; esthetics (choreographer Merce Cunningham “randomized” his decisions at particular junctures by throwing the I Ching to determine the outcome); CD players; creating standardized tests; listing candidates on a ballot. It most often has to do with some sort of testing, medical or otherwise.
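
For the curious, here is a minimal sketch in Python of the sort of non-human sorting described above; the patient identifiers and group sizes are invented for illustration, not drawn from any actual trial.

    import random

    # Hypothetical patient identifiers -- stand-ins, not real data.
    patients = ["patient-%03d" % n for n in range(1, 21)]

    random.shuffle(patients)        # the "non-human means": a random permutation

    half = len(patients) // 2
    treatment = patients[:half]     # these would get the medication
    placebo = patients[half:]       # these would get the placebo

    print("treatment:", treatment)
    print("placebo:  ", placebo)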

An “-ize” verb, “randomize” doesn’t sound as clunky (to me, at least) as “incentivize,” “weaponize,” or “monetize.” Probably because it’s rooted in science and mathematics; ize-itis is easier to take with technical terms. And “randomize” hasn’t filtered into everyday speech much. It’s a word you come across in print occasionally, but it hasn’t exactly taken the vernacular by storm. It seems like a modest enough word, filling a need without taking up too much room.

A related yet unrelated word is “rando.” It’s sort of a portmanteau of random and weirdo — the rando has a definite hint of unpleasantness, not someone you want to have to deal with. (Though the highest-ranked definitions on urbandictionary.com don’t give the term a negative implication, and at least one on-line source thinks randos are a good thing, so the jury is out.) An unrelated yet related word is “anonymize,” to which my attention was drawn by Lovely Liz from Queens, as in “anonymize data.” It’s how to divorce you from your personal information and preferences; more precisely, it’s how internet titans vacuum up everything worth knowing about your on-line habits while creating the illusion that your name and identity can’t be connected with any of it. But anonymizing is also part of randomizing; in fact, removing patients’ names is an essential step in the process.
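
As a rough illustration of that last point, and only an illustration (real anonymization is considerably more involved), here is a Python sketch that strips the names from a couple of invented patient records and replaces them with random codes.

    import random
    import string

    # Invented records, purely for illustration.
    records = [
        {"name": "Ada Lovelace", "age": 36, "outcome": "improved"},
        {"name": "Tom Thumb", "age": 52, "outcome": "no change"},
    ]

    def anonymize(record):
        code = "".join(random.choices(string.ascii_uppercase + string.digits, k=8))
        scrubbed = {key: value for key, value in record.items() if key != "name"}
        scrubbed["id"] = code    # a random code stands in for the name
        return scrubbed

    print([anonymize(r) for r in records])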

Random isn’t as simple as it sounds. Take a simple example: if you flipped a coin and it came up heads ten times in a row, you wouldn’t think that was random at all. Some ordering force must be at work, right? Yet it’s perfectly possible for a fair coin to land on the same face ten times in a row; the odds for any particular run of ten flips are one in 1,024. There doesn’t even have to be a balancing streak of ten tails later on, but over time the proportions of heads and tails will even out. In a truly random sequence or assortment, you will almost certainly find stretches that appear to be grouped logically, but that’s just how it shakes out; it’s not proof, or even evidence, of a master intelligence running things. We want to call random only that which is jumbled, devoid of an obvious organizing principle. But the random may look very organized if you focus on a small section.
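
To make the point concrete, here is a small Python sketch (the number of flips and the streak length are arbitrary choices of mine) that counts how often ten heads in a row turn up in a long run of simulated fair-coin flips; streaks appear with no ordering force at work.

    import random

    FLIPS = 100_000    # arbitrary length for the experiment
    STREAK = 10        # the run of heads we are looking for

    flips = [random.choice("HT") for _ in range(FLIPS)]

    runs = 0
    current = 0
    for f in flips:
        current = current + 1 if f == "H" else 0
        if current == STREAK:
            runs += 1
            current = 0    # start looking for the next streak from scratch

    print("Runs of %d heads in %d flips: %d" % (STREAK, FLIPS, runs))
    print("Share of heads overall: %.3f" % (flips.count("H") / FLIPS))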


trickle-down

(1980’s | academese (economics))

Although the Reagan administration gave us many new expressions, it cannot be blamed for this one, which long predates Reagan’s ascent to the presidency. It’s unlikely he ever used it himself, at least in public. But everyone else did, and we continue to associate the phrase with him, especially if we’re my age. The term was accurate in the case of Reaganomics; the much-discussed supply-side theory was a smoke screen to disguise the massive (and ongoing) redistribution of wealth upward, a process well underway before Reagan got in, but which he accelerated, and, more culpably, made to feel normal and inevitable. Now a generation or two of Americans senses that trickle-down economics is just how we do things. Advocates of the theory care far more about the first part — putting more money at the top — than the second — making sure a lot of it actually reaches those who have less. The real point is not that wealth trickles down. The real point is that it gushes up.

No question this expression is older, dating back at least to mid-century. It was not rare before 1980 and therefore ready to hand when Reagan came along. Apparently “trickle-down” did not carry the opprobrium Reagan’s adversaries hoped it would. (John Kenneth Galbraith used the phrase “horse and sparrow economics,” which lacks rhetorical vim but makes the relationship between the tricklers and the tricklees clear.) The word “trickle,” suggesting a sluggish and paltry stream, ought to raise hackles or at least spark discussion, but it doesn’t seem to have bothered very many people back in the eighties, or today, though union spokespersons and political candidates still use the phrase with intent to defame. It may not scare voters very much, but that doesn’t mean politicians advertise their own policies in such terms; it’s one of those expressions you would hear only from an opponent.

“Trickle-down” is not used exclusively to talk about money and distribution of wealth, but that has always been its métier. Today you see it in sportswriting a fair amount, where it comes closer to “ripple effect,” the idea that small changes will be amplified and lead to larger changes. It’s a different axis: “trickle down” insists that the wealthy occupy a higher position, but “ripple effect” is more horizontal and egalitarian. In an economy where more people have a larger share of the money, it washes around; when only a few people have most of the money, it can only trickle down.

The trouble with trickle-down is that it’s deeply un-American. It posits a small aristocratic class that receives large benefits from the king — er, uh, ahem, government — in exchange for a certain amount of fealty and service. The government shovels more and more money onto the aristocrats — a tiny minority of the population — further strengthening their hold on political, and purchasing, power. In theory, anyone can make their way into this minuscule aristocracy, but in practice it’s much easier if you start in the top tax bracket, or in the right family (yes, bloodlines still help). Now there have always been prominent American politicians and philosophers who preferred aristocracy, and they have wielded considerable influence since 1789. But each go-round they seem to get a little more immune to the masses’ resistance. Or maybe that’s just hubris repeating itself. After all, ruthless, amoral greedpigs make mistakes like the rest of us.


off-label

(late 1980’s | bureaucratese? businese? doctorese?)

The pharmaceutical industry has given us many new terms, but a lot of them are brand names. It’s not really clear to me on whose turf this expression arose, actually: the businessmen, the bureaucrats, the doctors, or a combination. But it couldn’t exist without the pharmaceutical industry. The “label” represents officially sanctioned conditions — symptoms, diseases, syndromes — for which the drug may be prescribed. In the U.S. (and many other countries), the label depends on a regulatory body whose job it is to decide when a given drug has been fully tested and demonstrated reasonably safe and effective for a given condition. A doctor goes off-label when she deploys the medication against anything not on the approved list. The practice is quite common and also quite legal; unlike insurance executives, legislators have hesitated to take too much discretion away from physicians.

“Off-label” remains an adjective or spot adverb. It begins showing up in LexisNexis in the late eighties. The first notable examples appeared in the Australian press, but the expression probably arose here. The AIDS crisis ushered it into the language; that was before there was any approved treatment and doctors threw everything they could think of at this terrifying new disease, regardless of the stated purpose of the medication. (Nowadays we use the expression more in the context of cancer treatment, but any condition for which there is no generally recognized treatment will do.) It was pretty ordinary by 2000. Within the next ten years, “on-label” emerged to mean following the FDA-approved pathway, rather as “cis” has evolved so there will be an opposite of “trans.” Even today, the phrase, while capable of ironic shading or application in a different field (themselves off-label uses), still pretty much applies to medication, or medical devices.

Off-label drug use stirs up a very familiar, and ultimately intractable, debate, the same sort that swirls around “self-medicate.” Strictly limiting the uses for which any given drug may be prescribed is the safest course, saith the insurance industry, concerned above all with avoiding liability (off-liable?). In 1990, the FDA went after off-label marketing — advertising by pharmaceutical companies touting the benefits of non-FDA-approved uses. (Marketers now are free to engage in off-label promotion. Off-label prescribing — case-by-case by a physician — has never been discouraged by the FDA.) That was around the same time that insurance companies tried to stop paying for off-label prescriptions, because such uses were not officially sanctioned — though it doesn’t follow that they were unsafe. Doctors should have sufficient powers of observation to see when a medication consistently has a beneficial side effect and use the information to improve their treatment methods. It’s sound empirical knowledge, particularly when it comes from several independent sources. If we have to wait around for the FDA to approve it, lots more patients will die. Both sides are right at least part of the time; off-label uses are likely to involve more risk, but it is foolish to ignore repeatedly observed beneficial effects just because the FDA hasn’t done the work. There is no right answer, but in each unique case, one course is usually more likely to succeed than another.

Every generation comes up with its own “label” compound: “union label” served my parents’ generation and “record label” was the most likely instance in mine. Both are nouns; “off-label” has made a grammatical leap and may bring forth similar new labels in the future. Why not “under-label” or “low-label” or “label playing field”? Just don’t get it mixed up with “labile”; that would be “lip-label,” or libel.


branding

(1980’s | businese | “marketing (strategy),” “image”)

You are wondering about the connection between branding oneself or one’s organization and branding cattle, so I will tell you. They are both ways of marking salable property. Your brand is the quality, whatever it may be, that sets you apart from the competition, just as branding a calf designates it exclusive property. A possible intermediary would be “brand” used as a verb meaning “accuse someone of being,” as in “he branded his opponent a liar.” (The occurrence of “as” in between the object and the article was already possible in 1980, though perhaps less common then.) “Branding” in this sense is the act of pinning a disagreeable attribute on someone, but “brand” does not refer to the scarlet A that dogs the victim (à la Hawthorne). Nowadays, individuals and organizations improve their brands — which would have sounded very strange back on the range — in order to increase their appeal, rather than repulse customers. It is the flavor or feature or je ne sais quoi that renders them more worthy of the sacred ritual of opening the wallet. It must be tenderly nurtured and aggressively developed, with much overtime and expensive consultation.

Not just for-profit businesses; universities, foundations, hospitals, even nations are expected to burnish their brands in order to attract more people and make themselves more relevant — that is, closer to the money spigot. Just as IT departments became necessary a generation ago, branding consultants (or in-house staff) are now de rigueur for any business serious about staying in business. Anything an organization does to increase its status or revenue might qualify as a branding venture. For now, at least, it remains grounded in consumer behavior; the true measure of branding success is consumer appeal. Thus such projects tend to take on an anxious or abject tone; consumers are capricious gods whose whims must be catered to in order to part them from their money. Americans have seen a steady erosion of their political power for a century or more. To some extent it has been replaced by consumer power, but consumers don’t get to hire and fire corporate executives.

“Brand” and “branding” broke new ground in the eighties; it was rare before then to see either term as we use it now. By 1990 they both showed up regularly in the business press, though not perhaps in everyday vocabulary. One team of researchers defined “brand” as consisting of three components: “physical make-up, functional characteristics, and characterization — i.e., personality.” “Branding” goes with words like “messaging” (conveying a selling point) and “positioning” (proving yourself superior to the competition). “Brand” meaning simply “name of manufacturer” or “name of particular product” has been superseded, though it has not disappeared. It’s not enough to be Heinz or Kleenex any more. Heinz and Kleenex have to get out there every day and prove they’re better, or at least more compelling. You can’t just maintain a good reputation and rest on your name. You have to build, respond, and work, work, work to make sure you remain irresistible.

There seems to be a strong tendency in corporate America to find or create new methods and theories of improving sales or employee retention or customer loyalty (cf. the recent entry on “emotional intelligence”). They don’t all involve branding directly, but they do involve purchasing books, hiring consultants, and supporting researchers who seem more and more like a parasitic class, feeding off their high-powered hosts and justifying it by dispensing advice that doesn’t — and can’t — work most of the time, because in most fields winners must always be in the minority. Even if you follow your consultant’s report to the letter, it probably won’t improve your market share much. But another consultant will come along next year, and you’ll have to shell out for that one, too. It’s just another way to make the money trickle down, I suppose, but one can’t help but wish that all these corporate geniuses might put a bit more effort into innovation and investment than into convincing us by more or less fraudulent and manipulative means that we should buy their product. Maybe it will turn out that the best long-term branding strategy is finding a gizmo nearly everyone uses and making it better than anyone else can. But it’s a lot easier to talk about what the logo should look like and where it should go than to re-envision the entire chain of people and duties required to improve the merchandise. The point is not the product; it’s your ability to convince the gullible to pony up. I’m beginning to think we should put P.T. Barnum on our money, not George Washington or Harriet Tubman.

This is the five hundredth expression I have written about, assuming I’ve counted correctly. I encourage everyone to head over to the alphabetical entry list and look around to see if I’ve covered a favorite expression, or a pet peeve. If so, comment! If not, send it in (usagemaven at verizon dot net).


age in place

(1980’s | therapese? | “stay put”)

“Age in place” has social and personal dimensions, referring to entire communities or single dwellings that contain upper-middle-aged people who must decide whether or not to move away. When they decide to remain at home or at least in the same town, they age in place. Retirement communities — because they blur the line between home and neighborhood — allow you to do both by moving from your house into an apartment while you can still look after yourself, and then rise through the assisted living ranks as your capacities dwindle. So without leaving the area, you create a new home to live out your life. It seems to me that most of the time when the phrase comes up now, it refers specifically to remaining in one’s own home through one’s retirement years.

Since the eighties, polls have shown consistently that most older people want to stay where they are rather than move away. It was assumed back then that baby boomers would not be content to age in place, but so far they seem to be. The continuing preference for growing old in familiar surroundings probably says more about the nature of the elderly than about demographics or American culture. Or simply the fact that moving is a hell of a lot of disagreeable work. It may be the strength of the roots you’ve put down, or it may just be inertia; either way it adds up to aging in place.

This week’s expressions have some significant forebears, most notably “stay in place” and “run in place.” Aging and sheltering in place capture the same refusal to pull up stakes. If you prefer your prepositions accusative, “snap into place” or “lock into place” are for you (both could function transitively or intransitively). Finally, I can’t help but hear an echo of “rest in peace” when I encounter this expression. Doesn’t matter how long you age in place; some day you’ll rest in peace. In your final resting place.

shelter in place

(1990’s | businese? bureaucratese? | “ride it out,” “hunker down”)

I believe we owe this expression to the good people of West Virginia, or maybe it’s the bad people. Starting in the 1980’s, as far as I can tell, chemical industry spokespersons began introducing the phrase to answer the question, “What do I do if a huge cloud of poison gas is enveloping my house?” A leak or explosion at a chemical plant is a big deal anywhere, and there’s a particularly high density of chemical manufacturing in those parts. The industry representatives were in the awkward position of spending half their time explaining why leaks and explosions couldn’t possibly happen, and the other half explaining what to do when they did. The vast majority of sightings of this phrase, well into the 1990’s, come from West Virginia newspapers or press releases, as far as LexisNexis is concerned. It is very unusual for a new expression to arise so exclusively from a particular state.

“Shelter in place” can be a verb, a noun phrase, or an adjective, as in “shelter-in-place drill.” It means more than just stay in your house and hope for the best. Paul Hill, president of the National Institute for Chemical Studies, put it this way in 1994: “Go inside the nearest structure and into a room with no or few windows. Pets should be brought indoors. A radio or television should be turned to a local Emergency Broadcast Service station for information and directions. If the emergency involves hazardous materials, heating and cooling systems and fans should be turned off, windows and doors should be shut and cracks covered with wet rags or tape. If directions call for protected breathing, the nose and mouth should be covered with a wet cloth. Wait for an ‘all-clear’ signal. In addition, residents should stay calm and stay off the phone.” Nowadays, when the governor tells residents of hurricane-prone beach towns to shelter in place, there’s more emphasis on boarding up windows and lashing everything down, but the idea is the same: retreat into your house and make it as impenetrable as possible, turning it into a temporary fallout shelter. The net result is rather like “lockdown” as it might be practiced in less densely populated areas.

I suppose the cynic in me hears an echo of that lovely expression I learned in my youth, “Bend over backwards and kiss your ass good-bye.” When the authorities tell us to shelter in place, we’re on our own. If it gets bad, we’re stuck.


disruptive

(1990’s | businese? athletese? | “shaking things up,” “causing a stir”)

A word of long standing, but when did it take on a favorable connotation? Not everywhere, of course, but executives use it approvingly now, unthinkable in the days of Henry Ford or even Lee Iacocca. Successful corporations have traditionally avoided boat-rocking and sought the even keel, but now executives congratulate each other on their disruptive business practices. It is not solely a matter of hobbling the competition; a certain amount of disruption is tolerated within the organization if it keeps employees on their toes, for example, or pushes a complacent division into activity. The buttoned-down set seems to have loosened their vests.

The first occurrences in the press that I found date from the late nineties, a few due to far-sighted business gurus but more from coaches describing the defensive unit, particularly in football and basketball. (Often it applied to a single defensive player.) I couldn’t guess which source influenced the other, but there’s nothing new about businessmen borrowing vocabulary from athletes — in this case, giving it more of an offensive than a defensive cast. By 2010 the word was ordinary in business contexts. Nowadays artificial intelligence and business models or strategies attract the label “disruptive.”

It’s a very forward-looking buzzword, associated with innovation, technology, and improved corporate management. Senior executives sling it around confidently, extolling the virtues of novelty and adroit exploitation of one’s strengths, or just crowing about how they’re going to mess with their competitors. There’s the usual tension between the goal of making the world a better place (if only for p.r. purposes) and simply extracting greater profit from it.

“Disruptive” is close to a newer expression — “game-changing” — and an older one, “revolutionary.” But these are both stronger than “disruptive,” which encompasses lesser shocks to the system. You can be disruptive without altering the playing field permanently or overthrowing an old order. It reminds me of Joseph Schumpeter’s notion of “creative destruction,” a hallmark of capitalism, which requires not just that single enterprises should fall so that better ones might rise, but that the rules of doing business, or other received wisdom, must fall to the new and improved. (Schumpeter believed strongly in innovation and entrepreneurism, by the way.) In today’s world, disruptive tactics are mainly intended to weaken or drive out competitors, but getting rid of rivals was always part of the entrepreneur’s toolbox. The fine talk of less able businesses fertilizing their successors didn’t disguise the fact that Schumpeter was merely peddling social Darwinism dressed up as economic law — yet another instance of trahison des clercs.

We owe this week’s expression to Will from Paris, a first-rate student of the language and a damn fine host to boot. He says, based on recent dealings with the corporate set, that this word will soon take over the world, and Lex Maniac wants nothing more than to get in on the rez-de-chaussée. Merci!


lockdown

(1970’s | legalese? | “quarantine”)

A grim word. Before 1970 or so, “lockdown” pertained to hardware, describing a mechanism that held something firmly in place. During the seventies, lawyers and prison wardens began using the term to talk about a way to control prisoners by confining them to their cells, forbidding gatherings, visits from the outside, etc. The usage became standard quickly, commonplace in the mainstream press by the mid-eighties. At some point in the nineties, the word kicked over the traces and spread to other contexts, anywhere there was unrest (just as prison lockdowns were a typical response to riots or smaller-scale violence). Some incidents in the late nineties in particular gave the word a boost — the Columbine High School shootings, the WTO protests in Seattle — each of which drove a spike in sightings of “lockdown.” It was already shifting from something imposed by corrections officers to something enforced by police. At the same time, lockdowns took on the flavor of safety and security rather than punishment.

It was arguably a safety measure even when first discussed. (What was the old word for it? Was there one?) To us, “lockdown” suggests an entire building or at least a wing, but in the seventies, it was not unusual for a single prisoner to be put in lockdown (solitary confinement) if they got a little too crazy. The whole premise of prison is that you get put away in a holding pen, away from society, and that’s just another level — prison squared. But soon lockdown became a much more general affair, imposed on hundreds of prisoners at a time. That does have to do with safety, of the guards if nobody else. But as in the case of a single prisoner, it’s very easy to confuse with retribution. When you lock down a school, a civic building, or a whole neighborhood because there’s a killer roaming loose nearby, we’re all supposed to have a warm feeling, like everyone is doing their job and protecting the kids from harm.

We already used “lockup” as a synonym for “jail” (for some reason, you don’t hear “lockup lockdown”); if we hadn’t, “lockup” might have become the accepted term instead. I don’t know exactly why, but “lockdown” works better somehow. It sounds more drastic, more final than “lockup,” and therefore better suited to widespread danger and panic. (Cf. “shutdown,” “breakdown,” or even “patdown.”)

The spread of “lockdown” to hospitals, hotels, or even entire cities demonstrates two things. One is that lockdown is primarily a response to contagion, whether of violence or disease. That’s why it sounds strange when sportswriters use it to describe an outstanding defensive player; we understand, but it sounds a little off somehow. But the continuing creep of the term into other fields (itself a form of contagion) reveals the seductiveness of the concept. Here’s an easy way to prevent harm to the defenseless, and who wouldn’t be for that? The fact that it also represents an expansion of power — of government or administrators of private institutions — doesn’t seem so important against the backdrop of pious evocation of security for all. Pretty much everyone would agree that lockdowns are at least occasionally necessary to prevent dangerous situations from getting completely out of hand at prisons, hospitals, or schools. But how often? Should we carry them out as preventive measures rather than as responses to an unfolding crisis? Is it true that the more lockdowns occur within a society, the more authoritarian it becomes?
