transgender (2000′s | therapese | “uncomfortable in one’s own skin,” “genderbending,” “confused”)
When I mentioned to some friends that my next blog post would cover “transgender,” I added that the word seemed fraught with contention over its meaning and appropriateness. They all replied that they thought the meaning was pretty well settled. At this point in history, they are probably right. GLAAD, queerdictionary.com, and Planned Parenthood all generally agree on the definition of “transgender” and rule that it is not the same as “transsexual.” Broadly speaking, a “transgender” person wishes to be the opposite of the gender suggested by his or her physical characteristics; a “transsexual” (note that many people object to using “transgender” as a noun, or the adjective “transgendered,” but “transsexual” can be either) is someone who does something about it, anything from merely trying to pass to full-blown SRS (that’s “sex reassignment surgery,” or a sex-change operation). In more contemporary terms, if your gender expression (how you present yourself) conflicts with or fails to match your gender assignment (what the doctor said when you were born), you’re transgender. The “what the doctor said” part is significant, because in principle your gender is assigned to you — the expectations following you around about how you ought to behave were set before you had any say in the matter.
The language gets tricky because many of us have very deep and strong — some would say built-in — feelings about men and women. How you know who’s who, how people ought to look, dress, talk, etc. Many people still take traditional gender roles seriously; ambiguity makes them nervous, occasionally violent. But a few of us just never feel right acting out the roles, or rules, established for us. To take an oversimplified hypothetical example: you have a penis, but you feel like, or wish you were, a girl. I’ve never felt that way myself, so I can only imagine how it feels to struggle with something so elemental, so profound. What I can’t imagine is being casual about the question. If you really feel like you’re stuck in the wrong body, it must be wrenching, with terrible mental and emotional pressure (here is one impassioned and painful story). Using language to attack someone in such straits becomes a great cruelty, a luxury born of thoughtless and unexamined assumptions about who we really are.
Another reason the language gets controversial has more to do with transgender people themselves, who presumably ought to be called what they want but can’t afford to take lightly the terms others use to describe them. What people call you may be hazardous to your health. It also inevitably assigns you to a particular group, which also may be hazardous to your health. Under these high-stakes circumstances, the words we choose will undergo a lot of scrutiny, and it’s not surprising that some people don’t want to be called “transgender” or “transsexual” (the latter generally recognized as old-fashioned, though not actually objectionable, since a significant number of people continue to prefer it). While it was acceptable at one time to use “transsexual” to mean what we would now call “transgender,” it’s much less so now and likely will not be at all in another generation.
“Transgender” appeared only in clinical and therapeutic circles in the seventies. “Transsexual” was already in general use then, usually to refer to someone who had taken steps to alter their bodies to match their preferred gender. Now we have a whole slew of new words that a generation of increasingly complex sexual politics has pushed into the language. I’ll pause here over two of the newer ones: “cisgender” (meaning something like “normal” — that is, feeling o.k. about the gender expectations society has placed on you) and “genderqueer” (generally an adjective, as far as I can tell) meaning something like objecting to any gender assignment at all. Remember “It’s Pat” on Saturday Night Live? It’s more than having a uterus but feeling like a boy — you don’t want to be pinned down as a boy or girl, period. I can’t guess how commonplace these words will become, but they are coming to a neighborhood near you.
I mustn’t fail to mention gender-neutral pronouns, but I’ll stand back and let others do the work. Here are four nice links I found in about 30 seconds on Google. Can’t go wrong with an Oxford University Press blog post. How about an entire blog devoted to the subject? Still not satisfied? Here’s another. Had enough commentary already? Here’s a handy (if incomplete) chart.
Before I sign off, I must direct your attention to an essay that is not only of great linguistic interest, but is probably the best blog post I have ever read anywhere, on any subject. While discussing the fissile nature of terminology and communities, the author engages a central political question: To what extent is the ability of large numbers of people to unite behind a common term for themselves a necessary precondition for gaining the power to change society? Is it possible to improve your lot if everybody in the group spends all their time arguing about what to call themselves, and who’s justified in stereotyping whom? Would the gay community have been as successful if most of its members hadn’t united behind descriptive terms that allowed them to focus on their shared goals rather than all the things they disagreed about?
slam dunk (1990′s | athletese | “sure bet,” “sure thing,” “lead-pipe cinch,” “guarantee(d)”)
“Dunk,” now. It seems to come from Pennsylvania Dutch in colonial days. The Dunkers (or Dunkards) were a German sect that believed in full-immersion baptism, like the Southern Baptists, a historically more successful sect that took fervor much further. The name was in use by the mid-eighteenth century. According to Matthews’s Dictionary of Americanisms and Random House, the use of “dunk” as a verb came along just after the Civil War. That use has shown staying power, now immortalized in the name of our leading donut chain. As late as 1970, “dunk” was not in common use as a basketball term; by 1980 it was essential. Our figurative use of “slam dunk” today is descended from basketball. It is invariable, even though there are, or were, other equivalents in basketball jargon. Either “slam” or “dunk” can be used without its complement, “jam” is a common verb (less common as a noun), and you used to hear “stuff” (still used sometimes as a verb) or even “stuff shot.”
Dunking wasn’t even legal in college basketball when I was a boy, but certain NBA players gave the practice great cachet. One thinks of Julius Erving (Dr. J), David Thompson, or Darryl Dawkins, who was known for destroying backboards. Now it’s not a specialty any more. Just about every college or professional player is capable of dunking, with or without choreography. The rise of the phrase in sports journalism made it possible for it to pass into more fanciful use; now it refers to a can’t-miss proposition, or something so easy you can’t mess it up. (Actually, it is possible to miss a dunk, and it even happens occasionally, but it remains the highest-percentage shot in basketball.) Once in a while, it is used as a verb to mean “ram” or “force.” In aviation, a “slam-dunk approach” refers to an unusually steep descent to the runway.
As I recall, “slam dunk” earned cliché status during the Clinton impeachment proceedings, but it was certainly around before then. It became omnipresent during the hearings, when Republican Congressmen gloated relentlessly over their “slam-dunk” case against Clinton for perjury. Turned out to be more of a free throw clanking off the front rim. Anybody remember Henry Hyde any more? (Conveniently, his initials also stood for “Homewreckin’ Hypocrite.”) Clinton-haters positively salivated over the president’s disgrace, which only seemed to increase his standing with the rest of the electorate; Clinton remains popular now, but Ken Starr will never sit on the Supreme Court. Our language got a little boost from Clinton’s travails, gaining for good this punchy, spondaic expression. It enjoyed a renaissance during the run-up to the Iraq War, thanks to CIA director George Tenet’s claim of conclusive evidence that Saddam Hussein was harboring unguessed (turned out they were nothing but guessed — no one ever found any) caches of weapons of mass destruction. Another overconfident Republican overstating his case.
The expression always pretends to certainty (and therefore may be used to disguise its absence): we know this is the moral thing to do, or it’s what the people want, or we have overwhelming evidence that it’s true. But in the instances above, a slam-dunk case proceeds on the basis of violence, of exasperated righteousness left with no choice but to take drastic action. That seems to be present at least as an undercurrent when people use this phrase. Since it is often used in legal or faux-legal contexts, it has an adversarial bent. Furthermore, your opponent must be willfully blind or gumming up the works to ignore the overwhelming evidence against him. So when you use the phrase, it generally conveys a flavor of retribution, even taunting — it strikes me as loaded that way.
human capital (1990′s? | academese (economics) | “employable population,” “what one has to offer”)
In 1979, an economist named Theodore Schultz won the Nobel Prize. He was noted for studying “human capital”; in fact, he used the term in his acceptance speech. At that time, the expression remained the exclusive property of economists, in or out of academia. (The first citations in LexisNexis come from Paul Samuelson’s Newsweek columns in the mid-1970′s.) President Carter used the phrase in a Labor Day Proclamation in 1980. After that, it began to show up more often in reporting and editorials. Politicians and journalists started to use it, and it has become pretty ordinary by now.
This phrase bears a slippery resemblance to another expression that has flourished since my youth, “human resources.” If we are human capital en masse, then each of us might be considered a human resource, just another bit of carbon-based raw material for the all-embracing economy, from whom all blessings flow. But that isn’t how we use “human resources,” which doesn’t exist in the singular. It’s part of a company — the part known as “personnel” when I was a boy — in charge of hiring and firing and employee relations. Oxford Online defines “human capital” to mean “the skills, knowledge, and experience possessed by an individual or population, viewed in terms of their value or cost to an organization or country,” which covers pretty thoroughly the ways in which the term is used.
Most of the time the emphasis falls on “capital” when this expression rears its head. The purpose of human capital is to benefit an employer — that is, it’s what you bring to the job. That means the employee can be treated as a commodity, whose salary and benefits amount to rent for whatever attributes she has that boost the employer’s profits. (Here’s a useful distillation of that point of view.) Economists blandly employ this sort of thinking every day: You are what you’re worth. But it is also possible to place the emphasis on “human.” I found a brief but rather touching post on deloitte.com that urges thinking about your employees as more than additions and subtractions on the balance sheet. Unlike physical capital, human capital needs to be nurtured and recognized for its good work; if it feels mistreated, it can always leave the employer high and dry. As long as there’s another boss out there willing to be a little more humane and less capitalist. (Of course, the employer is also free to rescind investments in human capital, in the form of education, vocational training, affordable housing, better health care (or child care), etc. If the boss isn’t satisfied with the return, he can always cancel the benefits.)
Many screeds stand to be written about this phrase, so glibly tossed around by bureaucrats and technocrats. To me its most disturbing aspect is the way it makes us worth anything only insofar as we contribute to the gross domestic product — only as long as someone is making a buck off us. The category “human capital” is generally opposed to “physical capital,” but they are both judged by their profit potential; all other talents, abilities, and attractions are strictly subservient. Another point against the phrase: it turns us all into servants — in fact, you don’t have to mumble much for it to resemble “human chattel,” which may in turn remind us of cattle. It’s true that even the few at the top are, strictly speaking, part of the whole economy’s pool of human capital, and therefore serve the same remorseless, soulless capitalist machine as the rest of us. But the one percent — who may, like the machine, have little in the way of soul — have grasped the levers of power. They may serve the system, but they don’t serve the boss.
chick flick (1990′s | journalese | “tearjerker,” “movie pitched at women”)
chick magnet (2000′s | “something that draws the ladies,” “Adonis”)
These two phrases came along at about the same time — the mid-nineties — and both seem to reflect Commonwealth influence. The case is especially clear for “chick magnet,” which appeared almost exclusively in Australian, British, and Canadian sources until 2000 or so, and remains more common there to this day, according to LexisNexis. “Chick flick” started out at about the same frequency in the U.S. as in other Anglophone countries. It took a few years for “chick flick” to settle as the invariable term (Phrase Finder has a good history). In “Sleepless in Seattle” (1993), Tom Hanks uses the phrase “chick’s movie,” and variants with and without the possessive could all be heard for a few years there. “Chick magnet” never experienced the same flux in form beyond the odd apostrophe-s, but it could (and can) mean different things. One: a person (generally a man) unusually attractive to women. (You might prefer not to be reminded, but “chick magnet” became a minor epithet — as opposed to all the major epithets — for Bill Clinton during the Lewinsky scandal.) Two: a creature that attracts women (e.g., “Get a dog. They’re real chick magnets.”). Three: an object that attracts women. My girlfriend’s daughter showed me a “vine” (a six-second looping video) in which a teen-age boy calls a Lamborghini “a real chick magnet.” My sense is that when the term first slipped into the language, the first usage predominated. Now I think the second and third have overtaken it, but all three are still available.
A “chick flick” denotes a film designed to appeal to a specifically female audience; that is, to attract a more abstract population of millions of women rather than the handful of women hanging around the park, or the bar, at any given time. Chick flicks may rely on weepy or Harlequin Romance clichés to do their work, but they may also draw their effectiveness from strong women characters that crowd out or overshadow the men. (“Thelma and Louise” and “A League of Their Own,” both released near the dawn of the chick-flick era, did not send women flocking to the cinema because they were fuzzy, heartwarming stories with lots of muscular men with hearts of gold.) The very strong implication is that the men in the audience are also crowded out. We’d rather go watch James Bond or Jim Carrey. Confession: I loved “Dumb and Dumber.” Out of character, I hope, but I cannot tell a lie.
The real question here is how “chick,” a word already unpleasantly musty and at least vaguely insulting when I was a kid, wormed its way back into our vocabulary. If these phrases really did arise in England and Australia, it may be that the word was less ominous over there. I believe “chick” meaning girl or woman is primarily a U.S. locution that had its moment in the sun in the early and mid-twentieth century — it’s tempting to suggest that it descends from W.C. Fields’s primordial “chickadee,” but that’s pure folk etymology, and I abjure it in the absence of evidence. By the time “chick flick” and “chick magnet” came along, it had been at least a generation since discreet people stopped using “chick” that way, and no doubt it had lost most of its sting. But I don’t hear adults calling women “chicks” even now, except maybe jocularly, and if kids do it today, it’s retro-slang. Now this may be a simple case of hipster irony taking an old word or concept, bending it a bit, and breathing new life into it (“chick lit” is a related example). It is not an example of an oppressed minority twisting a term of contempt into a proud epithet, however. (At least, I don’t think so; here’s another point of view.) Women may use these phrases (particularly “chick flick”), but they did not arise among women or feminists. A recent movie and a video game both used “Chick Magnet” as their title, and both exemplify the purest male fantasy about effortless sexual conquest. The recrudescence of “chick” does not strike me as harmless; the forces of degradation never sleep, and lots of people (not all of them men) continue to resent the gains women have made in the last fifty years. And if “broad” starts to sneak back into the language in the guise of lighthearted cultural commentary, you’ll know I’m right.
surgical strike (late 1980′s | militarese | “raid”)
I had thought this phrase first appeared during the Vietnam War, but most probably it came along earlier, at the time of the Cuban missile crisis, when some officials high up in the administration advocated “surgical strikes” against Soviet installations (they were overruled). Several authors who were privy to these deliberations recorded the phrase in later memoirs, although it doesn’t seem to have made it into the public record that early. Every few years, an event would breathe new life into the phrase. A proposed but unexecuted Soviet attack on Chinese nuclear plants in 1969. The raid on Entebbe (1976). The hostage crisis in Iran. The war in Panama — when a surgical strike to kill or capture Noriega was deemed too risky, and we had to send in troops instead. By 1990 it was turning up often enough in the mainstream press that an informed reader had to know what it meant. By the new millennium, it could be used to talk about situations other than warfare, as in economic policy, politics more generally, and even labor-management conflict (as in an action where a small group of vital employees walks out). A reliance on drone warfare in recent years has given the term another boost. Actually, it’s hardly ever used in the sense of “work stoppage”; the “strike” always has to do with an attack of some kind, whether against missile silos or undesirable people or urban poverty.
Originally, surgical strikes denoted attacks on places — one of the things that made them “surgical” was that there weren’t supposed to be any dead people littering the landscape — but now the term is at least as likely to refer to an attack on a person or a small group of people. The onset of drone warfare has cemented this association, although all the loose talk of bombing Iran’s nuclear facilities demonstrates that the old meaning has not gone anywhere. One feature both kinds of strike have in common: a tendency to kill unintended victims, whether because a bomb goes astray or the wrong people happen to be near the target.
The vision of clean, precise death and destruction is very beguiling, and surgical strikes do pay off sometimes. But generally they work best in the fantasies of hawks and their adherents. We like the idea because it seems to get around the indiscriminate nature of warfare. Take out one lousy reactor, or terrorist cell, and everything would be so much better. A few bombs, a couple of deaths, a little ruined acreage add up to a small price to pay for peace in our time. In real life, the payoff isn’t usually as big as advertised. A surgical strike is pretty much impossible in an urban area, and even when it works, it creates blowback down the road. But the real problem is how easy it is for a few bombing runs to lead to protracted, inextricable wars, overt or covert. We must tread very carefully around anything that makes offensive military action seem tempting. Most of the time, the surgical strike is nothing more than a shortcut to more useless violence, more senseless death.
food insecurity (2000′s | bureaucratese | “not knowing where your next meal is coming from,” “malnutrition”)
Just as “dysfunctional family” is a classic example of therapese, “food insecurity” is unmistakable bureaucratese. The first use I found in LexisNexis dates from 1977, uttered by none other than Lester R. Brown, environmental crier in the wilderness for nigh onto fifty years now. But most early examples of the term come from reports by the UN, the World Bank, or other such do-gooder organizations. The adjective version, “food insecure,” pops up for the first time in 1988. In the early years, it was generally used in the context of talking about hunger in Africa, but now it applies readily anywhere.
In its earliest uses, “food insecurity” was generally left undefined, and it may have had a broader connotation ca. 1980. In 1983, the co-founder of the Club of Rome, Aurelio Peccei, used the term in an address to the Club: “In the past, the concept of food security could never become a cultural value, because food insecurity was then the norm. Only more recently, since it has been shown that enough food could be produced to satisfy all human needs, has food security become a moral and humanitarian issue.” As in Brown’s use of the phrase, “food insecurity” seems to portend problems on a global scale, rather than on local or even national levels. Food insecurity leads to political insecurity and even revolution, not just individual uncertainty about access to sustenance.
By 1990, when U.S. agencies were classifying individuals and families as “food-insecure,” the USDA defined it as “limited or uncertain availability of nutritionally adequate and safe foods or limited or uncertain ability to acquire acceptable foods in socially acceptable ways.” The Associated Press (1999) summarized it as “unable to meet basic food needs at all times.” It’s not quite as bad as hunger, but it’s only one shaky step above that; any reversal of fortune can kick you into a much worse state. It’s closer to “malnutrition” (or “malnourished”), which didn’t mean you were starving, but did mean you were underfed.
“Food insecurity” is a functionary’s attempt at precision, a phrase devised to denote a state that isn’t out-and-out hunger but still something we have to worry about. (If your goal is to eliminate hunger, you have to get rid of food insecurity as well, because a certain number of the food-insecure will become hungry sooner or later.) The proliferation of bureaucratic vocabulary may arise from an honest effort to measure and categorize more precisely, or it may just stem from carelessness and lack of attention. “Food insecurity” is an example of the former, but in everyday use, its sound irritates us. One more clumsy euphemism from the government stock and store, which apparently is as rich as Fort Knox.
food pantry (1980′s | “soup kitchen,” “food bank”)
A “food pantry” is not really the same thing as a “soup kitchen” or a “food bank.” Food pantries do not generally serve hot meals, as soup kitchens do, and unlike food banks, they operate at the retail level rather than the wholesale. A food pantry distributes donated supplies to individuals or families. It’s a little like going to the grocery store, except you don’t have to pay and the selection is nowhere near as good. At least nowadays, they are almost always run by private groups: often houses of worship, sometimes unions or community organizations. But nothing prevents the government from running them, as it might maintain homeless shelters.
“Food pantry” used in this sense hardly shows up before the late 1970′s in Google Books; by the mid-1980′s it’s fairly common. Before that it was a mildly redundant way to say “pantry,” mostly used literally. “Pantry” has always struck me as a slightly odd word, but never until I sat down to write this entry did I look up its origin. The word has existed in English since 1300 or so, and it comes from the Old French word for “bread (storage) room.” All this talk of bread reminds me of another fun old word for a place where food is stored, “buttery,” which had nothing to do with butter. The buttery was the storeroom for butts, that is, casks or barrels.
LexisNexis spits out a spate of articles about food pantries’ efforts to alleviate hunger in the early 1980′s. I had trouble thinking of a precise pre-1980 equivalent for this expression, probably because there wasn’t one. (I remember schoolwide canned goods drives in the 1970′s, and my parents delivered food packages to shut-ins.) Before 1980, there were plenty of hungry people who needed help, but they got it in ways that didn’t require them to go to food pantries. If you got food stamps, you went to the store and stocked up. If you subscribed to Meals on Wheels, the food came to you. But a number of trends came together in 1981: unemployment went up as Paul Volcker’s Fed sharply restricted the money supply. That brought down inflation, but it drove up misery. The homeless population increased sharply — partly because a lot of people had recently been released from mental institutions — and started to include many more women and children. And the Reagan administration worked to cut federal aid to the needy, so food stamps were harder to get and bought less. (Reagan and his men also made it respectable to drag out the old canard that a lot of people getting government aid didn’t deserve it because they were lazy. The sneers directed at the poor went right along with tax policies that made life easier for a few at the top and harder for everyone below them, a trend that has continued unabated to this day.) Add it all together, and you had a much larger number of people in need at the very moment government assistance was shrinking. Concerned citizens did what they could to pick up the slack, but a shaky network of small groups dependent on a few active members and local donations lacks the reach and power of a national effort led by the federal government. Reagan succeeded in casting the Great Society into disrepute, but its replacement is much more fragile, much more easily overwhelmed.
glass ceiling (1990′s | journalese | “sex discrimination,” “no room at the top”)
An unusual expression in that it seems to have been invented, or at least brought to the world’s attention, at a specific, detectable moment. Or perhaps that is an illusion based on the superstition that LexisNexis is infallible. Every on-line source that ventures an opinion gives credit to a magazine editor named Gay Bryant, quoted in Adweek, March 1984. The phrase took off — according to LexisNexis — in the second half of 1986, with a sharp increase in instances, including uses by Betty Friedan and Katharine Graham. A book titled “Breaking the Glass Ceiling” was published in 1987. By the early nineties the phrase was well settled. The speed with which it took its place in our vocabulary suggests a certain pent-up demand.
“Glass ceiling” has not changed much since then, other than to attach itself to groups beyond women, like African-Americans. My sense is that it is still used far more often of women than of any other group. A glass ceiling is an unacknowledged barrier to advancement to the top levels of an organization. More specifically, it is the name for the attitudes and actions of male executives, who find ways to prevent women from advancing into the highest circles of management (I almost typed “hell” — but according to Dante, the worst circles of hell were the lowest). The image suggests women who are close enough to real power to see it, but unable to reach the goal. It’s transparent (i.e., unacknowledged), like glass, yet impenetrable, like a ceiling. You can see it, but you can’t get there. In an earlier age, we might have referred to such women as “(left) out in the cold.”
Every so often when you study an expression, you find alternative meanings that saw print once or twice and quickly disappeared. Usually they are as plausible as the winning meaning — often more — and it is not always clear why they disappeared. I can’t resist noting two that I came across. The first, from American Banker, November 22, 1985: “The response from the regulators and the Congress was a greater degree of voyeurism, and you and I found ourselves with offices that had glass walls and a glass ceiling. . . . The Comptroller’s office went on a witch-hunt, and the next thing we knew, our office had a glass floor and they were looking up our pants legs.” The emphasis falls on transparency, not impassability. Working beneath a glass ceiling means there’s no place to hide and you can’t get away with anything. Less than a year later, once again in American Banker, the president of the National Association of Bank Women said, “We’re not seeing women move into the very executive levels any more [any more?! — ed.]. There seems to be a glass ceiling that’s there. I call it a glass ceiling because I think it’s a fragile one and that it is going to be shattered.” Another familiar kind of glass — the kind that breaks easily. In retrospect, her optimism seems unwarranted, but the way she uses the expression seems at least as satisfying as the one we accept today.
In the military, you may hear about the “brass ceiling” that keeps women out of the top ranks. In England, the “class ceiling” hinders the upwardly mobile who went to state schools. Maybe there are other imitators out there as well. When a woman does reach the inner sanctum, she may be said to “crack,” “break,” or “shatter” the glass ceiling. The problem is more pervasive, however. If one or two extraordinary women here and there get through, that doesn’t mean a path is cleared for everyone else. There will be a glass ceiling until it’s as normal and commonplace for women to take the levers of power as for men. As long as qualified and deserving women aren’t promoted at the same rate as men — and that ain’t changed — the glass ceiling remains as thick as ever.
Ever-flowing gratitude to my gorgeous girlfriend, who not only nominated the phrase but pointed out that another pre-1980′s equivalent was “gentlemen’s agreement.” Extremely apt, as always.