Tag Archives: employment
bring to the table
(1980’s | businese | “have to offer,” “start out with”)
What one brings to the table by definition benefits the party already there. It is a positive term, rarely used ironically, indicating qualities that will improve an existing situation or resolve a problem. In a job interview, it’s the thing that makes you desirable. Among athletes, it’s what will make the team into a winner. In diplomacy, it’s a bargaining chip that helps move the process along. Generally, it’s what you can do to help. There was a time when it might connote baggage as well as benefit; what you brought to the table was simply what you had, good or bad. But since 1980 or so, it has taken on the favorable connotation exclusively. The phrase arose in business and government; nowadays athletes also use it a lot. To my ear at least, when a phrase becomes popular among athletes, it has stepped irrevocably over the border into cliché country. I’m not exactly sure why, but I think it has to do with the fact that professional sports figures are quick to adopt new expressions from each other and use them frequently thereafter, rarely with any imagination or creativity.
You have to keep your eye on the table, because idioms that rely on that word come from different places. “Bring to the table” calls to mind negotiation: the big table everyone sits around to hammer out an agreement. “Everything on the table” almost certainly comes out of gambling — the moment of showing your hand. “Seat at the table” could come from either, or from the dining room. To get anywhere at any table, a seat is the minimum requirement. Waiters bring things to the table all the time, but that sort of pig-headed literal-mindedness doesn’t get the blog written. In all these expressions, the table by now is purely metaphorical; when an actual table is involved, we understand it to be a play on words.
There’s a certain kind of new expression that develops a settled usage even though it is not particularly distinctive and could occur in everyday conversation without any reference to the specialized meaning. That description is a little vague, so let me offer some examples: “at the end of the day,” “be careful out there,” “do the math,” “don’t even think about it,” “good luck with that,” “I’ll shut up now,” “in a good place,” “play well with others,” “smartest guy in the room,” “what’s your point?” All of these expressions have in common an ordinariness, almost a triviality, that allows us to notice, if we think about it, that they could just as well have no meaning beyond that carried by the word string itself. And yet, when we hear such phrases, we grasp an extra dimension, so that even if the sense of the expression is not much different from the literal sense of the words, we know we are hearing a distinct expression. There must be a process that allows such utterances to transmogrify into idioms, but I don’t understand it. Is there any way to predict that “I’ll shut up now” would take on a universe of connotation while “I’ll go to the store” (so far) has not?
empty nest
(1980’s | therapese | “the house feels so empty”)
This is one of those effortless phrases. The first example I found in Google Books dates from 1968; by the late 1970’s it was turning up in the mainstream press now and then, and everyone seemed to get it right away. At that early date, it still required quotation marks and a brief gloss, but little time elapsed before the expression made itself at home. It was well established by the time a sitcom of that title debuted in 1988, spun off from The Golden Girls. “Empty nest syndrome,” an early elaboration, is the most common use of “empty nest” in adjective form; “period,” “phase,” and “blues” are other possibilities. As noun or adjective, it retains an innocent, “literal” quality — of course, the phrase is not literal at all, but its evocation of pure-hearted little birdies seems to shield it from irreverent wordplay. Even after thirty years, the phrase has not developed much of an ironic life, and it is not often used to refer to anything other than a home (or family) from which the last resident child has departed. “Empty nest” does have unlooked-for complexity when you take it apart. The first half is literally false — the nest isn’t empty because the parents are still there. The phrase as a whole requires knowledge of how birds bring up their young, sheltering them until they reach maturity, then sending them on their way.
The semantics of “empty nest” may tickle the analytical brain, but the concept appeals to the emotions, and it soon found a home in the long-running debate between parents and grown children over whether it’s really a good idea for the kids to move back in rent-free after college. The kids are all for it; parents are much more divided on the question. In my own case, the model was the great economist or perhaps sociologist Thorstein Veblen, who returned to his parents’ farm after taking a Ph.D. because he couldn’t find work, and filled the time with reading and long conversations about society and politics with his father. That sounded pretty good to me, but Dad saw disadvantages to the scheme and suggested graduate school instead, which ultimately got me out the door for good.
Not all parents are unhappy at the thought of their children moving back in. Some parents get all broken up when the last child leaves the house, and they are the most vulnerable to later irredentism on the part of their down-and-out offspring. Other parents can’t wait to see the back of their kids and have looked forward to the empty nest for years. I haven’t done a study, but I doubt such empty nesters (is it my imagination, or does that term imply a certain affluence?) relish the prospect of having their uncouth twenty-something kids cluttering the living room. This antidote to the empty nest is now known as “boomerang kid,” a term which arose within the last thirty years. By the way, that news article we’ve all read about how unprecedented numbers of college graduates are moving back in with Mom and Dad has been a staple at least since 1980. It’s a wonder anyone under forty lives on their own.
It is less true now, but in the olden days empty nest syndrome was primarily associated with women, a rough complement to the midlife crisis for men. True, mothers nostalgic for having surly kids in the house didn’t usually buy sports cars or cheat on their husbands, but both middle-age traumas mark a troubled transition to a later phase of adulthood. How can you tell “empty nest syndrome” was a well-established concept by 1985? By that time a whole new branch of the advice-for-the-lovelorn industry had already sprung up, especially in women’s magazines, soothing unhappy mothers with an endless stream of counsel and reassurance.
power nap
(1990’s | teenagese? therapese? | “siesta,” “catnap,” “forty winks”)
I grew familiar with this term as businese, through articles about frazzled employees needing a way to get back on track during the workday. That’s probably where you learned it, too, but the phrase more likely saw the light of day elsewhere. It was in use among college students in the late eighties, and still is, but it became much more familiar to the rest of us in the nineties when psychologists started pushing the benefits of resting and recharging at the office. The businese definition has largely won out, yet students even today may assign the phrase a slightly different meaning. Businesspeople use the term to mean a short period of sleep intended to increase alertness, vigor, and therefore productivity. Students use it that way, too, but it can also mean a period of deep sleep without any indication of duration. In 1988, New York Times columnist Richard Bernstein defined it as “deep sleep induced by extreme exhaustion,” and cited it as an example of college slang. That sense has not disappeared completely, though it has been largely eclipsed.
The reason it sounds like businese is that it goes with “power lunch” and “power tie,” which became clichés in the eighties, when the cult of the world-bestriding businessman, brought low for a couple of generations by the Great Depression, ramped up again. Flaunting was in, and executives took pride in asserting their prerogatives. In the early nineties, when psychologists like Dennis Shea, James Maas, and Bill Anthony began writing about the benefits of brief rest periods for white-collar workers, “power nap” made our vocabulary more productive and efficient. (I can’t resist: “Feeling logy at work? There’s a nap for that!”) But powerful people don’t generally sleep on the job if they want to stay that way, and a power nap wasn’t a way to project one’s own muscle (like a power tie) or extend one’s dominion (like a power lunch). The fit isn’t as neat as it sounds, more evidence that “power nap” was not native to businese.
In 1992, the Guardian, reporting on the U.S. military’s methods of keeping soldiers minding sensitive or complex equipment as sharp as possible, noted that those charged with such duties were instructed to rest regularly: “to avoid implications of sissiness, such rests are called ‘power naps.’” Another possible origin story for “power nap,” one I don’t find very convincing. There’s no doubt that our armed forces are a great source of euphemisms (collateral damage, anyone?), and it’s also true that there is a lot of stubborn machismo in the ranks. But even the Army must put aside long-cherished prejudices when science and experience team up to demand it. “Soldier, I order you to take a power nap before your next eighteen-hour shift!” “Yes, sir!”
No matter how many studies demonstrate that short rests during the workday improve employee performance, most bosses still view power naps as proof that workers aren’t serious about their jobs. I’m as prone as anyone to get sleepy after lunch, but I shudder to think of how my boss would react if he caught me in an actual doze. Your average boss just can’t get past that rock-bottom-line calculation: time spent sleeping is time spent not working, and you’re here to work, so sleeping on the job is dereliction, dress it up as you will. American bosses are not, on the whole, a very imaginative or innovative lot. The experts can talk till they’re blue in the face, but the boss knows what he knows. Power naps are for weaklings.
human capital
(1990’s? | academese (economics) | “employable population,” “what one has to offer”)
In 1979, an economist named Theodore Schultz won the Nobel Prize. He was noted for studying “human capital”; in fact, he used the term in his acceptance speech. At that time, the phrase remained the exclusive property of economists, in or out of academia. (The first citations in LexisNexis come from Paul Samuelson’s Newsweek columns in the mid-1970’s.) President Carter used the phrase in a Labor Day Proclamation in 1980. After that, it began to show up more often in reporting and editorials. Politicians and journalists started to use it, and it has become pretty ordinary by now.
This phrase bears a slippery resemblance to another expression that has flourished since my youth, “human resources.” If we are human capital en masse, then each of us might be considered a human resource, just another bit of carbon-based raw material for the all-embracing economy, from whom all blessings flow. But that isn’t how we use “human resources,” which doesn’t exist in the singular. It’s part of a company — the part known as “personnel” when I was a boy — in charge of hiring and firing and employee relations. Oxford Online defines “human capital” to mean “the skills, knowledge, and experience possessed by an individual or population, viewed in terms of their value or cost to an organization or country,” which covers pretty thoroughly the ways in which the term is used.
Most of the time the emphasis falls on “capital” when this expression rears its head. The purpose of human capital is to benefit an employer — that is, it’s what you bring to the job. That means the employee can be treated as a commodity, whose salary and benefits amount to rent for whatever attributes she has that boost the employer’s profits. (Here’s a useful distillation of that point of view.) Economists blandly employ this sort of thinking every day: You are what you’re worth. But it is also possible to place the emphasis on “human.” I found a brief but rather touching post on deloitte.com that urges thinking about your employees as more than additions and subtractions on the balance sheet. Unlike physical capital, human capital needs to be nurtured and recognized for its good work. If not, it can always leave the employer high and dry if it feels mistreated. As long as there’s another boss out there willing to be a little more humane and a little less capitalist. (Of course, the employer is also free to rescind investments in human capital, in the form of education, vocational training, affordable housing, better health care (or child care), etc. If the boss isn’t satisfied with the return, he can always cancel the benefits.)
Many screeds stand to be written about this phrase, so glibly tossed around by bureaucrats and technocrats. To me its most disturbing aspect is the way it makes us worth anything only insofar as we contribute to the gross domestic product — only as long as someone is making a buck off us. The category “human capital” is generally opposed to “physical capital,” but they are both judged by their profit potential; all other talents, abilities, and attractions are strictly subservient. Another point against the phrase: it turns us all into servants — in fact, you don’t have to mumble much for it to resemble “human chattel,” which may in turn remind us of cattle. It’s true that even the few at the top are, strictly speaking, part of the whole economy’s pool of human capital, and therefore serve the same remorseless, soulless capitalist machine as the rest of us. But the one percent — who may, like the machine, have little in the way of soul — have grasped the levers of power. They may serve the system, but they don’t serve the boss.
best practices
(1990’s | businese (banking?) | “proper procedure,” “agreed-upon standard”)
Many of the terms I’ve covered in this blog are genuine arrivistes on the vocabulary scene — expressions that were unknown, or expressions already in use that have picked up startling new meaning(s) since 1980 or so. What interests me about “best practices” is that its metamorphosis has been much more subtle; it has undergone a micro-evolution that has nonetheless resulted in a significant change. There’s not much in the way of semantic or grammatical fireworks here, but I’ll try to sketch it out. (Actually, there is one grammatical oddity: I can’t quite figure out whether “best practices” is singular or plural. It sounds plural but feels singular.)
The phrase “the best practice” means simply “the preferred method,” and has been so used for a long time. It refers to a way to get the job done as quickly and completely as possible. It might be plural, but was more often singular and applied in a particular situation or field. Here’s an example from the Washington Post (July 19, 1981): “When you water tomatoes, the best practice is to moisten the soil slightly deeper than the root system.” A very specific activity in which the most effective means will be clear and consistent.
Around 1980, in the banking and industry press, you start to see “best practice(s)” either shorn of its article or sometimes even taking the indefinite article. The ever-useful American Banker provides one of each. “Examples of ‘best practices’ emerging within the organization” comes from the October 14, 1980 issue; “About 15 years ago the Bank Board decided that branching was a ‘best practice’ [for S&L’s] and permitted it by regulation” dates from October 21. These are both easily recognized instances of the way we use the term now, although the indefinite article has disappeared. In this sense, it effortlessly becomes a hyphenated adjective, as in “best-practices training.” The term didn’t become all that common even in the business press until 1990 or so. By the mid-1990’s it was picked up in computerese and its use grew dramatically; the January 2000 issue of Governing Magazine cited it as a buzzword. The phrase “best practices” — no article — has now put down roots. It was doubtless influenced by a set British phrase, “best practice,” which means the same thing.
At this point you’re entitled to ask exactly what the phrase means. It promises empirical answers, derived from experience, proven to work under a given set of conditions, which may be general enough to hold up across a wide range of circumstances. Another aspect: it is always a matter of consensus; there has to be general agreement on what “best practices” might mean within the field. If there’s no widespread agreement about how to handle a certain situation, then you can’t draw up a set of best practices for it. In certain fields, at least, “best practices” requires keeping up with technical advances; they can become obsolete fast in a field like computer programming or nanotechnology (or even just lab maintenance). Often it suggests some sort of official or government approval, or at least insurance against regulatory violations. Ideally, then, “best practices” is the current consensus on the most effective methods of accomplishing the generally acknowledged goals in a given area of endeavor.
Maybe it’s just me, but the shift from “the best practice” to “best practices,” minor though it seems, marks a significant change. “The best practice” partakes of home-and-hearth common sense: we’ve tried different strategies and learned there’s a way to proceed that will work better than the others, so that’s how we’ll do it. “Best practices” is more opaque somehow; even though they are supposed to arise from the workers and managers, they are usually drawn up far from the shop floor and then handed down from above by the bosses and mandarins. “Best practices” is the descendant of the work done by much-maligned efficiency experts in the 1950’s, another example of the boss imposing ill-considered standards from above rather than allowing employees to work together to create well-founded, tested means to accomplish the organization’s goals.
Thanks again to my most prolific supplier of new expressions for suggesting this term months ago. Hang in there, Charles, I’ll get to all those other ones you’ve sent me sooner or later.