
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: parenting

latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the 19th century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “’emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail these kids today goes back a long time, too long and steady to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids don’t generally do worse than average; such studies share the fate of all research that calls easy explanations into question. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not want to relate such feelings to their parents, knowing that expressing doubt or anxiety will disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


empty nest

(1980’s | therapese | “the house feels so empty”)

This is one of those effortless phrases. The first example I found in Google Books dates from 1968; by the late 1970’s it was turning up in the mainstream press now and then, and everyone seemed to get it right away. At that early date, it still required quotation marks and a brief gloss, but little time elapsed before the expression made itself at home. It had fully arrived by the time a sitcom of that title debuted in 1988, spun off from The Golden Girls. “Empty nest syndrome,” an early elaboration, is the most common use of “empty nest” in adjective form; “period,” “phase,” and “blues” are other possibilities. As noun or adjective, it retains an innocent, “literal” quality — of course, the phrase is not literal at all, but its evocation of pure-hearted little birdies seems to shield it from irreverent wordplay. Even after thirty years, the phrase has not developed much of an ironic life, and it is not often used to refer to anything other than a home (or family) from which the last resident child has departed. “Empty nest” does have unlooked-for complexity when you take it apart. The first half is literally false — the nest isn’t empty because the parents are still there. The phrase as a whole requires knowledge of how birds bring up their young, sheltering them until they reach maturity, then sending them on their way.

The semantics of “empty nest” may tickle the analytical brain, but the concept appeals to the emotions, and it soon found a home in the long-running debate between parents and grown children over whether it’s really a good idea for the kids to move back in rent-free after college. The kids are all for it; parents are much more divided on the question. In my own case, the model was the great economist or perhaps sociologist Thorstein Veblen, who returned to his parents’ farm after taking a Ph.D. because he couldn’t find work, and filled the time with reading and long conversations about society and politics with his father. That sounded pretty good to me, but Dad saw disadvantages to the scheme and suggested graduate school instead, which ultimately got me out the door for good.

Not all parents are unhappy at the thought of their children moving back in. Some parents get all broken up when the last child leaves the house, and they are the most vulnerable to later irredentism on the part of their down-and-out offspring. Other parents can’t wait to see the back of their kids and have looked forward to the empty nest for years. I haven’t done a study, but I doubt such empty nesters (is it my imagination, or does that term imply a certain affluence?) relish the prospect of having their uncouth twenty-something kids cluttering the living room. This antidote to the empty nest is now known as “boomerang kid,” a term which arose within the last thirty years. By the way, that news article we’ve all read about how unprecedented numbers of college graduates are moving back in with Mom and Dad has been a staple at least since 1980. It’s a wonder anyone under forty lives on their own.

It is less true now, but in the olden days empty nest syndrome was primarily associated with women, a rough complement to the midlife crisis for men. True, mothers nostalgic for having surly kids in the house didn’t usually buy sports cars or cheat on their husbands, but both middle-age traumas mark a troubled transition to a later phase of adulthood. How can you tell “empty nest syndrome” was a well-established concept by 1985? By that time a whole new branch of the advice-for-the-lovelorn industry had already sprung up, especially in women’s magazines, soothing unhappy mothers with an endless stream of counsel and reassurance.



nutty-crunchy

(1990’s | counterculturese? journalese? | “hippie” (adj.), “tree-hugging”)

“Crunchy granola” (adjective or noun) is a common variant. I remember hearing “nutty-crunchy” first around 1990, and I had to have it explained to me. (Even then, your humble maniac was hard at work.) It’s not clear to me when this expression arose, but surely not before 1980. One is expected to suppress mental cross-references to the old sense of “nutty” (crazy), but detractors of the environmental movement cheerfully let them creep in. In fairness, some exponents also emphasize the “nutty” in “nutty-crunchy,” taking pride in their purity. But “crunchy” is the word you have to watch, for its overtones have changed. At first, it referred to environmentalists, with the implication that they lived off the land or at least made their own stuff. Now the implication is a little more rarefied, especially in the term “crunchy (granola) mom”: someone who gives birth with the aid of a midwife, breastfeeds, uses cloth diapers, makes her own organic baby food (but need not grow the vegetables herself), won’t eat meat, and maybe co-sleeps or refuses vaccinations. Not being a big player in the parenting game, I wasn’t familiar with this phrase until I started looking around, but we may measure its ubiquity by the number of on-line quizzes telling new mothers how crunchy they are.

A digression on “crunchy granola” used as an adjective: It continues to sound strange to me, but you do hear it; it may obliterate “nutty-crunchy,” which I sense has become less common. The short form, “crunchy,” at least sounds like an adjective. The full-length form reflects a certain exuberance — the “I’m weird and proud of it” attitude characteristic of the counterculture, the weirdness extending to the eccentric use of “granola” as an adjective. It is not clear to me whether this expression arose from the believers or the mockers, but in practice it may not matter, since the former steal from the latter all the time. The other odd thing about the yoking is the fact that the connection between granola and the counterculture does not hinge on crunchiness. “Organic granola” would make more sense, or even “nutty granola.” “Crunchy” is more evocative than either of these, and “chewy” would be worse, but I haven’t quite figured out why it became the preferred shorthand for one who is environmentally conscious, or fanatical about one’s health or childrearing practices.

Crunchy beliefs and behavior do not belong exclusively to the left or right; they are where both extremes converge. A 2006 book by Rod Dreher, “Crunchy Cons,” points out that many right-wingers do crunchy things, too. The specific manifestations may differ — right-wingers seem to do more home-schooling, for example — but both modalities boil down to rejection of the way most people obtain the necessities of life and raise their children, powered by the middle-of-the-road scientific consensus that tells us how to live our lives in a thousand minute, complicated ways. It’s an old idea in this country, though in some instances it has relied on science rather than keeping it at bay. In the nineteenth century (the word “granola,” originally a trade name, goes back to 1875) we had Graham and Kellogg; before them countercultural ideas about nutrition or lifestyle often stemmed from outlying sects like the Shakers. I’m old enough to remember Euell Gibbons, who shilled for Grape Nuts (there’s that nut again). The sixties gave natural living another boost, and the tradition goes on.



model

(1990’s | therapese?, academese (education)? | “serve as a model of or for,” “exemplify”)

Here is a verb that has turned right around in the last thirty years. Back then, when applied to conduct, “to model” meant “follow another’s example.” It was normally used with “on,” or maybe “after,” as in children modeling themselves on a popular athlete, a rock star, who knows? maybe even a parent or teacher. And now? Now it means “set an example for another.” Those responsible for raising or schooling children must do things like “model appropriate behavior,” so their charges will see their way clear to becoming civilized. When did the change occur, and why? The first example I found on LexisNexis lurked in a review of several children’s books at the beginning of 1988, but it doesn’t seem to have become commonplace for at least another decade. My recollection, which I have learned not to trust very much, is that I started to hear it certainly by the late nineties, but I’d be hard-pressed to say when it became customary. Neither is it clear to me where the word arose, although therapese seems the most likely answer. But the verb was also used in its new sense early and often in the ed biz (as the immortal (so far) Tom Lehrer called it), so the educators may have a claim to ownership as well.

The shift in meaning looks larger than it is. “Model” has been used for centuries as a noun or adjective, generally denoting a pattern or example worthy of emulation or copying, in general or in particular. The model citizen or the model of bravery or generosity, to say nothing of the artist’s model, has an ample pedigree. But Noah Webster’s dictionary gives only one definition of “model” as a verb, which encompasses the notion of following a pattern described in the first paragraph. (Artist’s or fashion models of the day “posed,” one supposes.) Scientists and economists have long used the term in a way that seems analogous to today’s meaning: “create a model of.” The phrase “role model” came along in the fifties, according to Random House, and that phrase led to a veritable gamut of post-Freudian psychological usages of “model.” Role models do not generally embody one specific quality but are thought to be worth studying across the board. Instead of emulating Washington for this or Socrates for that, we started concentrating on finding just one all-around good person to emulate. It’s hard enough just finding one pair of coattails to ride.

And why did the word change meanings? When we think of “modeling” good behavior, we think primarily of adults doing it for children. When we thought of “modeling” one’s acts after others’ examples, we thought primarily of children doing it with reference to adults. The subject-object switcheroo goes along with a cultural shift in demands on parents. In the old days, kids had to buckle down and learn to do what was expected of them in public. Responsible adults offered guidance and were expected to help, or at least not retard the process, but it was the kid’s responsibility to work out ways to control himself and make an effort to comply with social conventions. As we make ourselves at home in the twenty-first century, parents are expected to do more and more of what used to be regarded as the kids’ work. The older generation must lay everything out so plainly that no child could possibly misunderstand — willfully or otherwise — the rules they are expected to follow. The grown-ups have to come up with a way to train the kids that doesn’t require them to exert any effort or risk failure. Deprived of any stake in their own improvement, the young’uns will infallibly turn into enviable adults, right? Of course, most kids turn out o.k. if their parents do even a middling job raising them, but it seems to me that conventional wisdom — and a parade of parenting manuals — demands more of parents, and less of children, than it did in the old days.

Actually, in the previous paragraph I was channeling my sister, who was ruminating recently on changes in our child-rearing practices, and who is well-qualified in general to discuss these kids today. I may have misrepresented her views, and I alone am responsible for any errors of fact or interpretation.


soccer mom

(1990’s | journalese (polling))

Cast your minds back to 1995, those of you who go back that far. An unheralded candidate in a race for a seat on the Denver City Council, Susan Casey called herself a “soccer mom” and won the election. A year later, the phrase was heard round the world during the presidential campaign, with both parties wooing soccer moms aggressively. That was when the expression impressed itself on the national lexicon, within the span of a month or two during that singularly undramatic presidential contest. Even before that, Casey’s use of “soccer mom” had gained her minor national attention; I remember learning the phrase then. Before Casey, the phrase, when used at all, connoted no more than boosterism or helping out with kids’ soccer leagues. Despite its political path to prominence and occasional use as a code word (see below), the lowest-common-denominator meaning of the expression — “suburban mother” — emerged quickly and decisively. We form the hackneyed image of a well-off white woman ferrying the kids to various extracurricular activities in the family minivan. The so-called “Soccer Mom Madam” — the suburban mother convicted in 2012 of running a prostitution ring in New York — was so called simply because she had kids and lived outside the city; it didn’t have to do with her party affiliation, employment status, driving habits, or anything else.

The suburbs had formed the object of intense political strategizing for a generation by 1995, but soccer moms energized the bloc and influenced the 1996 election. Early sightings that summer offered definitions: “overburdened, middle income working mothers” (E.J. Dionne quoting Bob Dole’s campaign strategist Alex Castellanos, Washington Post, July 21) or “working mothers, in the suburbs, stressed out and stretched thin” (CBS News, August 29). While only one source mentions the suburbs, both include “working” in the description; I think that is not an essential component of the phrase today. By 2000, “soccer mom” had acquired a left-wing tinge. It was assumed, at least in political discourse, that soccer moms were environmental do-gooders or health nuts or something that made them objects of contempt in right-wing eyes (imagine caring about the health and well-being of your children!). Terms popularized by political consultants are subject to these sorts of shifts, because strategists live by dividing the electorate into ever-narrower slices, defined precisely enough that a certain kind of direct appeal has a good chance of reaping votes, so they try to pile on as many traits as possible to create the narrowest possible definition. Have soccer moms held onto their political clout? Political types no longer use the expression much, or make much effort to reel in their votes, not in any obvious way.

“Soccer mom” (also “hockey mom”) had no precise pre-1990 equivalent, I believe, even though the practice of driving carloads of children to this practice or that class or those lessons was widespread in my childhood. (Back then, we rode in station wagons, kids! Ah, those battleships of the road, some of ’em twenty feet long, slatternly yet majestic with or without the fake wood paneling.) There just doesn’t seem to have been a word for it, much less a socioeconomic category. Soccer hadn’t entered its boom phase in the U.S. yet, although it was closer than any of us knew. (I don’t recall that we had little-league soccer in my reasonably affluent suburb in the seventies.) “Mom” as a common noun didn’t roll off the tongue quite as easily back then, and that’s part of it, too. Then there’s the possibility that most kids didn’t have as many after-school activities that required being driven somewhere. The point is, we could have had baseball moms or ballet moms, and we didn’t. Suburban mothers were not on anyone’s radar as a political force in the seventies, and it didn’t occur to anyone that they might need a special name. When the time came, the word sprang forth to enfold (or obfuscate) a new set of assumptions about power, gender, and family.


dysfunctional family

(early 1990’s | therapese | “messed-up family,” “bad home environment”)

No prizes for guessing the origins of this popular phrase. Could there be a more typical example of therapese? I ought to investigate the uses of the word “dysfunction” and its derivatives in psychology; it occurs everywhere. (I’ve covered one instance already.) From Google Books, I deduce that “dysfunctional family” arose in the literature around 1970, at first primarily as a compound adjective. It was not unusual to see phrases like “dysfunctional family structure or pattern or system.” But the temptation to use it on its own soon grew irresistible. Psychotherapy became more widespread as the passionate sixties dwindled into the neurotic seventies, and it became fashionable to blame transgressions on one’s upbringing, as if the whole country were singing along with the Jets in “Gee, Officer Krupke.” By the end of the eighties, “dysfunctional family” was sprouting through all the usual therapese conduits: art critics, advice columnists, clergy, lawyers concocting a defense, and so forth. By 1995 it was paralyzingly common.

What is a dysfunctional family? One headed by a person or persons unable to hold a home together, of course, or more generally one that runs on dishonesty or intimidation or otherwise instills bad relationship habits into hapless children and dooms them to do the same damn things when they grow up. Therapist Barbara Cottman Becnel in 1989: “In a dysfunctional family, you’re generally taught don’t talk, don’t feel and don’t trust.” But these noble core values may effloresce in so many different ways. As Tolstoy would have said, had he thought of it, functional families are all alike; every dysfunctional family is dysfunctional in its own way.

The dysfunctional family is, after all, the root of all evil — a handy, ever-present villain that works for all non-orphans (i.e., nearly everybody). We are all more or less damaged by our upbringers. Whatever’s wrong with you, chances are your parents are to blame somehow. This became a popular sentiment, except among parents, who felt compelled to point out that lots of people could claim credit for messing up the new crop of kids. Homes, after all, are not impermeable fortresses. In an earlier century, we noticed that larger social trends and tendencies contributed to individual tribulations, and we took measures at the federal and state levels to discourage cultural forces that increased hardship and misery. Now we insist furiously that adults who haven’t turned out well are responsible for their own individual moral choices. Since “blame the individual” took precedence over “blame society,” families have become a natural target, even though, implicitly, this explanation has the effect of once again displacing blame from individuals to their environment. The contradiction between blaming the families and holding the products of their failures absolutely responsible for their own misdeeds never seemed to bother anyone. Actually, it was a nice two-for-one, having another set of people to find guilty of every single example of anti-social behavior. Plenty of upstanding Americans wanted to see any and all kinds of crime severely punished, even though they might concede, if cornered, that maybe it wasn’t entirely the individual’s fault. Blaming society was never satisfying because it was too big and abstract, but particular individuals and families are small enough targets to allow respectable types to reap some reward from reviling them.

The phrase probably would have begun to diffuse during the 1980’s no matter what. But several eighties trends (and I don’t mean Madonna or leg warmers or Brat Pack movies) gave us lots of reasons by the end of the decade to reflect on the causes of what you might call anti-social behavior. We had AIDS, we had crack, we had high murder rates, even a stock market crash. We needed an explanation for all the messed-up people out there, and pundits and bloviators lost no time in declaring that dysfunctional families had done the damage. There go those larger social forces again, hijacking the discussion. The concept of the dysfunctional family is convenient because it helps us focus on the individual wrongdoer and ignore what’s going on in the greater world. Words can be very useful that way.



cougar

(2000’s | “older woman on the make or prowl,” “cradle robber,” “(old) hussy”)

The Encyclopedia of Women in Today’s World (2011) traces “cougar” to Canada right around 2000, and it does show up first in the Canadian press; the first citation in LexisNexis appears in 2002 on a Canadian news program, one year after Canadian sex and relationships expert Valerie Gibson published “Cougar: A Guide for Older Women Dating Younger Men.” That about sums it up. Authorities may quarrel over how old a woman has to be to qualify as a cougar — most people who try to define the term say over forty — and how much younger the man has to be. Based on my limited research, the age difference has to be at least ten years, and the more the merrier. The term doesn’t seem to have taken up residence in the U.S. until after 2005; the earliest use in the New York Times dates from 2009. Writers and editors no longer feel compelled to gloss it, although as recently as two or three years ago many instances of the word in print came with an explanation. Cougars (the cats, that is) seem to be a lot more common in Canada than in the U.S., and maybe also more common in Canada than other kinds of hunting cats (not that I’m a zoologist), so it is not surprising that Canadians would have pioneered the new use of the word.

There seems to be little doubt that the word takes its new metaphorical meaning from the predatory habits of cats. Such women are conceived as hunters preying on young men. The term may be applied to women who hit the bars and bring home a new conquest every night, or who take up with one man for an extended relationship, as long as he’s noticeably younger than she. The term may be used as an insult or to express “you go, girl” solidarity. It’s a very old idea; wise young men have always understood that you can learn a lot from mature, experienced women. Today’s twist comes from the idea of older women taking public pride in, not to say gloating over, their pursuit of younger men. Cougars claim a privilege traditionally reserved for the male of the species, while celebrating their attractiveness and sex drive into middle age and beyond, staking out territory generally unavailable to women who went before.

“Urban cougar” is a common variant, the adjective probably intended to dispel any confusion with the cat prowling the woods, but most people probably grasp the word now without the elaboration, and it is likely to disappear over time. Urban Dictionary records other epithets sprung up in “cougar”’s wake — bobcat, jaguar, panther, puma, and a whole bunch that have nothing to do with cat names — which may distinguish age ranges, or something. Urban Dictionary does get a bit fanciful at times. Then there are “manther” and “cheetah,” which refer to older men chasing younger women. I rather like “manther,” and “cheetah” has a certain lex appeal.

Thanks through the ether to my nephew, who used “cougar” in conversation, little suspecting that my antennae were on full alert. Tell your dad to read my blog, kid.

tiger mother

(2010’s | “disciplinarian,” “slavedriver,” “strict mother”)

This term burst upon the scene with unusual speed and force in 2011, when law professor Amy Chua published “Battle Hymn of the Tiger Mother.” Partly a critique of western (i.e., American) child rearing, partly an account of the limits of her own Asian (i.e., Chinese) methods of raising children, the book seeded any number of blogospheric disturbances, with the usual fuss and bother and misunderstandings. I did not read the book, but my five-minute summary understanding based on a few blog posts and web videos suggests that what defines the “tiger mother” is making demands on her children, requiring them to work hard and resist complacency. The premise is open to debate — and since when does one set of child-rearing practices work for everybody? — but many critics seem to have taken the most extreme moments described in the book as typical of the tiger mother and set up a straw woman instead of engaging Chua’s primary points. My broad-brush analysis, based on sketchy research, is not to be taken entirely seriously, but this sort of distortion-by-simplification is pretty common in our discourse, or what passes for it these days, so I’m prone to suspect that something of the kind occurred in this instance.

Chua’s elder daughter has matriculated at Harvard, and apparently her younger daughter, who rebelled against her mother and forced her to relent to some extent, also is doing quite well. The older daughter’s achievements are often touted as vindication of Chua’s methods, but I can think of lots of explanations for the daughter of two high-powered Yale law professors getting into Harvard. Few people seem to have addressed the possibility that the offspring of two extremely intelligent, motivated, and well-connected parents would probably do pretty well regardless of how her mother treated her.

American parents with enough leisure and income to be self-conscious seem to enjoy entertaining misgivings about their child-rearing missteps, so Chua’s seeds fell on fertile ground, and the discussion continues, although instances of both “tiger mother” and “tiger mom” in LexisNexis fell off significantly in 2012 and are poised to drop again in 2013, barring a sudden revival of the topic. (“Tiger mom” may soon become the preferred form as “moms” complete their takeover of the language.) “Tiger parent” turns up now and then; “tiger father” or “dad” hardly at all. And that’s revealing: mothers more than fathers are now held responsible for how their kids turn out, and kids who don’t turn out well are blamed on failures of maternal discipline — not like the old days, when society held the father responsible (thanks, Liz!). The “tiger mom” also conjures up all too easily the all-too-familiar image of the shrewish mother hectoring and exhausting her children, a stereotype of long standing held against relatively successful immigrant groups. Used to be Jews; now it’s Asians.

In thirty years, the phrase may seem quaint or irrelevant, just one more passing fad ginned up by the media, social or otherwise. Too early to tell, of course. It’s not clear to me why Chua chose “tiger” as her epithet, but it doesn’t seem to have anything to do with tiger moths or Tiger Woods. Count me relieved.



tween

(1990’s | advertese | “pre-teen,” “person at that awkward age,” “kid”)

A word we owe to advertisers. It bubbled up in the late 1980’s, mainly in marketing publications, although it appeared in the mainstream press now and then, most notably in a USA Today series inaugurated in September 1989, “The Terrible Tweens.” (Royal Caribbean seems to have been an early adopter, offering both “Teen” and “Kid/Tween” programs on their cruises by the end of the 1980’s.) It took a few years to mature, but the word was solidly established within ten years and has become widely recognized and understood.

The origin of the term appears uncomplicated. The resemblance to “teen” is obvious (it’s why we don’t call them “twixts”), and the reference to the time be“tween” young child and teenager is catchy. It was defined as “those between 8 and 12 years old” in the Washington Post (January 24, 1988), which is, I suspect, about how the term would be generally understood now. Maybe 9, maybe 13, but since tweenhood may be a state of mind that need not correspond with precise ages, we should expect a little fuzziness. Some definitions showed more variation in the beginning; for example, a report on McDonald’s advertising strategy (November 9, 1988) explored its practice of marketing to subgroups including “‘tweens’ (9-to-16 year olds),” while an article in Adweek less than six months earlier gave a range of “10-15.” U.S. News (April 1989) confidently gave “9 to 15.” You could get pretty much any endpoints you wanted, but the core of prepubescents and beginner pubescents remained constant. The traditional preference for 12 or 13 as the beginning of the teenage years seems to have reasserted itself, and there’s much less tendency to incorporate full-blown teenagers into tweendom nowadays. In the beginning the word was sometimes spelled with an initial apostrophe, and sometimes you saw “tweenage” or “tweenager.” It’s a good thing those variants didn’t catch on, or we would all be heartily sick of hearing about Justin Bieber, tweenage idol.

We may see this term simply as the product of the advertiser’s restless, relentless pursuit of the bottom dollar. Whenever defenseless spending money is discovered in a sub-group of the population, the sharks of commerce circle, seeking to engross a healthy chunk of it for themselves. Somebody found out that pre-teens — some of them, anyway — had a certain amount of money, so they had to be defined, categorized, converted to data, and appealed to. Just another demographic in an ever more precisely demarcated consumer universe. Pre-teens’ embrace of social media has lately given the youngsters a new kind of consumer power (and new ways to get into trouble).

The word soon elbowed its way into the parents’ lexicon, adding one more milepost to a track stretching from colic and the terrible twos to empty nests and fledglings returning to fill them. It’s one more group to worry about, one more place the wheels can come off the cart — according to a world view in which childhood and youth are recognized as a succession of traumas. If we hope to understand our children, we must learn about the special characteristics of tweens, their developmental stages and kinks, their symptoms and syndromes, and how not to ruin them utterly (hint: anything you say or do may doom them to a bitter, ineffectual adulthood). The same urge to dissect ever more finely, to understand ever more minutely, is at work among parents as it is among advertisers.

In 1988, Polaroid (Polaroid!) offered its Cool Cam to the youth market (PR Newswire, February 19, 1988), “designed especially for trendy ‘tweens’” (defined here as “the latest demographic label for the 9- to 14-year-old set”). The “tween,” understood as another subgroup of the youth population, was very new then. Nowadays cascades of carefully orchestrated opportunities to spend money confront tweens at every turn, including a fashion designer for tweens who is herself a tween (she promises “blood, sweat, and glitter”). They have money, they have Twitter, and they know how to use them. The rest of us had better stand back.


helicopter parent

(2000’s | academese (education) | “overprotective parent,” “one who keeps kids on a short leash,” “nervous nellie”)

I didn’t encounter this expression until fairly recently, so I was surprised to find a number of instances from the 1990’s on LexisNexis. It’s usually credited to Jim Fay and Foster Cline, a teacher and psychiatrist, creators of a philosophy of raising children known as Love and Logic. Fay used the word in a book title, “Helicopters, Drill Sergeants, and Consultants” (1994); the three categories, as you might have guessed, are three types of parents. That was not the first recorded use by any means; “helicopter parent” turned up in Newsweek in 1991 as an example of teachers’ slang. In 2004, Fay noted that he had started writing about “helicopter parents” thirty years earlier, although I haven’t found any instances of the term before 1990 or so. We certainly owe the popularity of the expression, if not its origin, to Fay and Cline. When used in the 1990’s, it was invariably glossed. A helicopter parent hovers, supervises every corner of the child’s existence, and swoops down to rescue the child from difficulty, resulting in young adults who can’t make decisions or deal with adversity. The phrase was much more common by 2005, but a definition was generally offered even then. Responsible writers didn’t feel they could just drop it casually without cluing in readers.

If Fay and Cline didn’t originate the term, another educator did; at first it was used mainly in educational contexts, and generally still is. The term was most commonly applied in its early life to parents of college students, characteristically used by college administrators to describe parents who refused to let their children exercise a little responsibility. But it wasn’t long before it came into more general use to talk about all parents with school-age children — old enough to leave the house and get into trouble. I learned the term from The Simpsons (“Father Knows Worst,” 2009), but it was well and truly current by then; I was behind the curve, as usual. It has spawned a few competitors: I’ve seen “lawnmower parent” (removes obstacles from child’s path) and even “Humvee parent” (defined as “ready to roll over rough terrain and ‘rescue’ my child”). These three terms all refer to fundamentally the same thing, excessive or harmful interference in the child’s life. There are plenty of ways for parents to screw up their kids, and it shouldn’t be too hard to come up with more. What we really need is cute expressions for negligent parents who don’t do enough for their kids. “Absentee parent” is too obvious. “Three-toed parent”? “Blue moon parent”? Help me out here.

We are entitled to wonder how much such oppressive childrearing is due to concern for the children and how much should be chalked up to parents’ amour-propre, or fear of the judgment of their merciless peers. In some communities, parents do seem to be more rivals than anything else. The more you do for your kid, the better parent you are. So do your kid’s homework, take his side against the teacher, pull strings, bail her out when she gets in a jam. Don’t let your kids learn from their mistakes, because making mistakes is in the first place a reflection on you. Self-reliance is a very old strain in the American character (there was an old fellow named Emerson . . . ), but some of us, however indomitable we may be on our own behalf, can’t seem to pass it on to our offspring.


physically challenged

(late 1980’s | bureaucratese? | “handicapped,” “disabled”)

Always understood to be a euphemism, and soon lampooned, this phrase never quite made its way. It was used in the Democratic Party platform in 1980 as a substitute for “handicapped,” becoming the latest in a line of euphemisms for “crippled” or “unable to move like most people.” “Handicapped,” current in bureaucratese by the 1920’s, had finally conquered by the 1970’s, with “disabled” (less euphemistic) being the primary alternative. “Physically challenged” and “differently abled” came along in the 1980’s, and for a while it looked like “physically challenged” might muster the votes to take over the top spot. But it hasn’t happened; “disabled” and “handicapped” remain more common in everyday speech. “Special needs” (adjective) has since tossed its hat into the ring but seems to be applied by preference to children or students.

Perhaps the problem was that “physically challenged” lent itself so readily to parody, and we all had fun with it in the late eighties and early nineties. Everybody remembers “vertically challenged” (“short”), but there were others. “Temporally challenged” (always late). “Verbally challenged” (bad with words). “Follicly challenged” (bald). “Pigmentationally challenged” (albino). For a few years there, if you were caught in any sort of inferiority, you could grin ruefully and say, “I’m ___-challenged” and draw a laugh. The phrase got caught up in the political correctness backlash — when the traditionally privileged got mad because the government dared to help anyone else — and ridicule was a primary weapon. The most recent crop of euphemisms made the easiest targets, which left room for the old euphemisms to retain their supremacy.

I guess the idea was that “handicapped” made you sound too passive or too much like a victim, whereas “physically challenged” made it sound like you could overcome your obstacles with grit and determination. There may have been a submerged battle over who should pick names for minority or disadvantaged groups. For a while, there was a grumpy consensus that we should call such groups what they wished to be called, with the understanding that it might change over time. It seemed the least we could do. That worked fairly well for “gay” or “African-American,” maybe even “hearing-impaired,” but not everybody in a wheelchair wanted to be known as challenged, much less differently abled. And plenty of people, mostly PC-bashers, thought we should keep the good old euphemisms even if they had accreted some unpleasant connotations. And so “physically challenged” never quite lived up to its potential.

pick your battles

(late 1990’s | therapese? | “pick your spots,” “don’t blow this out of proportion,” “know when to quit,” “don’t sweat the small stuff”)

This phrase used to turn up predominantly in two kinds of books: childrearing advice and advice to minorities (including women) trying to get ahead in business. Your boss and your toddler have about equal power, or equal ability to insist on their own way, and it’s unwise to spend too much time resisting or arguing. The expression now is used more generally, but most of the hits yielded by Google Books in the eighties came from books for parents. “Choose your battles” is a common variant. The practice of following either phrase with “carefully” or “wisely” is still common, but not as much as in the good old days.

A number of reference web sites assert that “pick your battles” is an old proverb or dictum, but Google Books shows that the phrase barely existed before 1980. “Origin unknown” shadows it in all the on-line dictionaries. A few adventurous souls seek its source in military strategy (here’s an example), but Sun-Tzu never seems to have said anything that translates as “pick your battles,” although some of his principles are obviously related. I can’t find any evidence that a general or strategist originated the expression. Several sites attribute it to Dale Carnegie, but I’m dubious. I haven’t seen a citation anywhere, and although it is used on a couple of sites run by the Dale Carnegie people, they never claim that he used (much less originated) the phrase. The web being the echo chamber that it is, one site could have screwed up the quotation and others just copied without checking. For the record, here is the complete quotation: “Any fool can criticize, complain, condemn, and most fools do. Picking your battles is impressive and fighting them fairly is essential.” One quotation site (reliability unknown) lists the first sentence but not the second. I find “picking your battles is impressive” a little cryptic and clumsy, not typical of Carnegie’s style, although the sentiments are plausible. I’d love to find a trail directly back to Carnegie; as influential as he was, he’d make a great origin point, but without better evidence, I don’t buy it. Likewise with the Washington Post review of a biography of Edward R. Murrow (December 2, 1988), in which the line “You have to choose your battles” is attributed to Murrow, but it isn’t clear when he delivered it.

The expression means simply “save your energy for important matters,” or, less often, “intervene only where you’ll be most effective.” Or, more simply, “there just isn’t enough time to do everything.” Sometimes the emphasis falls more on saving time and energy; sometimes more on fighting hard for what’s important to you. You let most of the crap go by and try to avoid wasting your strength, or using up your reservoir of goodwill.
