Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: childrearing

latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the 19th century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “‘emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail these kids today goes back a long time, too long and steady to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids don’t generally do worse than average; such findings share the fate of all research that calls easy explanations into question: they are brushed aside. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not want to relate such feelings to their parents, knowing that expressing doubt or anxiety will disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


empty nest

(1980’s | therapese | “the house feels so empty”)

This is one of those effortless phrases. The first example I found in Google Books dates from 1968; by the late 1970’s it was turning up in the mainstream press now and then, and everyone seemed to get it right away. At that early date, it still required quotation marks and a brief gloss, but little time elapsed before the expression made itself at home. It was well established by the time a sitcom of that title debuted in 1988, spun off from The Golden Girls. “Empty nest syndrome,” an early elaboration, is the most common use of “empty nest” in adjective form; “period,” “phase,” and “blues” are other possibilities. As noun or adjective, it retains an innocent, “literal” quality — of course, the phrase is not literal at all, but its evocation of pure-hearted little birdies seems to shield it from irreverent wordplay. Even after thirty years, the phrase has not developed much of an ironic life, and it is not often used to refer to anything other than a home (or family) from which the last resident child has departed. “Empty nest” does have unlooked-for complexity when you take it apart. The first half is literally false — the nest isn’t empty because the parents are still there. The phrase as a whole requires knowledge of how birds bring up their young, sheltering them until they reach maturity, then sending them on their way.

The semantics of “empty nest” may tickle the analytical brain, but the concept appeals to the emotions, and it soon found a home in the long-running debate between parents and grown children over whether it’s really a good idea for the kids to move back in rent-free after college. The kids are all for it; parents are much more divided on the question. In my own case, the model was the great economist or perhaps sociologist Thorstein Veblen, who returned to his parents’ farm after taking a Ph.D. because he couldn’t find work, and filled the time with reading and long conversations about society and politics with his father. That sounded pretty good to me, but Dad saw disadvantages to the scheme and suggested graduate school instead, which ultimately got me out the door for good.

Not all parents are unhappy at the thought of their children moving back in. Some parents get all broken up when the last child leaves the house, and they are the most vulnerable to later irredentism on the part of their down-and-out offspring. Other parents can’t wait to see the back of their kids and have looked forward to the empty nest for years. I haven’t done a study, but I doubt such empty nesters (is it my imagination, or does that term imply a certain affluence?) relish the prospect of having their uncouth twenty-something kids cluttering the living room. The antidote to the empty nest now has a name, the “boomerang kid,” a term that arose within the last thirty years. By the way, that news article we’ve all read about how unprecedented numbers of college graduates are moving back in with Mom and Dad has been a staple at least since 1980. It’s a wonder anyone under forty lives on their own.

It is less true now, but in the olden days empty nest syndrome was primarily associated with women, a rough complement to the midlife crisis for men. True, mothers nostalgic for having surly kids in the house didn’t usually buy sports cars or cheat on their husbands, but both middle-age traumas mark a troubled transition to a later phase of adulthood. How can you tell “empty nest syndrome” was a well-established concept by 1985? By that time a whole new branch of the advice-for-the-lovelorn industry had already sprung up, especially in women’s magazines, soothing unhappy mothers with an endless stream of counsel and reassurance.


model

(1990’s | therapese?, academese (education)? | “serve as a model of or for,” “exemplify”)

Here is a verb that has turned right around in the last thirty years. Back then, when applied to conduct, “to model” meant “follow another’s example.” It was normally used with “on,” or maybe “after,” as in children modeling themselves on a popular athlete, a rock star, who knows? maybe even a parent or teacher. And now? Now it means “set an example for another.” Those responsible for raising or schooling children must do things like “model appropriate behavior,” so their charges will see their way clear to becoming civilized. When did the change occur, and why? The first example I found on LexisNexis lurked in a review of several children’s books at the beginning of 1988, but it doesn’t seem to have become commonplace for at least another decade. My recollection, which I have learned not to trust very much, is that I was hearing it regularly by the late nineties, but I’d be hard-pressed to say when it became customary. Neither is it clear to me where the word arose, although therapese seems the most likely answer. But the verb was also used in its new sense early and often in the ed biz (as the immortal (so far) Tom Lehrer called it), so the educators may have a claim to ownership as well.

The shift in meaning looks larger than it is. “Model” has been used for centuries as a noun or adjective, generally denoting a pattern or example worthy of emulation or copying, in general or in particular. The model citizen and the model of bravery or generosity, to say nothing of the artist’s model, have an ample pedigree. But Noah Webster’s dictionary gives only one definition of “model” as a verb, which encompasses the notion, described in the first paragraph, of following a pattern. (Artist’s or fashion models of the day “posed,” one supposes.) Scientists and economists have long used the term in a way that seems analogous to today’s meaning: “create a model of.” The phrase “role model” came along in the fifties, according to Random House, and that phrase led to a veritable gamut of post-Freudian psychological usages of “model.” Role models do not generally embody one specific quality but are thought to be worth studying across the board. Instead of emulating Washington for this or Socrates for that, we started concentrating on finding just one all-around good person to emulate. It’s hard enough just finding one pair of coattails to ride.

And why did the word change meanings? When we think of “modeling” good behavior, we think primarily of adults doing it for children. When we thought of “modeling” one’s acts after others’ examples, we thought primarily of children doing it with reference to adults. The subject-object switcheroo goes along with a cultural shift in demands on parents. In the old days, kids had to buckle down and learn to do what was expected of them in public. Responsible adults offered guidance and were expected to help, or at least not retard the process, but it was the kid’s responsibility to work out ways to control himself and make an effort to comply with social conventions. As we make ourselves at home in the twenty-first century, parents are expected to do more and more of what used to be regarded as the kids’ work. The older generation must lay everything out so plainly that no child could possibly misunderstand — willfully or otherwise — the rules they are expected to follow. The grown-ups have to come up with a way to train the kids that doesn’t require them to exert any effort or risk failure. Deprived of any stake in their own improvement, the young’uns will infallibly turn into enviable adults, right? Of course, most kids turn out o.k. if their parents do even a middling job raising them, but it seems to me that conventional wisdom — and a parade of parenting manuals — demands more of parents, and less of children, than it did in the old days.

Actually, I was channeling my sister in the previous paragraph, who was ruminating recently on changes in our child-rearing practices, and who is well-qualified in general to discuss these kids today. I may have misrepresented her views, and I alone am responsible for any errors of fact or interpretation.


soccer mom

(1990’s | journalese (polling))

Cast your minds back to 1995, those of you who go back that far. An unheralded candidate in a race for a seat on the Denver City Council, Susan Casey called herself a “soccer mom” and won the election. A year later, the phrase was heard round the world during the presidential campaign, with both parties wooing soccer moms aggressively. That was when the expression impressed itself on the national lexicon, within the span of a month or two during that singularly undramatic presidential contest. But Casey’s use of “soccer mom” gained her minor national attention; I remember learning the phrase at that time. Before Casey, the phrase, when used at all, connoted no more than boosterism or helping out with kids’ soccer leagues. Despite its political path to prominence and occasional use as a code word (see below), the lowest-common-denominator meaning of the expression — “suburban mother” — emerged quickly and decisively. It evokes the hackneyed image of a well-off white woman ferrying the kids to various extracurricular activities in the family minivan. The so-called “Soccer Mom Madam” — the suburban mother convicted in 2012 of running a prostitution ring in New York — was so called simply because she had kids and lived outside the city; it had nothing to do with her party affiliation, employment status, driving habits, or anything else.

The suburbs had been the object of intense political strategizing for a generation by 1995, but soccer moms energized the bloc and influenced the 1996 election. Early sightings that summer offered definitions: “overburdened, middle income working mothers” (E.J. Dionne quoting Bob Dole’s campaign strategist Alex Castellanos, Washington Post, July 21) or “working mothers, in the suburbs, stressed out and stretched thin” (CBS News, August 29). Only one source mentions the suburbs, but both include “working” in the description; I don’t think employment is an essential component of the phrase today. By 2000, “soccer mom” had acquired a left-wing tinge. It was assumed, at least in political discourse, that soccer moms were environmental do-gooders or health nuts or something that made them objects of contempt in right-wing eyes (imagine caring about the health and well-being of your children!). Terms popularized by political consultants are subject to these sorts of shifts. Strategists live by dividing the electorate into ever narrower slices, each defined precisely enough that a direct appeal has a good chance of reaping votes, so they pile on as many traits as possible to create the narrowest workable definition. Have soccer moms held onto their political clout? Political types no longer use the expression much, or make much effort to reel in their votes, not in any obvious way.

“Soccer mom” (also “hockey mom”) had no precise pre-1990 equivalent, I believe, even though the practice of driving carloads of children to this practice or that class or those lessons was widespread in my childhood. (Back then, we rode in station wagons, kids! Ah, those battleships of the road, some of ’em twenty feet long, slatternly yet majestic with or without the fake wood paneling.) There just doesn’t seem to have been a word for it, much less a socioeconomic category. Soccer hadn’t entered its boom phase in the U.S. yet, although it was closer than any of us knew. (I don’t recall that we had little-league soccer in my reasonably affluent suburb in the seventies.) “Mom” as a common noun didn’t roll off the tongue quite as easily back then, and that’s part of it, too. Then there’s the possibility that most kids didn’t have as many after-school activities that required being driven somewhere. The point is, we could have had baseball moms or ballet moms, and we didn’t. Suburban mothers were not on anyone’s radar as a political force in the seventies, and it didn’t occur to anyone that they might need a special name. When the time came, the word sprang forth to enfold (or obfuscate) a new set of assumptions about power, gender, and family.


meltdown

(1980’s | enginese | “disintegration,” “sudden sharp decline,” “tantrum”)

A term ushered into everyone’s lexicon by a film called The China Syndrome and a nuclear power plant called Three Mile Island in March 1979, the year before this blog’s usual cut-off date (admittedly a highly movable and redefinable affair). The word existed before then, certainly, most likely invented by nuclear engineers. The failure of a nuclear reactor’s cooling system would cause the fuel rods to heat up uncontrollably, so that they would melt through their thick-walled chamber into the ground, with unknown effects — but it wasn’t hard to imagine massive releases of ionizing radiation with dire long-range health consequences, even if catastrophe were avoided in the short run. “Meltdown” in any of its varieties has always been associated with disaster and, just as fundamentally, with the inability to control events. The first use I saw on LexisNexis dated from 1976 (Newsweek), and it turned up here and there for the next two or three years, before the deluge. “Core meltdown” was a common elaboration back then, although the modifier was unnecessary even before Three Mile Island. A steady stream of disasters, nuclear and otherwise, has kept the word in the news ever since.

“Meltdown” burst on the scene propelled by an unholy mix of popular culture and what passes for real life around here (it was real enough to those of us who grew up close to Harrisburg, unsure whether we would have to evacuate). It was one of those expressions, like “go postal,” “bobbitt,” or “been there, done that,” that roared into the language. And what since? The word moved quickly into other contexts, and by the mid-eighties it was natural, if slightly fast, to use it to talk about financial collapses, or sports teams blowing a big lead. This use points up a question about meltdowns: How sudden are they? Screwing up the economy or losing a game takes place over an appreciable period of time, but we also use the word to denote a more or less instantaneous downfall. “Meltdown” was quickly extended to glaciers and ice sheets, too — another gradual process — as it is still used today. In that context, at least, it makes a certain amount of sense to talk about something melting down. Otherwise, “melt down” doesn’t seem comfortable as a verb. Your kid may have a meltdown, but your kid doesn’t melt down, like a sno-cone on a hot day. It’s much more dramatic than that.

Which brings us to the semantic leap wrought in recent years: “meltdown” meaning “tantrum” or “conniption.” I haven’t found a clear trail into the lexicon for this usage, but an informal poll of my sister, who was raising children in the eighties, confirms that like the other metaphorical uses, it was thoroughly established by the end of that decade. Nowadays meltdowns are mainly the property of celebrities and kids, but anyone can have one; it’s basically the same as “losing it.” I’m not sure that was always true. It may have been used originally to describe only children’s tantrums and spread to the rest of us from there. The word does capture the cataclysmic violence of a screaming fit delivered by a child bent on having his or her way, a child who has lost all restraint, like a reactor core that, in the grip of an uncontainable nuclear reaction, is no longer responsible for its actions. When used to describe human behavior, the word might be considered indulgent — a way of excusing or mitigating bad conduct by implying that the offender isn’t really responsible — or it might just be a weary acknowledgment of the inevitable. Thwarted kids can be damnably anti-social, and sometimes you just can’t keep them from going overboard.


tween

(1990’s | advertese | “pre-teen,” “person at that awkward age,” “kid”)

A word we owe to advertisers. It bubbled up in the late 1980’s, mainly in marketing publications, although it appeared in the mainstream press now and then, most notably in a USA Today series inaugurated in September 1989, “The Terrible Tweens.” (Royal Caribbean seems to have been an early adopter, offering both “Teen” and “Kid/Tween” programs on their cruises by the end of the 1980’s.) It took a few years to mature, but within a decade the word was solidly established and widely understood.

The origin of the term appears uncomplicated. The resemblance to “teen” is obvious (it’s why we don’t call them “twixts”), and the reference to the time be-“tween” young child and teenager is catchy. It was defined as “those between 8 and 12 years old” in the Washington Post (January 24, 1988), which is, I suspect, about how the term would be generally understood now. Maybe 9, maybe 13, but since tweenhood may be a state of mind that need not correspond with precise ages, we should expect a little fuzziness. Some definitions showed more variation in the beginning; for example, a report on McDonald’s advertising strategy (November 9, 1988) explored its practice of marketing to subgroups including “‘tweens’ (9-to-16 year olds),” while an article in Adweek less than six months earlier gave a range of “10-15.” U.S. News (April 1989) confidently gave “9 to 15.” You could get pretty much any endpoints you wanted, but the core of prepubescents and beginner pubescents remained constant. The traditional preference for 12 or 13 as the beginning of the teenage years seems to have reasserted itself, and there’s much less tendency to incorporate full-blown teenagers into tweendom nowadays. Early on, the word was sometimes spelled with an initial apostrophe; sometimes you saw “tweenage” or “tweenager.” It’s a good thing that variant didn’t catch on, or we would all be heartily sick of hearing about Justin Bieber, tweenage idol.

We may see this term simply as the product of the advertiser’s restless, relentless pursuit of the bottom dollar. Whenever defenseless spending money is discovered in a sub-group of the population, the sharks of commerce circle, seeking to engross a healthy chunk of it for themselves. Somebody found out that pre-teens — some of them, anyway — had a certain amount of money, so they had to be defined, categorized, converted to data, and appealed to. Just another demographic in an ever more precisely demarcated consumer universe. Pre-teens’ embrace of social media has lately given the youngsters a new kind of consumer power (and new ways to get into trouble).

The word soon elbowed its way into the parents’ lexicon, adding one more milepost to a track stretching from colic and the terrible twos to empty nests and fledglings returning to fill them. It’s one more group to worry about, one more place the wheels can come off the cart — according to a world view in which childhood and youth are recognized as a succession of traumas. If we hope to understand our children, we must learn about the special characteristics of tweens, their developmental stages and kinks, their symptoms and syndromes, and how not to ruin them utterly (hint: anything you say or do may doom them to a bitter, ineffectual adulthood). The same urge to dissect ever more finely, to understand ever more minutely, is at work among parents as it is among advertisers.

In 1988, Polaroid (Polaroid!) offered its Cool Cam to the youth market (PR Newswire, February 19, 1988), “designed especially for trendy ‘tweens’” (defined here as “the latest demographic label for the 9- to 14-year-old set”). The “tween,” understood as another subgroup of the youth population, was very new then. Nowadays cascades of carefully orchestrated opportunities to spend money confront tweens at every turn, including a fashion designer for tweens who is herself a tween (she promises “blood, sweat, and glitter”). They have money, they have Twitter, and they know how to use them. The rest of us had better stand back.


physically challenged

(late 1980’s | bureaucratese? | “handicapped,” “disabled”)

Always understood to be a euphemism, and soon lampooned, this phrase never quite made its way into general use. It was used in the Democratic Party platform in 1980 as a substitute for “handicapped,” becoming the latest in a line of euphemisms for “crippled” or “unable to move like most people.” “Handicapped,” current in bureaucratese by the 1920’s, had finally conquered by the 1970’s, with “disabled” (less euphemistic) being the primary alternative. “Physically challenged” and “differently abled” came along in the 1980’s, and for a while it looked like “physically challenged” might muster the votes to take over the top spot. But it hasn’t happened; “disabled” and “handicapped” remain more common in everyday speech. “Special needs” (adjective) has since tossed its hat into the ring but seems to be applied mainly to children and students.

Perhaps the problem was that “physically challenged” lent itself so readily to parody, and we all had fun with it in the late eighties and early nineties. Everybody remembers “vertically challenged” (“short”), but there were others. “Temporally challenged” (always late). “Verbally challenged” (bad with words). “Follicly challenged” (bald). “Pigmentationally challenged” (albino). For a few years there, if you were caught in any sort of inferiority, you could grin ruefully and say, “I’m ___-challenged” and draw a laugh. The phrase got caught up in the political correctness backlash — when the traditionally privileged got mad because the government dared to help anyone else — and ridicule was a primary weapon. The most recent crop of euphemisms made the easiest targets, which left room for the old euphemisms to retain their supremacy.

I guess the idea was that “handicapped” made you sound too passive or too much like a victim, whereas “physically challenged” made it sound like you could overcome your obstacles with grit and determination. There may have been a submerged battle over who should pick names for minority or disadvantaged groups. For a while, there was a grumpy consensus that we should call such groups what they wished to be called, with the understanding that it might change over time. It seemed the least we could do. That worked fairly well for “gay” or “African-American,” maybe even “hearing-impaired,” but not everybody in a wheelchair wanted to be known as challenged, much less differently abled. And plenty of people, mostly PC-bashers, thought we should keep the good old euphemisms even if they had accreted some unpleasant connotations. And so “physically challenged” never quite lived up to its potential.

pick your battles

(late 1990’s | therapese? | “pick your spots,” “don’t blow this out of proportion,” “know when to quit,” “don’t sweat the small stuff”)

This phrase used to turn up predominantly in two kinds of books: childrearing advice and advice to minorities (including women) trying to get ahead in business. Your boss and your toddler have about equal power, or equal ability to insist on their own way, and it’s unwise to spend too much time resisting or arguing. The expression now is used more generally, but most of the hits yielded by Google Books in the eighties came from books for parents. “Choose your battles” is a common variant. The practice of following either phrase with “carefully” or “wisely” is still common, but not as much as in the good old days.

A number of reference web sites assert that “pick your battles” is an old proverb or dictum, but Google Books shows that the phrase barely existed before 1980. “Origin unknown” shadows it in all the on-line dictionaries. A few adventurous souls seek its source in military strategy (here’s an example), but Sun-Tzu never seems to have said anything that translates as “pick your battles,” although some of his principles are obviously related. I can’t find any evidence that a general or strategist originated the expression. Several sites attribute it to Dale Carnegie, but I’m dubious. I haven’t seen a citation anywhere, and although it is used on a couple of sites run by the Dale Carnegie people, they never claim that he used (much less originated) the phrase. The web being the echo chamber that it is, one site could have screwed up the quotation and others just copied without checking. For the record, here is the complete quotation: “Any fool can criticize, complain, condemn, and most fools do. Picking your battles is impressive and fighting them fairly is essential.” BrainyQuote.com (reliability unknown) lists the first sentence but not the second. I find “picking your battles is impressive” a little cryptic and clumsy, not typical of Carnegie’s style, although the sentiments are plausible. I’d love to find a trail directly back to Carnegie; as influential as he was, he’d make a great origin point, but without better evidence, I don’t buy it. Likewise with the Washington Post review of a biography of Edward R. Murrow (December 2, 1988), in which the line “You have to choose your battles” is attributed to Murrow, but it isn’t clear when he delivered it.

The expression means simply “save your energy for important matters,” or, less often, “intervene only where you’ll be most effective.” Or, more simply, “there just isn’t enough time to do everything.” Sometimes the emphasis falls more on saving time and energy; sometimes more on fighting hard for what’s important to you. You let most of the crap go by and try to avoid wasting your strength, or using up your reservoir of goodwill.
