
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: children

latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the 19th century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “’emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail these kids today goes back a long time, too long and steady to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids don’t generally do worse than average; such studies share the fate of all research that calls easy explanations into question. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not want to relate such feelings to their parents, knowing that expressing doubt or anxiety will disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


empty nest

(1980’s | therapese | “the house feels so empty”)

This is one of those effortless phrases. The first example I found in Google Books dates from 1968; by the late 1970’s it was turning up in the mainstream press now and then, and everyone seemed to get it right away. At that early date, it still required quotation marks and a brief gloss, but little time elapsed before the expression made itself at home. It had well and truly arrived by the time a sitcom of that title debuted in 1988, spun off from The Golden Girls. “Empty nest syndrome,” an early elaboration, is the most common use of “empty nest” in adjective form; “period,” “phase,” and “blues” are other possibilities. As noun or adjective, it retains an innocent, “literal” quality — of course, the phrase is not literal at all, but its evocation of pure-hearted little birdies seems to shield it from irreverent wordplay. Even after thirty years, the phrase has not developed much of an ironic life, and it is not often used to refer to anything other than a home (or family) from which the last resident child has departed. “Empty nest” does have unlooked-for complexity when you take it apart. The first half is literally false — the nest isn’t empty because the parents are still there. The phrase as a whole requires knowledge of how birds bring up their young, sheltering them until they reach maturity, then sending them on their way.

The semantics of “empty nest” may tickle the analytical brain, but the concept appeals to the emotions, and it soon found a home in the long-running debate between parents and grown children over whether it’s really a good idea for the kids to move back in rent-free after college. The kids are all for it; parents are much more divided on the question. In my own case, the model was the great economist or perhaps sociologist Thorstein Veblen, who returned to his parents’ farm after taking a Ph.D. because he couldn’t find work, and filled the time with reading and long conversations about society and politics with his father. That sounded pretty good to me, but Dad saw disadvantages to the scheme and suggested graduate school instead, which ultimately got me out the door for good.

Not all parents are unhappy at the thought of their children moving back in. Some parents get all broken up when the last child leaves the house, and they are the most vulnerable to later irredentism on the part of their down-and-out offspring. Other parents can’t wait to see the back of their kids and have looked forward to the empty nest for years. I haven’t done a study, but I doubt such empty nesters (is it my imagination, or does that term imply a certain affluence?) relish the prospect of having their uncouth twenty-something kids cluttering the living room. This antidote to the empty nest is now known as “boomerang kid,” a term which arose within the last thirty years. By the way, that news article we’ve all read about how unprecedented numbers of college graduates are moving back in with Mom and Dad has been a staple at least since 1980. It’s a wonder anyone under fifty lives on their own.

It is less true now, but in the olden days empty nest syndrome was primarily associated with women, a rough complement to the midlife crisis for men. True, mothers nostalgic for having surly kids in the house didn’t usually buy sports cars or cheat on their husbands, but both middle-age traumas mark a troubled transition to a later phase of adulthood. How can you tell “empty nest syndrome” was a well-established concept by 1985? By that time a whole new branch of the advice-for-the-lovelorn industry had already sprung up, especially in women’s magazines, soothing unhappy mothers with an endless stream of counsel and reassurance.


special needs

(1980’s | therapese | “handicapped,” “disabled”)

Presumably descended from the already widespread phrases “special education” and “Special Olympics.” The crucial change in recent years has to do with part of speech; “special needs” has gone from noun-adjective phrase to unhyphenated compound adjective — not that the old formulation has disappeared. The compound adjective started getting tossed around in the eighties. Before it was applied wholesale to students, it went with orphans and foster children. As one commentator put it in 1984, the old word for “special needs” was “unadoptable.” (Another was “problem,” as in “problem child.”) Now it can apply even to pets. “Special needs” come in many forms, from familiar physical handicaps to mental or emotional instabilities of various kinds, or maybe your kid is just slow (excuse me, has a developmental disability). It has become standard to talk about special needs kids, or the institutions that serve them — classes, programs, transportation — or the group that they are part of; “special needs community” is a common expression now, and it wasn’t twenty years ago. When you’re talking about children, “special needs” refers to disorders of individuals; when it is used to talk about the elderly or anyone else, it normally encompasses conditions common to most members of the group.

That distinction is interesting, and to see why we’ll have to go back to the noun-adjective construction, which has been available for a long time. Kids generally do not claim special-needs status for themselves; there are plenty of people anxious to claim it for them. But other kinds of special needs are advertised by the group they belong to. Take a phrase like “special needs of the oil industry.” In 1975, this phrase could easily have been used (in fact, it was) to emphasize not the burdens fossil-fuel barons labored under but the privileges that their circumstances entitled them to. It was the sort of thing a lobbyist or legislator might remark upon just before pushing through a big tax break. You didn’t have to be underprivileged (does anyone use that word any more? — it was all the rage back in the seventies) to have special needs. And you don’t now. But we are much more inclined to hear it that way thanks to the last thirty years’ worth of education policy. Before 1985 or so, “special needs” meant “I’m better than you” rather than “I’m worse off than you.”

What does “special” mean, anyway? When it doesn’t mean “specific or distinct” (as it did in the Middle Ages and the Renaissance) or “extraordinary,” as it did then and still does, it means “unique,” a much more recent definition dinned into us by pop psychology. When I was a kid, this use of “special” was common, but it had grown up only in the previous couple of decades. “I am special” came to mean “I am unique,” with the corollary that uniqueness entitled you to respect. It was a word used by eager kindergarten teachers to reassure children that they were valued. “Special needs” doesn’t rely on that definition, though there is a clear echo in parents’ insistence that each special needs child is unique (and adorable, and so forth). But lots of kids may have the same, or very similar, maladies, so that they can be grouped together for purposes of education or therapy. “Special needs” doesn’t have to refer to extreme or bizarre conditions; almost any kid with a problem may qualify if their parents are persistent enough, and some of ’em are, because special needs is where the money is.

The phrase seems more like a euphemism than anything else, a way of coating disabilities — mild or severe — with kindergarten cheer. Language so used is ripe for parody, and “special,” which for centuries had a generally favorable connotation, has become an insult. Uttered with a smirk, it means “substandard,” and every kid knows it, just as they understand that students with special needs have something wrong with them. Yet the expression has hung onto a palliative quality in spite of all the currents running the other way.


model

(1990’s | therapese?, academese (education)? | “serve as a model of or for,” “exemplify”)

Here is a verb that has turned right around in the last thirty years. Back then, when applied to conduct, “to model” meant “follow another’s example.” It was normally used with “on,” or maybe “after,” as in children modeling themselves on a popular athlete, a rock star, who knows? maybe even a parent or teacher. And now? Now it means “set an example for another.” Those responsible for raising or schooling children must do things like “model appropriate behavior,” so their charges will see their way clear to becoming civilized. When did the change occur, and why? The first example I found on LexisNexis lurked in a review of several children’s books at the beginning of 1988, but it doesn’t seem to have become commonplace for at least another decade. My recollection, which I have learned not to trust very much, is that I was hearing it by the late nineties at the latest, but I’d be hard-pressed to say when it became customary. Neither is it clear to me where the word arose, although therapese seems the most likely answer. But the verb was also used in its new sense early and often in the ed biz (as the immortal (so far) Tom Lehrer called it), so the educators may have a claim to ownership as well.

The shift in meaning looks larger than it is. “Model” has been used for centuries as a noun or adjective, generally denoting a pattern or example worthy of emulation or copying, in general or in particular. The model citizen or the model of bravery or generosity, to say nothing of the artist’s model, have an ample pedigree. But Noah Webster’s dictionary gives only one definition of “model” as a verb, which encompasses the notion of following a pattern described in the first paragraph. (Artist’s or fashion models of the day “posed,” one supposes.) Scientists and economists have long used the term in a way that seems analogous to today’s meaning: “create a model of.” The phrase “role model” came along in the fifties, according to Random House, and that phrase led to a veritable gamut of post-Freudian psychological usages of “model.” Role models do not generally embody one specific quality but are thought to be worth studying across the board. Instead of emulating Washington for this or Socrates for that, we started concentrating on finding just one all-around good person to emulate. It’s hard enough just finding one pair of coattails to ride.

And why did the word change meanings? When we think of “modeling” good behavior, we think primarily of adults doing it for children. When we thought of “modeling” one’s acts after others’ examples, we thought primarily of children doing it with reference to adults. The subject-object switcheroo goes along with a cultural shift in demands on parents. In the old days, kids had to buckle down and learn to do what was expected of them in public. Responsible adults offered guidance and were expected to help, or at least not retard the process, but it was the kid’s responsibility to work out ways to control himself and make an effort to comply with social conventions. As we make ourselves at home in the twenty-first century, parents are expected to do more and more of what used to be regarded as the kids’ work. The older generation must lay everything out so plainly that no child could possibly misunderstand — willfully or otherwise — the rules they are expected to follow. The grown-ups have to come up with a way to train the kids that doesn’t require them to exert any effort or risk failure. Deprived of any stake in their own improvement, the young’uns will infallibly turn into enviable adults, right? Of course, most kids turn out o.k. if their parents do even a middling job raising them, but it seems to me that conventional wisdom — and a parade of parenting manuals — demands more of parents, and less of children, than it did in the old days.

Actually, I was channeling my sister in the previous paragraph, who was ruminating recently on changes in our child-rearing practices, and who is well-qualified in general to discuss these kids today. I may have misrepresented her views, and I alone am responsible for any errors of fact or interpretation.


dysfunctional family

(early 1990’s | therapese | “messed-up family,” “bad home environment”)

No prizes for guessing the origins of this popular phrase. Could there be a more typical example of therapese? I ought to investigate the uses of the word “dysfunction” and its derivatives in psychology; it occurs everywhere. (I’ve covered one instance already.) From Google Books, I deduce that “dysfunctional family” arose in the literature around 1970, at first primarily as a compound adjective. It was not unusual to see phrases like “dysfunctional family structure or pattern or system.” But the temptation to use it on its own soon grew irresistible. Psychotherapy became more widespread as the passionate sixties dwindled into the neurotic seventies, and it became fashionable to blame transgressions on one’s upbringing, as if the whole country were singing along with the Jets in “Gee, Officer Krupke.” By the end of the eighties, “dysfunctional family” was sprouting through all the usual therapese conduits: art critics, advice columnists, clergy, lawyers concocting a defense, and so forth. By 1995 it was paralyzingly common.

What is a dysfunctional family? One headed by a person or persons unable to hold a home together, of course, or more generally one that runs on dishonesty or intimidation or otherwise instills bad relationship habits into hapless children and dooms them to do the same damn things when they grow up. Therapist Barbara Cottman Becnel in 1989: “In a dysfunctional family, you’re generally taught don’t talk, don’t feel and don’t trust.” But these noble core values may effloresce in so many different ways. As Tolstoy would have said, had he thought of it, functional families are all alike; every dysfunctional family is dysfunctional in its own way.

The dysfunctional family is, after all, the root of all evil — a handy, ever-present villain that works for all non-orphans (i.e., nearly everybody). We are all more or less damaged by our upbringers. Whatever’s wrong with you, chances are your parents are to blame somehow. This became a popular sentiment, except among parents, who felt compelled to point out that lots of people could claim credit for messing up the new crop of kids. Homes, after all, are not impermeable fortresses. In an earlier century, we noticed that larger social trends and tendencies contributed to individual tribulations, and we took measures at the federal and state levels to discourage cultural forces that increased hardship and misery. Now we insist furiously that adults who haven’t turned out well are responsible for their own individual moral choices. Since “blame the individual” took precedence over “blame society,” families have become a natural target, even though, implicitly, this explanation has the effect of once again displacing blame from individuals to their environment. The contradiction between blaming the families and holding the products of their failures absolutely responsible for their own misdeeds never seemed to bother anyone. Actually, it was a nice two-for-one, having another set of people to find guilty of every single example of anti-social behavior. Plenty of upstanding Americans wanted to see any and all kinds of crime severely punished, even though they might concede, if cornered, that maybe it wasn’t entirely the individual’s fault. Blaming society was never satisfying because it was too big and abstract, but particular individuals and families are small enough targets to allow respectable types to reap some reward from reviling them.

The phrase probably would have begun to diffuse during the 1980’s no matter what. But several eighties trends (and I don’t mean Madonna or leg warmers or Brat Pack movies) gave us lots of reasons by the end of the decade to reflect on the causes of what you might call anti-social behavior. We had AIDS, we had crack, we had high murder rates, even a stock market crash. We needed an explanation for all the messed-up people out there, and pundits and bloviators lost no time in declaring that dysfunctional families had done the damage. There go those larger social forces again, hijacking the discussion. The concept of the dysfunctional family is convenient because it helps us focus on the individual wrongdoer and ignore what’s going on in the greater world. Words can be very useful that way.


helicopter parent

(2000’s | academese (education) | “overprotective parent,” “one who keeps kids on a short leash,” “nervous nellie”)

I didn’t encounter this expression until fairly recently, so I was surprised to find a number of instances from the 1990’s on LexisNexis. It’s usually credited to Jim Fay and Foster Cline, a teacher and psychiatrist, creators of a philosophy of raising children known as Love and Logic. Fay used the word in a book title, “Helicopters, Drill Sergeants, and Consultants” (1994); the three categories, as you might have guessed, are three types of parents. That was not the first recorded use by any means; “helicopter parent” turned up in Newsweek in 1991 as an example of teachers’ slang. In 2004, Fay noted that he had started writing about “helicopter parents” thirty years earlier, although I haven’t found any instances of the term before 1990 or so. We certainly owe the popularity of the expression, if not its origin, to Fay and Cline. When used in the 1990’s, it was invariably glossed. A helicopter parent hovers, supervises every corner of the child’s existence, and swoops down to rescue the child from difficulty, resulting in young adults who can’t make decisions or deal with adversity. The phrase was much more common by 2005, but a definition was generally offered even then. Responsible writers didn’t feel they could just drop it casually without cluing in readers.

If Fay and Cline didn’t originate the term, another educator did; at first it was used mainly in educational contexts, and generally still is. The term was most commonly applied in its early life to parents of college students, characteristically used by college administrators to describe parents who refused to let their children exercise a little responsibility. But it wasn’t long before it came into more general use to talk about all parents with school-age children — old enough to leave the house and get into trouble. I learned the term from The Simpsons (“Father Knows Worst,” 2009), but it was well and truly current by then; I was behind the curve, as usual. It has spawned a few competitors: I’ve seen “lawnmower parent” (removes obstacles from child’s path) and even “Humvee parent” (defined as “ready to roll over rough terrain and ‘rescue’ my child”). These three terms all refer to fundamentally the same thing, excessive or harmful interference in the child’s life. There are plenty of ways for parents to screw up their kids, and it shouldn’t be too hard to come up with more. What we really need is cute expressions for negligent parents who don’t do enough for their kids. “Absentee parent” is too obvious. “Three-toed parent”? “Blue moon parent”? Help me out here.

We are entitled to wonder how much such oppressive childrearing is due to concern for the children and how much should be chalked up to parents’ amour-propre, or fear of the judgment of their merciless peers. In some communities, parents do seem to be more rivals than anything else. The more you do for your kid, the better parent you are. So do your kid’s homework, take his side against the teacher, pull strings, bail her out when she gets in a jam. Don’t let your kids learn from their mistakes, because making mistakes is in the first place a reflection on you. Self-reliance is a very old strain in the American character (there was an old fellow named Emerson . . . ), but some of us, however indomitable we may be on our own behalf, can’t seem to pass it on to our offspring.


inner child

(1990’s | therapese | “the child in you,” “inner self,” “the repressed”)

This expression has a bad case of bipolar disorder, or split personality, or schizophrenia, or dementia praecox, or something. One of many concepts sprouted in pop psychology and transplanted thence to popular culture over the last five fruitful decades, it is strikingly bifurcated in popular use. Is the inner child full of wonder, the bright, joyous, spontaneous part of us that we want to liberate? Or is it the part that got picked on, wounded, broken, the traumatized piece of us that holds us back and needs to be coaxed out and treated? The latter sense seems to have predominated when the phrase arose, but today I’m not so sure it’s more common. Charles Whitfield, author of Healing the Child Within (1987), used this term (and several others, which he regarded as synonyms) in the sunnier sense, but he also talks about the scars of mistreatment that the inner child bears.

Most of us, I would venture to say, have at least a little of both inside us. We all felt frustration and rage as children and had to learn to handle it when things didn’t go our way — repressed anger is the price of adulthood. Now we are in Freudian territory. You force down the bad stuff in order to survive, but it lingers and comes back to bite you later. Jung, on the other hand, says, “In every adult there lurks a child — an eternal child [“ewiges Kind” in German], something that is always becoming, is never completed, and calls for unceasing care, attention, and education. That is the part of the personality which wants to develop and become whole.” Jung takes a more rigorous view than those therapists who counsel us to embrace our inner Pollyanna, but he does emphasize that what’s important about the inner child is not that it suffered, but that it can develop in healthy and helpful ways. A number of therapists trace the whole concept back to Jung, and it’s no surprise that they would want to attach a controversial therapeutic approach to so august a name. Still, it’s not unreasonable to trace the fault line in the definition of “inner child” to disagreements between Freud’s and Jung’s interpreters. Wouldn’t be the first time.

Those who object to all this talk of an inner child fear that the concept seduces us into discarding all the hard lessons we learned on the way to adulthood in favor of coddling our most selfish and anti-social impulses. Defenders of inner child therapy recognize the dangers of such self-indulgence: “Popular visibility and the resulting media over-simplifications have led to the idea that Inner Child work is about self-absorbed people carrying around teddy bears and whining about their parents. It is undoubtedly true that there are many people ‘in Recovery’ who are busy blaming their parents and their childhoods for the failures of their adult lives.” When the inner child spends too much time outside, there’s trouble, just as there’s no doubt that if you let children have their way all the time, thereby sparing them unhappiness and its consequent trauma, they won’t make better adults. But the idea that our childhood experiences have a decisive effect on our adult behavior, or that we’re happier when we strip away some of the jadedness and dreary constraint we accrete while growing up, seems so well-established as to be beyond cavil — one of the few central human truths to emerge unscathed from the twentieth century. Thus both notions of the inner child will have value at least in some cases.

The phrase became widespread among therapists in the 1980’s and went on swiftly from there. Tom Hanks’s Big (1988) came along at the right time and may have given it a boost. Like many ideas drawn from pop psychology, the “inner child” has always been ripe for adaptation and parody, and inner entities have become thick on the ground, like “inner critic.” Here’s a short list (scroll down), but there’s always room for more.
