
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: generation x

latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the nineteenth century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “’emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail “these kids today” goes back a long way, and it has persisted too long and too steadily to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids don’t generally do worse than average, but those studies share the fate of all research that calls easy explanations into question: they get ignored. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not have wanted to relate such feelings to their parents, knowing that expressing doubt or anxiety would disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


blended family

(1980’s | therapese | “stepfamily”)

Contested terrain semantically, as in other, more obvious, ways. Start with the definition. Nowadays, most people would probably endorse a relatively loose definition of “blended family”: any family formed when an adult with one or more children takes up with a different adult, who may or may not have children. If you’re a purist, you might require that both adults have at least one child. In 1983, a writer defined it thus: “pop-psychology euphemism for members of two broken families living under the same roof, a mixture of step-parents, step-children and step-siblings.” Ten years before that, a psychology textbook defined it as a “family consisting of a husband and a wife, the children of either or both from a previous marriage, and children of the present marriage.” The new spouses had to have kids together, not just with former partners. The extra distinctions may have been made possible by a wider panoply of related terms than we can remember now. A surprisingly large amount of vocabulary sprang up around such familial configurations; in 1980, the New York Times propounded the following list: “conjugal continuation, second-marriage family, stepfamily, blended family, reconstituted family and metafamily.” (It missed “merged family,” also in use by 1980. “Mixed family” means that the parents are of different race, ethnicity, or religion.) Of these, only “stepfamily” would be familiar to most people in 2017, but Wikipedia distinguishes between stepfamilies (only one adult has a pre-existing kid) and blended families (both adults). According to the OED, “stepfamily” goes back to the 19th century; the earliest citation I found for “blended family” dated from 1964.

Why did “blended family” win out? Probably the usual mixture of euphony and accuracy, or intuitiveness. Most of us understood pretty quickly what it meant the first time we heard it in context, and it sounds good — not too long, not too short, scans nicely. “Second-marriage family” is clunky; “metafamily” is jargony and doesn’t make a whole lot of sense anyway. “Blended family” sounds a lot better than “reconstituted family” (just add water!), you have to admit. The only mystery: why didn’t “merged family” catch on?

We like to think that the quirks and foibles of our own generation are unprecedented, but blended families are hardly new. My father’s father grew up in one after his mother divorced his father and married her second husband. My mother’s mother was the daughter of a second marriage, an old widower and a young wife. Life expectancy was lower then, so remarriages were more often occasioned by death than divorce. Was there a decline in the number of blended families for a generation or two, long enough to forget how common such arrangements used to be? If so, the phenomenon has come roaring back. Somehow, before 1970 or so, we got along without a general term for it. Now we’ll never get rid of this one.

There may have been earlier examples on television, but “The Brady Bunch” was the first show to feature a blended family week after week, thus perhaps making the whole idea seem more wholesome. It is doubtful that the sitcom had much effect in its time, given its poor ratings and reviews, but pop-culture observers agree that it had a long and powerful afterlife among those of a certain age (mine), for whom the Brady Bunch is part of a comforting nostalgic penumbra (accent on “numb”). Several shows about different varieties of blended family have succeeded Mike and Carol and Sam* and Alice: “Full House,” “Step by Step,” “Modern Family.” The Bradys anticipated a trend; their descendants follow along behind, trying to catch up to everyday life. The Stepfamily Foundation started life in 1977; support groups and talks at the local library aimed at blended families seem to have arisen in the eighties, when the requisite self-help books also began to appear. New terms must surely arise to reflect new conditions, but the rule is that only one or two out of a larger number will make it to the next generation and a shot at immortality.

* The butcher. Remember?


soul patch

(1990’s | “jazz dab”)

We all know what this is, right? It is the smallest recognized denomination of facial hair, a tuft (or wisp, or dot) immediately under the middle of the lower lip. If it extends down to the chin, it’s a chin stripe. If it extends beyond the chin and comes to a soft point, like a paintbrush, it’s an imperial. Some men may have dense hair all along the lower lip, but it only counts as a soul patch if it’s centered — they’re never more than an inch wide, usually less. And if you grow whiskers anywhere else (except possibly the upper lip), it’s not a soul patch any more; it’s just part of another configuration. Here’s a reasonably comprehensive chart that illustrates different categories of facial hair.

As for the history of this particular beard style, there’s a firm on-line consensus that it was originally popularized by Dizzy Gillespie in the 1950’s and caught on among beatniks like Maynard G. Krebs, who wasn’t a real person, and who wore something closer to a goatee most of the time, but never mind. Full beards came back in style in hippie times, but in the 1990’s disaffected Gen X’ers (was there any other kind?) took it up again. Not until the nineties was it called a “soul patch,” however. Gillespie seems to have called it a “jazz dab,” and I’ve also seen “mouche” cited as an older term for the same thing. Neither is familiar to me, but I (a disaffected Gen X’er) came along too late. On-line you also see “flavor saver,” a newer term, though it could apply just as well to a mustache. I’ve never heard anyone actually utter that phrase, but I’ll concede that it’s a time-honored function of facial hair to preserve bits of breakfast where they can embarrass us later on. Teenage boys trying to grow facial hair are often told they have dirt on their faces; in such cases “soil patch” might be appropriate.

It is not obvious why we should use the phrase “soul patch” to denote the typewriter eraser brush under the lip. It probably has something to do with “soul” in the musical sense (Dizzy Gillespie, remember?), and it does sound better than “jazz patch.” I uncovered an oddity about the phrase in the course of researching it: while many on-line sources chronicle the history of this tonsorial arrangement, several with reference to other terms mentioned above, hardly anyone speculates on the origin of the phrase itself. My sources agree that it did not exist in print before 1990, and LexisNexis shows that it was in use by 2000 and generally appeared without a gloss. Usually on-line commentators engage freely in etymological speculation, even if they aren’t any good at it. When I did “go commando” a few weeks ago, just about every page Google coughed up yielded some unsubstantiated or insubstantial theory explaining how the phrase arose. But no site accessible from the first ten pages of Google’s search results offers even the most casual hypothesis for the origin of this expression.

Soul patches, like the hipsters they often adorn, frequently find themselves objects of ridicule. Even defenders admit that only a select few look good with a soul patch, while others abhor middle-aged men who wear them in hopes of passing for young and hip. I don’t like them myself, but I’ve worn a full beard most of my adult life on the theory that trimming it occasionally takes much less work than shaving every day. The soul patch may require less sculpting than the chinstrap or many other styles, but you still have to shave everything else all the time to make it work. If it doesn’t improve your appearance or save labor, why do it?


don’t even think about it

(1990’s | journalese (politics) | “let’s nip this in the bud,” “don’t be a fool,” “forget it,” “watch it!”)

I’m often left wondering how reliable LexisNexis really is, and the question is particularly pertinent to this week’s inquiry. It appears to be remarkably reliable about transcribing text. Now and then you run across an OCR error, but such errors are pretty minor and pretty infrequent. The dating is also reliable (unlike, say, Google Books, my other on-line primary source). The chief limitation probably arises from the selection — the publications indexed and how representative they prove to be. The database covers most major newspapers and a sprinkling of minor ones, with a bias toward print and the establishment business press despite a pretty fair number of blogs and other odds and ends. On the other hand, that sort of publication still has the most reach and influence, so the bias toward wealth and power may be just what is needed to understand the diffusion, or transformation, of English expressions.

Take the exact phrase “don’t even think about it.” It comes in two moods, the indicative and the imperative. In 1980, and even 1990, the indicative was much more common, saith LexisNexis. It was most frequently spotted in athletese, generally preceded by “I.” It doesn’t matter to me, because . . . it’s no big deal, or it’s ancient history, or I’ve gotten used to it, or I have more important things to worry about; the phrase conveys resilience or bravado, perhaps defiance. Athletes and others still use the phrase this way, but — again, according to LexisNexis — it has lost ground to the imperative use, which has increased dramatically since 1980 and now occurs much more often than the indicative. In the process, the imperative has become a fixed phrase and has probably achieved, along the way, the coveted status of cliché.

So how reliable is LexisNexis? How far can we trust it to deliver a fair snapshot of usage patterns across the entire population? I do sense that the imperative is used much more often than it was, which doesn’t mean that the indicative is used less — it’s not a zero-sum game. It may be a rising line crossing a steady one, but there’s no doubt the imperative is getting more imperative. The first hit in LexisNexis dates from 1980, and for the first few years the phrase appeared almost always in political contexts. The New York Times noted in 1987 that “Don’t even think about it” was popular on signs drivers put in their cars to deter thieves, an elaboration on “no radio.” It sounds like an urban legend, but I pass it along anyway.

In the imperative, “don’t even think about it” can be anything from finger-wagging to a blunt threat, but it is always a warning, a reminder that what you are about to do is a mistake. There are plenty of possible reasons, just as there are plenty of reasons you might have dismissed something from your mind. Because . . . it’s illegal, it won’t give you what you want, it will lead you into temptation, it ain’t happening . . . whatever. In this sense, it’s pretty close to another newish expression, “don’t go there.” (I’m surprised I haven’t done this one yet. Oops, now I have.)

When I was a kid we said, “Don’t EVEN . . .” Here’s an example: At a baseball game, a fan yells to the batter, “Don’t EVEN strike out, Bob!” It meant “you of all people must not screw this up,” or, more simply, “This would be a particularly bad time to screw up.” “Don’t even think about it” doesn’t have the same scansion, of course, and there’s probably no connection. But I’d like to think my generation influenced the language even in our collective childhood. Children of the seventies rule!
