Lex maniac

Investigating changes in American English vocabulary over the last 50 years

Tag Archives: psychology

dad joke

(2000’s | “corny joke,” “dumb joke”)

Here are a few adjectives that often appear in discussions of these oft-lamented attempts at humor: cheesy, corny, dorky, goofy, nerdy, silly. If you ask me, the dad joke and the corny joke are all but co-extensive.

As I tried to establish the characteristics of the dad joke, I wondered if it could be defined in any rigorous way. Now that I have looked around on-line, I’d say they do have a general profile, although by the time you assemble all the definitions, you end up covering a lot of humor that most people might not consider dad jokes. Let’s run down a few features:

— they are intended to induce eye rolls, groans, and grimaces rather than laughs

— they are inoffensive, steering clear of adult themes, bad language, or charged subjects

— they usually involve a pun of some kind, although they can be more complicated than a rudimentary play on words

— they are repeated often; fathers who tell dad jokes generally have a few specialties that they trot out again and again

— they are old and lame.

Now if you encounter a joke that meets all of these criteria, you can be pretty sure you have a dad joke. But you can also find putative dad jokes that match only one or two. It surely doesn’t help that they are all over the internet; collections of dad jokes abound on-line, and the sheer number ensures a certain vagueness of definition. Another ambiguity involves the question of whether dad jokes do or don’t directly involve parent-child relationships. Some say they are ways of getting points across to kids; others that they’re simply ways of bonding with your kids by making them wince. Finally, I may be jumping the gun, but I believe we have reached the point when dad jokes are no longer the exclusive property of fathers (or men acting in loco parentis). They are now a category of humor as much as a topic of conversation in family therapy.

One source revels in a classic dad joke that comes at you from a different direction:

Q. Why did the plane crash into the mountain?

A. Because the pilot was a loaf of bread.

You can call that a dad joke if you want to, but I say it’s a dada joke, in which the punch line has no discernible connection to the set-up. The relation between Dada and ADD bears further investigation.

A bit of trolling through LexisNexis shows that most early instances of “dad joke,” which started to appear in quantity after 1990, occurred in the Australian press. Yet Wikipedia credits an article published in 1987 in the Gettysburg Times as an early and influential instance. As far as I can tell, the phrase didn’t really get going in the U.S. until after 2000, but it has earned its keep since then. A related term, “dad humor,” came along a little later and remains less common. It seems to be useful more as a description of a style or school of wit (if that’s the word I want); mostly we experience dad jokes as a set of unrelated riddles and puns rather than as a body of work.

Some research suggests that dad jokes are beneficial for children (here’s one recent report). They do seem to be a means of softening the paterfamilial personality, allowing even a stern father to show a playful side. The predictability of dad jokes may make them reassuring — if also grating — and while unpredictability plays an important role in making people laugh, knowing what’s coming can make the joke all the funnier. One of my father’s favorite jokes put me in stitches whenever I heard it, and I knew it by heart. Didn’t matter. The punch line slew me every time.

As lovely Liz from Queens would be the first to tell you, Lex Maniac is no stranger to the art of the dad joke. Here’s one that I haven’t seen anthologized anywhere: Hey, your socks must be the finest in the world. They’re unmatched!


impostor syndrome

(1990’s | academese (psychology) | “inferiority complex,” “false pretenses,” “insecurity,” “gnawing self-doubt”)

If you’re not sure what this expression means, you’re just a Google search away from finding out. Many, many web sites, from clinical to pop with many gradations in between, will happily explain the “impostor phenomenon,” as it was first known. (There is also some on-line disagreement over whether the penultimate letter of “impostor” is “o” or “e,” but I view “imposter” as an impostor.) Impostor syndrome is a state of self-doubt that causes accomplished people to believe that they are not truly responsible for their own achievements. In extreme cases, sufferers may posit that they have a mysterious other self that reaches goals and meets deadlines, because the “real” self is too incompetent to complete the work that one has demonstrably done. More often it is simply a nagging sense of inadequacy accompanied by a fear that others will find out, as Simon and Garfunkel put it, that one is fakin’ it.

The fear is essential. Victims of impostor syndrome sense that they are pulling the wool over others’ eyes, engaging in fraud and deception. Con men do that all the time, but deliberately and with pleasure. Take “Congressman” George Santos, whose brazen lies got him into office and showed arrant contempt for voters and whose brazen defenses of his conduct show even more. Whatever variety of megalomania or sociopathy Santos has, it ain’t impostor syndrome. There must be qualms and compunctions, an unshakeable feeling that one is deceiving others even though one is actually deceiving only oneself.

The term was invented in a 1978 paper by Drs. Suzanne Imes and Pauline Rose Clance that attempted to explain why certain women, whose job performance looked exemplary from the outside, could not accept evidence of their own ability. The psychologists posited that many women, trained from an early age to view themselves with a belittling eye and often pigeonholed as less bright, have difficulty breaking free of habitual self-doubt. The phrase has retained a strong connection with women and minorities, and with paid employment. The syndrome does occur in men, but it does not occur often in people who are habitually sanguine or self-confident. Like many psychological concepts, it has an acute side — most people feel that way at least once in a while — and a chronic side, which is much more crippling because it seems inescapable.

“Impostor phenomenon” comes directly from the work of Clance and Imes and therefore belongs to the family of expressions owed to a single person or small group working as a unit (list here), and in particular to the sub-species academicus. As lovely Liz from Queens pointed out, when the perspective shifts from that of an observer describing a set of symptoms to that of individual sufferers, “syndrome” becomes a natural complement; it is far more common in today’s usage.

While impostor syndrome is presented as an infirmity or problem to be corrected, there are times when it seems reasonable. In normal life, we are bound to confront at least occasionally situations where we don’t know how to proceed. When we make it through unscathed, it’s easy to think we lucked out or benefited from a fluke. (Learning and remembering lessons about what to do during comparable crises should make one less prone to impostor syndrome the next time.) If it partakes of humility and goes along with a willingness to learn, it may be salutary. When it becomes persistent and delusional, it is pernicious.


in one’s DNA

(1990’s | journalese (arts)? | “in one’s blood,” “second nature”)

Genetic code, shmenetic code. “In my DNA,” as direct a reference as you can make to heredity and genetics, now apparently owes more to Freud than to Watson, Crick, and Franklin. In using the phrase today, we generally evoke experiences rooted in early childhood, experiences repeated so often, whether by choice or under compulsion, that for better or worse we never escape their shadow. Most often for better, it seems to me; “in my DNA” tends to have a favorable cast, celebration more than excuse. Yet the implication often remains that a certain habit, attitude, or aptitude runs in the family somehow, as encapsulated in a recent example from a BBC television program, Bargain Hunt. The question was “So it’s in your DNA, isn’t it?” The answer: “My mum, she’s an artist and she taught me how to clean paintings, restore the gilding and also work on a bit of porcelain and glass.” Both inherited and learned, you see.

The two directions in which this expression can go may be illustrated with reference to two previous entries: “hard-wired” and “male bonding.” The former pretends to the same kind of genetic inevitability but actually sneaks in a good deal of formative experience under that guise, at least when used in the same careless fashion in which “in one’s DNA” is often used. “Male bonding,” on another hand, shows what can happen when we rely too much on that same inevitability — it turns into an excuse for bad behavior.

Miscellaneous usage notes: The phrase soon developed a jocular side, as in references to a shopping gene, drawing on that strong sense of compulsion mentioned above with no true appeal to genetics. While it is often used in allusion to larger patterns of action and world-view, it is not unusual to suggest that a certain song, book, etc. is taking up room in one’s double helices — a dubious proposition by any standard. Finally, the phrase has an active life in the negative, used to indicate emphatically that one would never do such a thing, that it would be unthinkable.

The change from attributes determined by genes to attributes acquired when young and impressionable may be the result of carelessness, or exuberance. The discovery of DNA had a powerful effect on our understanding of our species and entered popular discourse rather more quickly than most difficult science concepts. That was probably inevitable, because the concept has so much explanatory power in areas that affect us all deeply (our predispositions, afflictions, health, life and death); it offers a way to talk about all of that with at least the illusion of precision. It’s easy to pronounce, too, a lot easier than “deoxyribonucleic,” anyway. DNA enjoyed cachet, in other words. Therefore “in my DNA” became a way of emphasizing the depth of one’s attachment to an activity, or place, or song, or lots of other things. This isn’t just any childhood obsession; it’s in my DNA.

Another expression rescued from the long-ago oblivion of the raw list, put there originally through the good offices of lovely Liz from Queens. Verily, her bounty knows no bounds.


no filter

(1990’s | journalese? | “uncensored,” “unedited”; “lack of inhibition,” “no holds barred”)

Where is the line between wholesome honesty and TMI? “No filter,” as an adjective, adverb phrase, or interjection, challenges us to find it. We may like the idea of a politician speaking with no filter, because they are shifty and always hiding behind carefully chosen words — including the firebrands. Politicians must be aware of the destructive potential inherent in saying what they really think, but if you’re addressing a narrow enough audience (e.g., the typical Congressional district), you can go pretty far. But on the internet contempt for decorum leads to worse abuses; it remains to be seen how much we will regret giving the dregs of our society a megaphone. Civilization demands self-restraint. If more and more of us strip it away more and more often, will civilization survive?

The expression promises straight talk straight from the shoulder. Civilization (there it is again) leaves us longing for the unmediated; we find self-control irksome, even those of us in whom it is well-developed. We may well resent contrived barriers to free communication. When the concept of “no filter” is spelled out, the filter is usually specified to lie between mind and larynx — it has to do with speaking. That’s not surprising, because speech offers the possibility of absolute uninhibitedness that writing cannot. When I entered the workforce in the early nineties, you could buy little plaques that read “Make sure brain is engaged before starting mouth” and hang them in your office. The slogan had the virtue of illustrating the point of “no filter”: indiscretion rather than logorrhea. It does not mean rambling free association; it means setting aside the usual caution to say something quite pointed, or blunt. And, more and more often, slanderous.

“No filter” promises no cold calculation, just unbridled talk. But it is perfectly possible to speak with naked candor, or cruelty, for carefully considered reasons. Even the apparently spontaneous can be quite filtered. People like Donald Trump and Elon Musk may want you to think they habitually have no filter — a variation on Nixon’s madman theory? — but even they speak their minds (or ids) selectively. In other words, the Trumps and Musks of the world aren’t being so frank after all; they are working hard to make you hear what they want you to hear.

The metaphorical use of “filter” for “inhibitions” was starting to enter the mainstream in the eighties, but “no filter” didn’t get rolling until the nineties, though some earlier examples may be found. While the rapid growth of the internet propelled the phrase into far more frequent use, originally it probably had more to do with politicians speaking their mind (or pretending to). “No filter” also has a use in the visual world, particularly post-Photoshop, though film cameramen have been using filters for decades. When an actress posts an unretouched, makeup-free photo, for example, she might caption it “no filter.”

We usually think of “no filter” as characteristic of the speaker, producer, etc., rather than the hearer or consumer, whose filters are presumed irrelevant against waves of undammed utterance. It’s also possible to lack a filter on the consuming end, as when nothing prevents kids from reaching certain web sites. We’re most likely to hear in this phrase a simple attribution of self-projection, but it may have a touch more subtlety than that.

This week’s expression is brought to you by lovely Liz from Queens, who suggested it years ago. Stay tuned to see what she comes up with next.


happy place

(2000’s | therapese? | “hog heaven”)

“Happy place” requires a possessive pronoun to work. It is your, or my, or her happy place, where one is most content, most enraptured. (I am not sure about this, but it sounds odd to hear it with a plural subject — “we” don’t have a happy place. To each his/her/its/their own.) It generally begins with a specific activity — playing with dogs or kids, hiking in the woods, yoga, karaoke, managing a balance sheet — that induces the desired state of mind. The happy place is soothing, inviting, beguiling, perhaps stupefying (cf. “zone out”). Once we figure out how to get there, we do it again and again, like rats harvesting pellets in their maze.

Now “happy place” has a long history, with an unusual tendency toward the negative (“not a happy place”). It was a pretty boring, literal-minded expression; you had to be Jacques Derrida to make anything of it. It called attention to a characteristic of a household, company, institution, etc., where residents felt secure without irksome constraints. It almost always took the indefinite article.

Somewhere around 2000, the new version with pronoun took form — there don’t seem to have been many precise pre-1980 equivalents — and “happy place” became the preferred term for that carefree state we covet, a refreshing if temporary escape from the stress and trouble the world imposes on us. In a twist from the old literal use, one’s happy place was not generally understood as a physical location; it was in your head (cf. “in a good place”). In today’s use, it seems to refer more often to an actual spot, but always with the understanding that being there brings about the same feelings, so that the distinction no longer matters. The phrase adorns several on-line shopping sites and a lifestyle blog; it also titles a recent novel by Emily Henry.

To my mind, there’s no question that the idea harks back to childhood, when such blissful interludes seemed slightly more possible. When adults go to their happy places, they are regressing. I find it revealing that the phrase is composed of the sort of fundamental words we associate with very young English speakers. I can’t help but wonder if “Happy Meal” influenced the development of the expression. (As regular readers know, I blame McDonald’s for everything.)

Here we have another instance of an ordinary, entirely unrhetorical string of words, with no inherent distinction or zing, turned into a stock phrase. You guessed it, time for another Lex Maniac list: at the end of the day, be careful out there, has left the building, how cool is that?, if it ain’t broke, don’t fix it, my work here is done, play well with others, what’s your point?. There are a few others that might qualify, but I will refrain.


dynamic (n.)

(1980’s | therapese | “pattern,” “modus operandi,” “web of relationships”)

Used as a noun before 1980, but not often — almost always an adjective back then. The noun is the singular of “dynamics,” already quite familiar by the seventies in psychology. In becoming singular, “dynamic” has simplified itself from multi-faceted means of communication and interaction to a single pathway or mode of action. A dynamic obtains both between two people and in larger contexts, such as politics, corporate governance, or any collection of people defined as a group. (When did “group dynamic” replace “group dynamics”? It never has completely, but LexisNexis suggests that the singular was established by the mid-1980’s.)

As “dynamic(s)” slowly grew, at first in psychological circles, through the sixties and seventies, it maintained a sense of instability caused by the ever-present possibility of change. When you talked about “group, or political, dynamic(s),” you conveyed the sense of a number of people whose relations with each other were fluid and unsettled. The word had power and carried a strong sense that what’s true today may not be true tomorrow, or next week, or next year. Always unpredictable, sometimes momentous, potentially violent. I submit that “dynamic” has been domesticated to something more familiar and ordinary. Now a dynamic is usually a well-worn groove, whether in a one-on-one relationship or within a group of millions of voters for a national candidate. It may describe a new and chancy phenomenon, but most of the time a dynamic is more akin to the same old same old than to a configuration that might change any minute — it is behavior that we have observed and expect to continue.

The shift is the more odd because “dynamic(s)” once went the other way, acknowledging the unpredictability of our relationships and recognizing that fault lines tend to persist. The word brought with it that sense of complexity and instability noted above rather than portraying our ways of dealing with each other as static. The new vocabulary provided a more realistic way of understanding our relationships; in the intervening half-century, that urgency has dissipated, and “dynamic” has lost its kinetic quality in such contexts. Maybe that has more to do with the character of our connections with each other than with the nature of language. If most of us, most of the time, prefer the level, the regular, the predictable to the mutable and novel, that puts pressure on certain words to deform themselves over time — to shed old meanings in favor of new ones, to lose precision, or simply to fossilize.

It hasn’t happened to the other “dyna-” words, arising from the Greek root meaning “power,” derived in turn from the verb for “to be able” (rather like “pouvoir” in French). Dynamo, dynamism, dynamite all retain their force. “Dynamic(s)” in certain contexts still brings that punch to bear. But only this word among all of them has been bent back on itself from its sojourn in pop psychology, with little sign of recovery in sight. To paraphrase lovely Liz from Queens, it makes a word think.


learned helplessness

(1970’s | academese (psychology) | “throwing in the towel,” “apathy,” “defeatism”)

We owe this expression, and the large and popular school of psychology grown up around it, to Dr. Martin Seligman, who sprang it on an unsuspecting world in the 1960’s. (The phrase didn’t really take hold in the mainstream until the 1990’s.) The idea has been nutshelled many times: animals, including people, who learn or are trained to believe that they have no control over what happens to them in a given situation will become passive and endure whatever comes. Even when such victims are actually capable of escaping punishment, they won’t. This insight was soon linked to the study of depression because it provided a simple and powerful understanding of why depressed people, or people in abusive relationships, frequently make little effort to improve their lot. Seligman’s theory is now standard, even essential, to the common understanding of depression. He has enjoyed a long and successful academic career and penetrated the self-help market as well, later offering “learned optimism” as an antidote to depression and hopelessness; in the eponymous book (1991), he declared, “individuals can choose the way they think.” A retrospective paper published in 2016 brings things more or less up to date for those who are interested.

The phrase is sometimes used in a simpler way to refer to situations in which one partner, having taken over an essential duty — cooking dinner, maintaining the property, managing finances, etc. — becomes unable to perform it, forcing the other to confront unfamiliar and urgent tasks. This usage probably has more to do with a failure to learn, a passive process, than with internalizing a certain response after repeated experience, which seems more worthy of the adjective “learned.” It’s important to remember that such one-sided arrangements may result from expedience or even kindly impulses. I’ve also seen the phrase used to refer to the atrophy of skills or memory caused by too much automation — any sort of labor-saving device or technological advance that threatens to make our lives easier.

Still, we often think of “learned helplessness” as imposed on others, part of a malign strategy of subjugation. That isn’t necessary; we can absorb the same lessons from the elements or our own inadequacy. Yet the phrase conjures overbearing spouses (physically, emotionally, or in other ways), helicopter parents, or nanny-state governments, which are said to destroy citizenly initiative. When used to characterize relations between citizens and the government, the phrase points toward a bedrock bone of contention between left and right. Take COVID restrictions, which have sparked more than enough debate for most of us: are we upholding best practices, ever following the science, or numbly accepting massive overreach, babies looking blindly to the overlords for aid and comfort? Where the left sees government correctly relying on scientific consensus and proven expertise, the right sees irrevocable cession of control over personal autonomy. They both have points to make, and they are both abundantly strident in making them. “Learned helplessness” has not made itself as indispensable in political discourse as in psychology, but it has kept a pied-à-terre there since the eighties, when the concept served as a way to explain political apathy; voters who became convinced that they could do nothing to influence government action simply stopped trying.

Lex Maniac has covered a number of expressions demonstrably born of a single person (list here), and now a discernible sub-species has emerged — academicus. My select catalogue of expressions invented by eggheads that have become part of the language: “Cognitive dissonance” (Leon Festinger), “male bonding” (Lionel Tiger), “tiger mother” (Amy Chua), “type A personality” (Meyer Friedman and Ray Rosenman). Half-credit for that last one, since it has two authors; half-credit also for “sabermetrics” (Bill James), not created by an academic but born of a scholarly bent all the same. Star professors don’t have the influence they once had, but there is still a vestige of the old fecundity there.


Stockholm syndrome

(1970’s | journalese? therapese?)

survivor guilt

(1970’s | therapese)

Two expressions that go with traumatic events. Both phrases existed in the seventies, largely as technical terms that might make an occasional appearance in the press but still felt specialized. The syndrome was indeed born in Stockholm during a bank robbery and hostage-taking in 1973, after which the hostages defended their captors and refused to help prosecute them. The phrase has been defined more rigorously but still goes with situations where the victim of some sort of kidnapping or forcible restraint comes to identify with the perpetrator (and possibly vice versa). It is not a recognized medical or psychiatric term, as far as I know. “Survivor guilt” (or “survivor’s guilt”), which is commonly used by mental health professionals, seems to have originated after the Holocaust to denote the pain felt by those who had escaped death at the Nazis’ hands but lost loved ones. The phenomenon — I would rather have died in their place; why not take me instead? — is much older. Several sources in Google Books say the expression was invented in 1961 by psychiatrist William Niederland; I also saw it attributed to Robert Jay Lifton writing of people who lived through the atomic bombings in Japan. As these examples suggest, it often went with mass death and destruction.

Despite the fact that both expressions move in fairly small and well-defined circles, they have undergone some broadening of meaning. “Survivor guilt” has seen a change in scale, so that it is readily applied to relatives of suicides or individual victims of violence, an evolution that seems inevitable when one considers the murder and suicide rates in the U.S. “Stockholm syndrome” seems to be slipping into a much broader meaning that sometimes has little discernible connection to literal captivity and subjugation. It may be invoked to explain why some people stay with others who are bad for them, whether in a job, a relationship, or a political alliance.

I yoked these phrases together in a single post because I sense a connection more multifaceted than simply trauma. They both describe states of mind that may begin while a threat is active and continue long after it is dispelled. Both have seen more use because of the pandemic, “survivor guilt” for obvious reasons and “Stockholm syndrome” as a somewhat imprecise explanation for popular acquiescence in mask mandates and other overreach (i.e., the people are embracing the oppressive government). I’m armchair psychologizing here, but they both partake of a kind of overcompensation for the loss of autonomy or companionship, an unusually powerful reaction to a deep feeling of powerlessness.

I have not seen “survivor guilt” used much in cases where one person has accidentally caused the death of another (as opposed to failing to prevent it). It often arises even in cases where no blame can reasonably be assigned to the survivor; just knowing you made it through while others didn’t may bring despair with it.


on the spectrum

(1990’s | academese (psychology))

To the extent that it has a literal meaning, “spectrum” refers to the full range of electromagnetic radiation, which we associate primarily with the narrow band of visible wavelengths — “spectrum” comes from a Latin word meaning “look,” after all — but which also includes all those waves that allow us to communicate via broadcasting, telephones, etc., etc. (Sound waves are not electromagnetic, but electromagnetic waves allow us to transmit sound great distances.) The more figurative use, which denotes any sort of continuum, was a twentieth-century phenomenon, according to the OED. By the seventies, talk of the political spectrum was common; the word also had military uses that have continued to grow (as in the chilling “full-spectrum dominance”). So when autism started getting a bit more attention around that time, still a rare and exotic disorder, it was already associated with the notion of a spectrum, meaning that autism encompassed a range of severity. Some kids had just a few characteristics; others had it full-blown and really couldn’t function in everyday society.

“On the spectrum” as a fixed phrase used to describe people made itself felt first in the nineties in child psychology journals, and it was available in mainstream outlets, though not particularly common, by 2000. Since then, it has flourished, and it’s hard to imagine there’s anyone out there who doesn’t know what it means. It may be used as a neutral modifier, as an insult, or even as an apology for someone’s (but probably not one’s own) behavior.

“Autism,” from the Greek word for “self,” is another twentieth-century term, and it first referred to a mental disorder of adults so wrapped up in themselves that they don’t notice other people at all (“idiot” at its root means something similar). By the forties (thanks, OED!), it had come to be used of children, often in the phrase “infantile autism.” We still think of it as a kids’ problem, although there are plenty of adults on the spectrum, often still cared for, or at least paid for, by their parents. The terminology changes frequently. “PDD” — pervasive developmental disorder — was a standard term in the nineties, but you don’t hear it much today. It applied to relatively “high-functioning” autistic people, as did Asperger’s syndrome, which we heard a lot about twenty years ago but is no longer formally recognized. The baffling nature of autism makes it necessary to arrive continually at new understandings that force new vocabulary into the lexicon.

Autism was even more mysterious when newspapers started writing about it in the late seventies. An Associated Press article described it as “a state of mind characterized by disregard of external reality. Autistic youths have great difficulty in communicating.” A pediatric specialist adduced typical symptoms: “a preoccupation with toys and other inanimate objects, a lack of desire to be held or cuddled, constant crying, or no crying at all, repetitious movements such as hand shaking, rocking and spinning, and head banging, and various speech and eating problems.” The other thing experts knew about autism was that it occurred predominantly in boys, and it was extremely rare. Sound familiar? It should, because with the exception of the “rare” part, it’s how we think of autism now. If it seems more widespread, that’s partly because we are paying more attention, but no matter how you count, a lot more people seem to be on the spectrum now than then. The new expression makes it easier to acknowledge that the syndrome and the damage it does may vary widely, rather than hurling every sufferer into the same ghetto, as “autism” unadorned does.


emotional intelligence

(1990’s | academese (psychology) | “sympathy,” “empathy”)

First we must pay homage to Daniel Goleman, who adopted this week’s expression for the title of a best-seller in 1995, vaulting it into everyday language. Psychologically speaking, his goal was to cast doubt on the primacy of IQ testing as a method for predicting success in life. He followed in the footsteps of Harvard psychologist Howard Gardner, who proposed several different types of intelligence, each playing an important role, of which IQ represented only one. Goleman’s work was a summation of research that had been going on for at least a decade among psychologists, neuroscientists, etc., and an unusually effective popular treatment of recent science. He was also concerned with childhood development, attempting to prove empirically that children turn out better if they are taught means to deal with and mitigate their emotional reactions, making them more likely to avoid major trouble and relate well to their peers. His focus on education tended to disguise a strong self-help tendency in his popular writing; he seemed to be trying to start a movement. To some extent, he has: there are now a number of tests that measure “EQ,” and emotional intelligence has become a familiar concept, denoting what we used to think of as skill at reading expressions, gestures, and tone of voice, and a willingness to use it.

But that’s not the whole story of this phrase; it had two other uses in the mid-nineties. One, which turned up most often in reviews of the performing arts, denoted the ability to convey a character’s emotions, credited primarily to actors and singers. (That meaning seems to have lapsed.) The other, closer to Goleman’s, had mainly to do with grasping and responding to the emotions displayed by others; whereas Goleman emphasized understanding and controlling one’s own emotional response, other early adopters of the expression made more of looking outside oneself. This distinction may also be observed by introducing the notion of social intelligence — understanding others — in contradistinction to emotional intelligence — understanding oneself. Actual people who boast one attribute are likely to have the other, it is true, and Goleman argued that the emotionally intelligent (in his sense) did better because they played better with others, suggesting that their sensitivity stretched beyond their personal boundaries.

It seems to me that by now the outer-directed sense of emotional intelligence has won. The term has long since outgrown the psychology ghetto and is common all over the lot, including sportswriting and political reporting. Philosophers of business have made a near-fetish of it (as they did, twenty years ago, with a closely related concept, “interpersonal skills”). Today’s business coaches laud emotional intelligence, meaning roughly “ability to fend off drama queens and divas and make everyone else feel less oppressed.” Buffing up your emotional intelligence will make you a better leader and turn your employees into obedient little gnomes. The business press thrives on this sort of thing; every year a new panacea that will make every lousy boss into a good one. And every year, the preponderance of bosses fail to follow the sensible advice of management gurus, which is a darn shame, except it means the bosses will continue to require their expensive services. It’s the employees who won’t get anything out of it.

Business apologists do glom onto expressions that make the boss look better while doing little to improve actual performance. “Mindfulness” and “wellness” have certainly gone that route, while “who moved my cheese?” also deflects responsibility for major disruptions of employees’ lives. Now “emotional intelligence” takes its turn. The phrase conveys increased sympathy and humane attitudes toward employees, but books are written about emotional intelligence because it benefits employers at their expense. Yes, your employees will be happier — because you have become more adept at manipulating them. When executives turn their attention to the wider world, “downsize,” “go green,” “outsource,” and “win-win” treat the rest of us the same way, using euphemisms or feel-good phrases to avoid or disguise harmful policies and acts.
