
Lex maniac

Investigating changes in American English vocabulary over the last 50 years

branding

(1980’s | businese | “marketing (strategy),” “image”)

You are wondering about the connection between branding oneself or one’s organization and branding cattle, so I will tell you. They are both ways of marking salable property. Your brand is the quality, whatever it may be, that sets you apart from the competition, just as branding a calf designates it exclusive property. A possible intermediary would be “brand” used as a verb meaning “accuse someone of being,” as in “he branded his opponent a liar.” (The occurrence of “as” in between the object and the article was already possible in 1980, though perhaps less common then.) “Branding” in this sense is the act of pinning a disagreeable attribute on someone, but “brand” does not refer to the scarlet A that dogs the victim (à la Hawthorne). Nowadays, individuals and organizations improve their brands — which would have sounded very strange back on the range — in order to increase their appeal, rather than repulse customers. It is the flavor or feature or je ne sais quoi that renders them more worthy of the sacred ritual of opening the wallet. It must be tenderly nurtured and aggressively developed, with much overtime and expensive consultation.

Not just for-profit businesses; universities, foundations, hospitals, even nations are expected to burnish their brands in order to attract more people and make themselves more relevant — that is, closer to the money spigot. Just as IT departments became necessary a generation ago, branding consultants (or in-house staff) are now de rigueur for any business serious about staying in business. Anything an organization does to increase its status or revenue might qualify as a branding venture. For now, at least, it remains grounded in consumer behavior; the true measure of branding success is consumer appeal. Thus such projects tend to take on an anxious or abject tone; consumers are capricious gods whose whims must be catered to in order to part them from their money. Americans have seen a steady erosion of their political power for a century or more. To some extent it has been replaced by consumer power, but consumers don’t get to hire and fire corporate executives.

“Brand” and “branding” broke new ground in the eighties; it was rare before then to see either term as we use it now. By 1990 they both showed up regularly in the business press, though not perhaps in everyday vocabulary. One team of researchers defined “brand” as consisting of three components: “physical make-up, functional characteristics, and characterization — i.e., personality.” “Branding” goes with words like “messaging” (conveying a selling point) and “positioning” (proving yourself superior to the competition). “Brand” meaning simply “name of manufacturer” or “name of particular product” has been superseded, though it has not disappeared. It’s not enough to be Heinz or Kleenex any more. Heinz and Kleenex have to get out there every day and prove they’re better, or at least more compelling. You can’t just maintain a good reputation and rest your name on it. You have to build, respond, and work, work, work to make sure you remain irresistible.

There seems to be a strong tendency in corporate America to find or create new methods and theories of improving sales or employee retention or customer loyalty (cf. the recent entry on “emotional intelligence”). They don’t all involve branding directly, but they do involve purchasing books, hiring consultants, and supporting researchers who seem more and more like a parasitic class, feeding off their high-powered hosts and justifying it by dispensing advice that doesn’t — and can’t — work most of the time, because in most fields winners must always be in the minority. Even if you follow your consultant’s report to the letter, it probably won’t improve your market share much. But another consultant will come along next year, and you’ll have to shell out for that one, too. It’s just another way to make the money trickle down, I suppose, but one can’t help wishing that all these corporate geniuses would put a bit more effort into innovation and investment than into convincing us by more or less fraudulent and manipulative means that we should buy their product. Maybe it will turn out that the best long-term branding strategy is finding a gizmo nearly everyone uses and making it better than anyone else can. But it’s a lot easier to talk about what the logo should look like and where it should go than to re-envision the entire chain of people and duties required to improve the merchandise. The point is not the product; it’s your ability to convince the gullible to pony up. I’m beginning to think we should put P.T. Barnum on our money, not George Washington or Harriet Tubman.

This is the five hundredth expression I have written about, assuming I’ve counted correctly. I encourage everyone to head over to the alphabetical entry list and look around to see if I’ve covered a favorite expression, or a pet peeve. If so, comment! If not, send it in (usagemaven at verizon dot net).



age in place

(1980’s | therapese? | “stay put”)

“Age in place” has social and personal dimensions, referring to entire communities or single dwellings that contain upper-middle-aged people who must decide whether or not to move away. When they decide to remain at home or at least in the same town, they age in place. Retirement communities — because they blur the line between home and neighborhood — allow you to do both by moving from your house into an apartment while you can still look after yourself, and then rise through the assisted living ranks as your capacities dwindle. So without leaving the area, you create a new home to live out your life. It seems to me that most of the time when the phrase comes up now, it refers specifically to remaining in one’s own home through one’s retirement years.

Since the eighties, polls have shown consistently that most older people want to stay where they are rather than move away. It was assumed back then that baby boomers would not be content to age in place, but so far they seem to be. The continuing preference for growing old in familiar surroundings probably says more about the nature of the elderly than about demographics or American culture. Or simply the fact that moving is a hell of a lot of disagreeable work. It may be the strength of the roots you’ve put down, or it may just be inertia; either way it adds up to aging in place.

This week’s expressions have some significant forebears, most notably “stay in place” and “run in place.” Aging and sheltering in place capture the same refusal to pull up stakes. If you prefer your prepositions accusative, “snap into place” or “lock into place” are for you (both could function transitively or intransitively). Finally, I can’t help but hear an echo of “rest in peace” when I encounter this expression. Doesn’t matter how long you age in place; some day you’ll rest in peace. In your final resting place.

shelter in place

(1990’s | businese? bureaucratese? | “ride it out,” “hunker down”)

I believe we owe this expression to the good people of West Virginia, or maybe it’s the bad people. Starting in the 1980’s, as far as I can tell, chemical industry spokespersons began introducing the phrase to answer the question, “What do I do if a huge cloud of poison gas is enveloping my house?” A leak or explosion at a chemical plant is a big deal anywhere, and there’s a particularly high density of chemical manufacturing in those parts. The industry representatives were in the awkward position of spending half their time explaining why leaks and explosions couldn’t possibly happen, and the other half explaining what to do when they did. The vast majority of sightings of this phrase, well into the 1990’s, come from West Virginia newspapers or press releases, as far as LexisNexis is concerned. It is very unusual for a new expression to arise so exclusively from a particular state.

“Shelter in place” can be a verb, a noun phrase, or an adjective, as in “shelter-in-place drill.” It means more than just stay in your house and hope for the best. Paul Hill, president of the National Institute for Chemical Studies, put it this way in 1994: “Go inside the nearest structure and into a room with no or few windows. Pets should be brought indoors. A radio or television should be turned to a local Emergency Broadcast Service station for information and directions. If the emergency involves hazardous materials, heating and cooling systems and fans should be turned off, windows and doors should be shut and cracks covered with wet rags or tape. If directions call for protected breathing, the nose and mouth should be covered with a wet cloth. Wait for an ‘all-clear’ signal. In addition, residents should stay calm and stay off the phone.” Nowadays, when the governor tells residents of hurricane-prone beach towns to shelter in place, there’s more emphasis on boarding up windows and lashing everything down, but the idea is the same: retreat into your house and make it as impenetrable as possible, turning it into a temporary fallout shelter. The net result is rather like “lockdown” as it might be practiced in less densely populated areas.

I suppose the cynic in me hears an echo of that lovely expression I learned in my youth, “Bend over backwards and kiss your ass good-bye.” When the authorities tell us to shelter in place, we’re on our own. If it gets bad, we’re stuck.


disruptive

(1990’s | businese? athletese? | “shaking things up,” “causing a stir”)

A word of long standing, but when did it take on a favorable connotation? Not everywhere, of course, but executives use it approvingly now, unthinkable in the days of Henry Ford or even Lee Iacocca. Successful corporations have traditionally avoided boat-rocking and sought the even keel, but now executives congratulate each other on their disruptive business practices. It is not solely a matter of hobbling the competition; a certain amount of disruption is tolerated within the organization if it keeps employees on their toes, for example, or pushes a complacent division into activity. The buttoned-down set seems to have loosened their vests.

The first occurrences in the press that I found date from the late nineties, a few from far-sighted business gurus but more from coaches describing the defensive unit, particularly in football and basketball. (Often it applied to a single defensive player.) I couldn’t guess which source influenced the other, but there’s nothing new about businessmen borrowing vocabulary from athletes — in this case, giving it more of an offensive than a defensive cast. By 2010 the word was ordinary in business contexts. Nowadays artificial intelligence and business models or strategies attract the label “disruptive.”

It’s a very forward-looking buzzword, associated with innovation, technology, and improved corporate management. Senior executives sling it around confidently, extolling the virtues of novelty and adroit exploitation of one’s strengths, or just crowing about how they’re going to mess with their competitors. There’s the usual tension between the goal of making the world a better place (if only for p.r. purposes) and simply extracting greater profit from it.

“Disruptive” is close to a newer expression — “game-changing” — and an older one, “revolutionary.” But these are both stronger than “disruptive,” which encompasses lesser shocks to the system. You can be disruptive without altering the playing field permanently or overthrowing an old order. It reminds me of Joseph Schumpeter’s notion of “creative destruction,” a hallmark of capitalism, which requires not just that single enterprises should fall so that better ones might rise, but that the rules of doing business, or other received wisdom, must fall to the new and improved. (Schumpeter believed strongly in innovation and entrepreneurism, by the way.) In today’s world, disruptive tactics are mainly intended to weaken or drive out competitors, but getting rid of rivals was always part of the entrepreneur’s toolbox. The fine talk of less able businesses fertilizing their successors didn’t disguise the fact that Schumpeter was merely peddling social Darwinism dressed up as economic law — yet another instance of trahison des clercs.

We owe this week’s expression to Will from Paris, a first-rate student of the language and a damn fine host to boot. He says, based on recent dealings with the corporate set, that this word will soon take over the world, and Lex Maniac wants nothing more than to get in on the rez-de-chaussée. Merci!


lockdown

(1970’s | legalese? | “quarantine”)

A grim word. Before 1970 or so, “lockdown” pertained to hardware, describing a mechanism that held something firmly in place. During the seventies, lawyers and prison wardens began using the term to talk about a way to control prisoners by confining them to their cells, forbidding gatherings, visits from the outside, etc. The usage became standard quickly, commonplace in the mainstream press by the mid-eighties. At some point in the nineties, the word kicked over the traces and spread to other contexts, anywhere there was unrest (just as prison lockdowns were a typical response to riots or smaller-scale violence). Some incidents in the late nineties in particular gave the word a boost — the Columbine High School shootings, the WTO protests in Seattle — each of which drove a spike in sightings of “lockdown.” It was already shifting from something imposed by corrections officers to something enforced by police. At the same time, lockdowns took on the flavor of safety and security rather than punishment.

It was arguably a safety measure even when first discussed. (What was the old word for it? Was there one?) To us, “lockdown” suggests an entire building or at least a wing, but in the seventies, it was not unusual for a single prisoner to be put in lockdown (solitary confinement) if they got a little too crazy. The whole premise of prison is that you get put away in a holding pen, away from society, and that’s just another level — prison squared. But soon lockdown became a much more general affair, imposed on hundreds of prisoners at a time. That does have to do with safety, of the guards if nobody else. But as in the case of a single prisoner, it’s very easy to confuse with retribution. When you lock down a school, a civic building, or a whole neighborhood because there’s a killer roaming loose nearby, we’re all supposed to have a warm feeling, like everyone is doing their job and protecting the kids from harm.

We already used “lockup” as a synonym for “jail” — for some reason, you don’t hear “lockup lockdown” — if we hadn’t, “lockup” might have become the accepted term instead. I don’t know exactly why, but “lockdown” works better somehow. It sounds more drastic, more final than “lockup,” and therefore better suited to widespread danger and panic. (Cf. “shutdown,” “breakdown,” or even “patdown.”)

The spread of “lockdown” to hospitals, hotels, or even entire cities demonstrates two things. One is that lockdown is primarily a response to contagion, whether of violence or disease. That’s why it sounds strange when sportswriters use it to describe an outstanding defensive player; we understand but it sounds a little off somehow. But the continuing creep of the term into other fields (itself a form of contagion) reveals the seductiveness of the concept. Here’s an easy way to prevent harm to the defenseless, and who wouldn’t be for that? The fact that it also represents an expansion of power — of government or administrators of private institutions — doesn’t seem so important against the backdrop of pious evocation of security for all. Pretty much everyone would agree that lockdowns are at least occasionally necessary to prevent dangerous situations from getting completely out of hand at prisons, hospitals, or schools. But how often? Should we carry them out as preventive measures rather than as responses to unfolding crisis? Is it true that the more lockdowns that occur within a society, the more authoritarian it becomes?


side hustle

(2000’s | African-American | “(little) thing (one does) on the side,” “second job,” “moonlighting”)

Hustle: a rich-hued word with a history. It goes back to the seventeenth century, when it generally meant “shake” or “jostle” (an echo of which survives in the old-fashioned phrase “hustle and bustle”). It comes, in fact, from a Dutch word meaning “to shake.” By the turn of the twentieth century, it bore several of the meanings we recognize today: move hurriedly, act fast and/or cleverly, (as verb) to sell, including one’s own body. According to Lighter’s slang dictionary, it did not start to mean “(do) something underhanded” or “attempt to deceive” until the 1940’s. Among African-Americans, it has meant “side source of income,” legal or otherwise, since the seventeenth century, according to Major’s slang dictionary. You can see how they all fit together, but it’s that African-American usage that has survived in today’s set phrase, “side hustle.”

Since it resolutely fails to earn me a dime, the blog does not count as a side hustle. Lovely Liz from Queens, on the other hand, has one that she’s very good at: helping your book resolve its issues. If you’re having problems writing a book — any kind at all — she works through them with you.

When it began to appear in the press around 2000, “side hustle” sometimes suggested shady or criminal activity, but it has since become quite respectable. A side hustle involves work for the purpose of getting paid, but it isn’t quite the same as a second job as we used to think of it, with predictable hours and a well-defined work site. The expression’s primary referent has evolved a bit over time; ten years ago, a side hustle most often resulted from a particular talent or interest — music, baking, embroidery — that someone was able to monetize. After the 2008 crash and the explosion of the gig (or freelance) economy, it more often refers to driving an Uber, selling your work on Etsy, or starting an on-line business; the whole thing has become quite a bit more prosaic.

I’m afraid “side hustle” has joined the vocabulary of the apologists and cheerleaders, the people who write books in which the first premise is that when employees are getting shafted, it’s not because of the executives. A veritable horde of scribblers counsels employees to accept and deal with whatever the boss throws at them, but not to figure out why it’s happening and how to prevent it. I won’t belabor this, since I already went into it last week, but “side hustle” is one of the rentier class’s favorite responses to downsizing, or forty years of wage stagnation. (The phrase appeared in a Small Business Administration press release last year; even the federal government has gotten into the act.) You could also organize and force the bosses to pay more and offer better conditions, but hardly anyone writes books about that, and they rarely make the best-seller lists.

You ask why I’ve been attacking big business and entrenched wealth lately. What’s the occasion? Are they acting any worse than they did last year, or ten years ago? (Not really.) Well, some things never go out of style, and a laser focus on collusion between government and moneyed interests grows more necessary by the year. Thanks to the latest Republican tax cut, your average American corporation is sitting on a mountain of cash, enriching a few dozen people beyond dreams of avarice and investing precious little of it, certainly not in salaries. Meanwhile, large majorities of hard-working people scrape by as they have for decades now, one mishap away from penury. The elementary failure to redistribute wealth downward cripples our political system, leading to the crude parodies of popular deliberation that our presidential elections have become, and the utter and unremitting failure of government at all levels to do the people’s business, or to act in the public interest at all, from maintaining and building infrastructure to making parental leave an unquestioned right. Now the oligarchs have spat Trump up on the shore of our democracy, daring us to take it. How long?


emotional intelligence

(1990’s | academese (psychology) | “sympathy,” “empathy”)

First we must pay homage to Daniel Goleman, who adopted this week’s expression for the title of a best-seller in 1995, vaulting it into everyday language. Psychologically speaking, his goal was to cast doubt on the primacy of IQ testing as a method for predicting success in life. He followed in the footsteps of Harvard psychologist Howard Gardner, who proposed several different types of intelligence, each playing an important role, of which IQ represented only one. Goleman’s work was a summation of research that had been going on for at least a decade among psychologists, neuroscientists, etc., and an unusually effective popular treatment of recent science. He was also concerned with childhood development, attempting to prove empirically that children turn out better if they are taught means to deal with and mitigate their emotional reactions — more likely to avoid major trouble and relate well to their peers. His focus on education tended to disguise a strong self-help tendency in his popular writing; he seemed to be trying to start a movement. To some extent, he has succeeded: there are now a number of tests that measure “EQ,” and emotional intelligence has become a familiar concept, denoting what we used to think of as skill at reading expressions, gestures, and tone of voice, and a willingness to use it.

But that’s not the whole story of this phrase; it had two other uses in the mid-nineties. One, which turned up most often in reviews of the performing arts, denoted the ability to convey a character’s emotions, credited primarily to actors and singers. (That meaning seems to have lapsed.) The other, closer to Goleman’s, had mainly to do with grasping and responding to the emotions displayed by others; whereas Goleman emphasized understanding and controlling one’s own emotional response, other early adopters of the expression made more of looking outside oneself. This distinction may also be observed by introducing the notion of social intelligence — understanding others — in contradistinction to emotional intelligence — understanding oneself. Actual people who boast one attribute are likely to have the other, it is true, and Goleman argued that the emotionally intelligent (in his sense) did better because they played better with others, suggesting that their sensitivity stretched beyond their personal boundaries.

It seems to me that by now the outer-directed sense of emotional intelligence has won. The term has long since outgrown the psychology ghetto and is common all over the lot, including sportswriting and political reporting. Philosophers of business have made a near-fetish of it (as they did, twenty years ago, with a closely related concept, “interpersonal skills”). Today’s business coaches laud emotional intelligence, meaning roughly “ability to fend off drama queens and divas and make everyone else feel less oppressed.” Buffing up your emotional intelligence will make you a better leader and turn your employees into obedient little gnomes. The business press thrives on this sort of thing; every year a new panacea that will make every lousy boss into a good one. And every year, the preponderance of bosses fail to follow the sensible advice of management gurus, which is a darn shame, except it means the bosses will continue to require their expensive services. It’s the employees who won’t get anything out of it.

Business apologists do glom onto expressions that make the boss look better while doing little to improve actual performance. “Mindfulness” and “wellness” have certainly gone that route, while “who moved my cheese?” also deflects responsibility for major disruptions of employees’ lives. Now “emotional intelligence” takes its turn. The phrase conveys increased sympathy and humane attitudes toward employees, but books are written about emotional intelligence because it benefits employers at their expense. Yes, your employees will be happier — because you have become more adept at manipulating them. When executives turn their attention to the wider world, “downsize,” “go green,” “outsource,” and “win-win” treat the rest of us the same way, using euphemisms or feel-good phrases to avoid or disguise harmful policies and acts.


emotional baggage

(1970’s | therapese | “emotional scars,” “trauma”)

At least in the seventies, when “emotional baggage” wormed its way into demotic language, it could be the property of persons, as it normally is now, but it might also trail along behind a political issue, analogous to what an older generation would have called “freight.” So certain matters of public policy — abortion, capital punishment, affirmative action, anything a lot of people get worked up about — were said to have emotional baggage. Today I think that such usage would sound rather odd, though the meaning would not be unclear. When pundits rather than therapists resorted to the phrase, it took on a patronizing cast, indicating that all those simpletons needed to calm down and let the experts analyze the issue dispassionately. One wished to set it aside or get rid of it entirely. That’s true of emotional baggage bogging down an individual, too, but the tone is usually more sympathetic. One’s demons are presumed difficult, and even unsuccessful efforts to cast them out are deemed worthy. It is dangerously easy to recognize and cluck over others’ emotional baggage even as we go right on tripping over our own.

Other common phrases bearing “baggage”: “personal baggage,” which weighs down politicians in particular — past statements and votes, but more juicily, their peccadillos, magnadillos, or killerdillos — Ted Kennedy had a lot of it, for example. “Mental (or intellectual) baggage” also holds you back, but specifically because it consists of outmoded preconceived notions (cf. Wordsworth’s “creed outworn”). Emotional baggage treads the same path — it gets in your way AND takes its lessons from past experience that need not apply to your present or future — yet you continue to carry it with you.

The common denominator of “baggage” is that which weighs you down, but its earliest figurative uses encompassed other meanings. The earliest seems to have been “prostitute” — from Shakespeare’s time — later it went on to mean “saucy young woman,” which persisted into our era. But it could also mean “worthless man” or “nonsense,” neither of which corresponds very well to how we use it now. “Baggage” meaning “impediment” goes back at least to the late seventeenth century and has an extensive historical pedigree. Its most familiar avatar in the twentieth century was probably “excess baggage,” used to denote whatever people or things slow us down or get in the way: could be family, past history, or whatever you’re unable to cast aside. The word has never lost its negative connotations when used metaphorically, but they became less venomous somewhere back there. “Baggage” has a more complicated history than you might suspect, but by now certain strands have crowded out the others, and most old associations of “baggage” seem unlikely to return.

Further usage note: Something immutable, like genetic heritage, would not generally be called “baggage.” “Baggage” is not exactly voluntary, but the implication persists that we can get rid of it, or at least work around it, if we want to bad enough.


zero-sum game

(1970’s | academese (mathematics) | “winner take all”)

An expression that’s actually a bit old by Lex Maniac’s standards, “zero-sum game” was well-established by 1980 in political and economic journalism, and it retains a technical or bureaucratic flavor to this day. Its origins lie in mathematics, specifically game theory. In a zero-sum game, the gains of one side must be exactly matched by the losses of the other(s), so when you add the two together, you get zero. It might come up when finite resources are at stake, or to talk about election results or currency trading. A notable feature of this expression: the frequency with which it was glossed when it started to show up in the mainstream press in the 1970’s. As I remarked recently, most new expressions come with explanations some of the time, but virtually every instance of “zero-sum game” yielded by LexisNexis from the seventies included some sort of definition, even if only rough-and-ready. Certain terms draw our attention as they enter the discourse because they are glossed either rarely or nearly always (most new expressions fall somewhere in between, making the extremes noticeable). I can’t divine any shared characteristic that accounts for either state.
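
A quick worked illustration of that definition, my own addition rather than anything quoted above: the payoff labels u_1 and u_2 and the matching-pennies game are standard textbook apparatus, not terms from the column or its sources.

\documentclass{article}
\usepackage{amsmath} % for \text inside math
\begin{document}
% Illustrative sketch only (not from the post): the defining condition of a
% two-player zero-sum game, then matching pennies, the textbook example.
\[
u_1(s) + u_2(s) = 0 \qquad \text{for every strategy profile } s
\]
\[
\begin{array}{c|cc}
 & \text{Heads} & \text{Tails} \\
\hline
\text{Heads} & (+1,\,-1) & (-1,\,+1) \\
\text{Tails} & (-1,\,+1) & (+1,\,-1)
\end{array}
\]
\end{document}

Each cell lists the row player's payoff first and the column player's second, and every cell sums to zero. A voluntary trade that leaves both sides better off, say (+1, +1), sums to +2 instead, which is exactly the sense in which such situations are not zero-sum games.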

Like every economic or social science model, the zero-sum game is a simplification of what goes on in real life — a way of reducing complicated situations to a small number of “essential” characteristics, which makes solving the equations much easier. Sometimes the simplifications clear away irrelevancies and point the way to a clear answer. More often, they leave out significant factors and present a misleading picture of the underlying issues. It is important, in other words, to know how to recognize when the zero-sum game makes a good approximation of the problem at hand, and when it misrepresents it in more or less crucial ways. For the temptation to resort to zero-sum analysis is powerful, particularly among those who take a harsh view of society and human relations. It’s a great tool for social Darwinists — those who see human culture as an arena in which the quick and strong trample the slow and weak, figuratively if not literally — because the zero-sum game demands winners (the fittest) and losers (everyone else). The zero-sum approach is commonly equated with negotiating methods that emphasize imposing losses on the other party, rather than trying to give both sides part of what they want. Donald Trump is often derided, with some justice, for treating certain issues — immigration and trade come instantly to mind — as zero-sum games when both theory and experience show that they are not.

The zero-sum game does best in discussions of athletic or gambling competitions, where the winning and losing sides are easy to discern. (Note, however, that in sports such as golf and auto racing, the concept is less useful, because there are a number of participants in the prize pot, so finishing first does not knock everyone else out of the winners’ circle.) But athletic competition is itself a simplification that creates an arena in which we can sail past the immense complexities of everyday life and root wholeheartedly for our side, without equivocal undercurrents. That makes the zero-sum game a simplification of a simplification — that is, a distortion of a distortion — two removes from what happens in the real world, even when it looks like a good match for the zero-sum model. We need models, but we also need means of measuring their results and recommendations against what’s going on outside. Otherwise it’s easy to make progressively worse decisions until it all ends in catastrophe.


experiential retail

(1990’s | businese | “something extra for the customer”)

When Lovely Liz from Queens is unfamiliar with an expression, it must be assumed to be generally unfamiliar, so I will define “experiential retail” in simple terms. It’s when the store gives the customer stuff to do besides shop. I’ve only started noticing the phrase in the last few years, but it arose among theorists of selling by the late nineties, though it doesn’t seem to have become ordinary until at least a decade after that. The giant keyboard at FAO Schwarz (installed in 1982, sez Wikipedia) is an example of the phenomenon before there was a word for it; the famed toy store was an early exponent.

The innovation of FAO Schwarz and its followers lay in taking the kind of service that had been available only to the rich and placing it within reach of the middle class. In the old days, only rich people got to shop in stores with fake waterfalls or what have you. (According to Shopping Center World in May 2000, a Nike store offered “celebrity athlete appearances, a motion simulator ride, viewing/listening stations, and a Ticketmaster outlet, along with other intangibles such as its ‘Stay-in-School’ program.” Other oft-cited early exemplars: the Warner Bros. Studio store, the Disney Store, American Girl stores.) An oversize synthesizer you can play tunes on with your feet? Maybe in an exclusive store for people with real money, but no middle-class kid had a shot at such a thing before FAO Schwarz took the plunge. Give the customers something that will cause them to buy more this trip and come back more often. Of course, if nobody wants your stock, all the experiences in the world won’t help.

The most common mode of experiential retail is amusement, normally bearing at least a tenuous relationship to the merchandise. It’s mostly about offering shoppers self-indulgence — though self-improvement is sometimes touted. In the early days it could be as simple as a few television screens; games and other activities were popular ways of livening up the shopping experience. A store like Build-a-Bear turned the product into an activity, giving children and their parents a chance to customize a stuffed animal. The emphasis fell on big, gaudy displays intended to impress as well as influence, and when there was money behind them, the displays could get pretty impressive.

The late nineties was when on-line shopping started to make itself felt. (There had been many ways to buy stuff without leaving home before the internet came along, of course.) Experiential retail was one response; brick-and-mortar stores had to distinguish themselves from their on-line competitors, partly in order to justify higher prices. A fine example of a new term arising in direct connection with a development in the culture.

Twenty years later, as the great middle-class department stores like Sears and J.C. Penney are dying, experiential retail looks like a desperation tactic that failed to stem the on-line tide. The old dinosaurs were probably doomed anyway, but being forced to spend more money to lure shoppers who were spending less and less couldn’t have helped their bottom lines. It makes people like me sad to see those anchors of my suburban youth go. The market has no sentiment, but the little consumers who fuel it do, and the market must make room for our quirks if it wants our money.


fatberg

(2010’s | journalese?)

This word has started popping up frequently in New York City, where the government has launched a campaign against this particular urban blight. For those of you fortunate enough to live in rural areas, a fatberg (“berg” as in “iceberg”) is a mass formed of cooking fat and other stuff — baby wipes are common — in the sewers large enough to obstruct the flow (examples here). A Briticism, the expression was pressed into service in 2013 when the things made their presence felt around the United Kingdom, notably in Kingston in August, when a fifteen-ton fatberg the size of a London bus was discovered just in time, before it caused raw sewage to pour out through the manholes. That occasion spread the term all over the English-speaking world and introduced it rather precipitately to the mainstream. It had shown up before in the British press, often in articles about London entrepreneurs who aimed to convert fatbergs into energy with some good old British pluck and ingenuity. The lads intended to salvage the grease, truck it off to the biodiesel plant, and turn it into kilowatt-hours. Makes me glad I have a desk job.

Presumably fatbergs result from changes in our flushing habits along with increased population and antiquated sewers. More products — mainly wipes and cat litter — advertise themselves as flushable nowadays, and disposal of used grease — known in the trade as “FOG,” which stands for “fats, oils, and grease” — is the same problem it’s always been. Most municipalities do not have household-level collection of oil and fat, which is laborious and probably would not pay for itself. Which means that, as it always seems to be in America, the landfill is the last resort; if you don’t know what else to do with it, throw it away. Even environmentally conscious New York City, which in some areas does actually collect food waste, including bones, grease, etc., must tell residents of other neighborhoods that they should throw used cooking oil away instead of dumping it down the drain. The city has inaugurated an educational campaign (Dave Barry is not making this up, and neither am I) that instructs residents to flush only the four P’s: pee, poop, puke, and paper. They forgot phlegm.

Fatbergs do seem to have caught our imagination. There’s a children’s book and a prospective West End musical (grease is the word); the Museum of London had a piece of a fatberg on display for a while (it was sealed in a plastic container, so you couldn’t smell it). Two guys in Amsterdam are building a fatberg in the sea. Part of our fascination stems from horror or disgust. Yet there’s also an element of pride in our ability to generate waste and create vast subterranean messes that someone else has to clean up — because there’s no way to get rid of a fatberg except to send people down there to hack away until it no longer clogs the pipe. They are born at the intersection of consumerism and hygiene, the repressed dark side of our unremitting consumption and the waste that goes with it.

“Fatberg,” like “workaholic,” is formed with an affix that is otherwise nonsensical, yet we understand it right away (cf. “Jumbotron,” “McMansion,” and “robocall”). While it was not born at an ascertainable time and place, as “irrational exuberance” was, it went from zero to sixty in a hurry.
