Lex maniac

Investigating changes in American English vocabulary over the last 50 years

Tag Archives: innovation

business model

(1980’s | businese | “business plan,” “grand strategy,” “big idea”)

Putting a business model into practice requires a lot of attention to detail, but the business model itself doesn’t. It can usually be summed up in a few sentences, a statement of general means to achieve broad goals, or a couple of concepts connected loosely with a method of bringing them about. These are not the fiendishly complex models of the hobbyist; they’re like economic models that set up a highly simplified map of how money moves around, intended to make reasonably accurate predictions about real life. Business models look to the future, and they are subject to change; executives must recognize when they need a new one, lest the firm fail or fall behind. A start-up might boast of future profit infallibly brought to pass by their business model, while an established concern is more likely to tout a business model that is serving them well in the present and doesn’t need changing, thank you very much.

In everyday use, the expression is pretty casual, but there is a bit more to it. A 1990 definition in Computerworld magazine broke it down thus: “[Business] models, generally developed by a data administration unit, describe current and planned business activities and the related information requirements. A model is typically a four-level hierarchy that identifies business functions, the processes within each function, the activities within each process and the information needed to accomplish each activity.” At some level there must be attention to detail, even in the most devil-may-care industries.
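To make that four-level hierarchy concrete, here is a minimal sketch in Python of the structure the Computerworld definition describes: functions containing processes, processes containing activities, and activities listing the information they need. Every name in it is invented for illustration; nothing below comes from the magazine.

```python
# A minimal sketch of the four-level hierarchy in the Computerworld definition:
# business functions -> processes -> activities -> information needed.
# All of the names below are invented for illustration.

business_model = {
    "Order Fulfillment": {                       # business function
        "Process Customer Orders": {             # process within the function
            "Validate Order": ["customer record", "credit limit"],       # activity -> information
            "Schedule Shipment": ["inventory levels", "carrier rates"],
        },
        "Handle Returns": {
            "Authorize Return": ["original order", "return policy"],
        },
    },
}

# Walk the hierarchy, printing each level with increasing indentation.
for function, processes in business_model.items():
    print(function)
    for process, activities in processes.items():
        print("  " + process)
        for activity, information in activities.items():
            print("    " + activity + " -- needs: " + ", ".join(information))
```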

Before 1990, “business model” also meant something else and was more likely to be paired with institutions of higher learning, or perhaps government agencies and even individuals. The idea was “act like a business,” that is, subscribe to the reigning corporate nostrums and show no regard for employees. If you did that, the financial poobahs would congratulate you on following a business model. That use remained in play into the nineties, which is when today’s understanding of the term took over. The tech companies dragged it into prominence; the computer industry seems to be the first that was generally expected to produce “business models.” That’s probably because new firms can’t attract funding without one, and computer start-ups were a dime a dozen in those days. Also because, with a few off-the-charts exceptions, most computer companies have never quite figured out how to be profitable, even with an enthusiastic customer base and lots of love from investors. (In that respect the tech industry resembles American society in general, where a tiny minority is staggeringly prosperous while the vast majority does its best many levels below.) But even an unsuccessful venture may pull the wool over the eyes of investors long enough to relieve them of their money and give the principal shareholders time to grab the capital and run.

For the business model is the blueprint for making money. It helps if it has been proven by others; if it’s untried, you’d better have a good line of patter to back it up. A common way to disparage an enterprise is to say that its business model is the same as that of another company that failed or is in the process of failing. But how much success can be traced back to the business model? If the product’s no good, the business model won’t save it; if your employees don’t do their jobs, your big idea won’t go far. If the theory isn’t put into practice effectively, it doesn’t matter how good it is.

disruptive

(1990’s | businese? athletese? | “shaking things up,” “causing a stir”)

A word of long standing, but when did it take on a favorable connotation? Not everywhere, of course, but executives use it approvingly now, unthinkable in the days of Henry Ford or even Lee Iacocca. Successful corporations have traditionally avoided boat-rocking and sought the even keel, but now executives congratulate each other on their disruptive business practices. It is not solely a matter of hobbling the competition; a certain amount of disruption is tolerated within the organization if it keeps employees on their toes, for example, or pushes a complacent division into activity. The buttoned-down set seems to have loosened their vests.

The first occurrences in the press that I found date from the late nineties, a few from far-sighted business gurus but more from coaches describing the defensive unit, particularly in football and basketball. (Often it applied to a single defensive player.) I couldn’t guess which source influenced the other, but there’s nothing new about businessmen borrowing vocabulary from athletes — in this case, giving it more of an offensive than a defensive cast. By 2010 the word was ordinary in business contexts. Nowadays artificial intelligence and business models or strategies attract the label “disruptive.”

It’s a very forward-looking buzzword, associated with innovation, technology, and improved corporate management. Senior executives sling it around confidently, extolling the virtues of novelty and adroit exploitation of one’s strengths, or just crowing about how they’re going to mess with their competitors. There’s the usual tension between the goal of making the world a better place (if only for p.r. purposes) and simply extracting greater profit from it.

“Disruptive” is close to a newer expression — “game-changing” — and an older one, “revolutionary.” But these are both stronger than “disruptive,” which encompasses lesser shocks to the system. You can be disruptive without altering the playing field permanently or overthrowing an old order. It reminds me of Joseph Schumpeter’s notion of “creative destruction,” a hallmark of capitalism, which requires not just that single enterprises should fall so that better ones might rise, but that the rules of doing business, or other received wisdom, must fall to the new and improved. (Schumpeter believed strongly in innovation and entrepreneurism, by the way.) In today’s world, disruptive tactics are mainly intended to weaken or drive out competitors, but getting rid of rivals was always part of the entrepreneur’s toolbox. The fine talk of less able businesses fertilizing their successors didn’t disguise the fact that Schumpeter was merely peddling social Darwinism dressed up as economic law — yet another instance of trahison des clercs.

We owe this week’s expression to Will from Paris, a first-rate student of the language and a damn fine host to boot. He says, based on recent dealings with the corporate set, that this word will soon take over the world, and Lex Maniac wants nothing more than to get in on the rez-de-chaussée. Merci!

January 28, 2020: An obituary of consultant and professor Clayton Christensen in today’s newspaper reveals that he introduced “disruptive” into businese starting in the mid-1990’s. His name did not come up in my sketchy research, but I’m perfectly willing to acknowledge his role in popularizing, if not inventing, the new expression.

real time

(1970’s | computerese | “clock time”)

Another departure from my chronological standards, “real time” was well established by 1980, though mainly in technical contexts. The expression has a few slightly different meanings that pretty much come down to simultaneity — one system changes more or less instantly as a parallel system does, generally as the result of new information. The other notion it conveys is time neither expanded (as in slow-motion replay) nor compressed (as in time-lapse photography). “Real time” demands strict observance of the clock, giving it still greater power to circumscribe our every thought and sensation.

As the expression has become more colloquial, it has leaned more on a preposition: “In real time” corresponds to “live” or “as it unfolds,” which seems like a perfectly natural development; sometimes it means no more than “up to the minute” or “at this moment.” The expression retains a strong technical bias, but it has been available to arts writers for at least thirty years. The concept is easily grasped and we all labor under the computer’s yoke, so it has become common property; most of us are capable of using the phrase in ordinary conversation, without quotation marks. It’s also available as an adjective. Despite a superficial resemblance, “real time” probably has nothing to do with the older “have a real time of it” — a rough time — which is passing from the scene.

Improving communication speed has been a primary technical goal for many centuries now. The days are over when Cecil Rhodes (according to Mark Twain in “Following the Equator”) could land in Sydney and make a fortune because he caught and gutted a shark that, thousands of miles away, had eaten a man who happened to be carrying a newspaper with significant financial news — news much more recent than the “latest,” which came by steamship from England and was a month or two old. Those days ended with the invention of the telephone, the first long-distance real-time communications device. (It took several decades before intercontinental telephone calls became feasible, of course.) A hundred years later, in the 1970’s and ’80’s, a lot of money and effort were still being spent to improve data transmission speed and the ability of various kinds of software to incorporate fresh observations or calculations quickly and accurately. Data velocity has not decreased in the years since, and the packages have grown enormous. Files that would have taken days to send thirty years ago, if they could be sent at all without overwhelming the network, now arrive in seconds. The combination of increasing speed and vast volume has made possible dizzying advances in a range of fields, not to mention terrifying information overload.
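To put rough numbers on the claim that files which once took days now arrive in seconds, here is a short Python calculation; the file size and the two transmission rates are my own illustrative assumptions, not figures from the post or any source.

```python
# Rough illustration of how transmission time for the same file has collapsed.
# The file size and both rates are assumptions chosen for illustration only.

FILE_SIZE_BITS = 1_000_000_000 * 8  # a 1 GB file, expressed in bits

rates_bits_per_second = {
    "14.4 kbit/s modem (early 1990s)": 14_400,
    "1 Gbit/s connection (today)": 1_000_000_000,
}

for label, rate in rates_bits_per_second.items():
    seconds = FILE_SIZE_BITS / rate
    days = seconds / 86_400
    print(f"{label}: about {seconds:,.0f} seconds (~{days:.1f} days)")
```

On those assumptions the modem needs roughly six and a half days and the gigabit line about eight seconds, which is the scale of change the paragraph describes.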

The changes real time hath wrought — in banking, medicine, journalism, and on and on — are too numerous and well-known to list. We may think of it mainly in economic terms, counting the ways faster movement of bytes and more and more seamless coordination between networks and devices have enabled us to make money. But there are other forces at work. One is simply the drive to innovate and improve, so fundamental to technological advance. The other is its complement, greed for novelty, not necessarily caused by cupidity, which creates the cheering section for the engineers and programmers who find ways to make it all work faster and better. The early adopters, in turn, make it financially possible to maintain the techies by enabling the middlemen to make a profit off their work, and we’re back to money.

If my count is correct, this is the 400th expression Lex Maniac has written about at greater or lesser length. My first association is with the Four Hundred of nineteenth-century New York society, perhaps not the most fortunate. Second is auto races, also a little out of place. “Into the valley of death”? Doesn’t sound right, either. An inauspicious milestone.

standalone

(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)

The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.

“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.

The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.
