Monday, August 18, 2025

Falling Behind.


A dear friend, an ex-partner, just sent me the Ogilvy-derived drivel above. 

There are a lot of problems with the message.

One: Why would anyone take counsel on business, on falling behind, on "ending," from an agency that's been halving itself in revenue every year for the last five years? That's like posting stock picks from the homeless shelter. Or diet advice from donald trump. Or abstinence advice from elon murk. Failing companies should not be issuing success dicta. They should be fixing the conditions that led to their decades-long free-fall.

Two: Agencies, not so very long ago, issued similar statements about the efficacy of open-plan offices. About 99 percent of people now agree those offices were an efficiency disaster, motivated not by better "communication and collaboration" but solely by the chance to cut back on rent.



You're taking advice from this company?


Three: Agencies, not so very long ago, also told us about the wisdom of divesting themselves of media. That, too, has proved to be a disaster.

Four: Agencies, as we speak, have been selling the flavor of the month for decades. They regard fleeting fads, like Google+, NFTs, and interactivity, as seminal movements. Agencies always over-trumpeted such bushwa simply because they're terrified of seeming like they're behind.

There's no accountability for false predictions. But people are fired when a phony trend is missed.

The worst takeaway from the above is that it assumes, with the arrogance of typical corporate leadership, that there's only one manner of doing something, and it is our manner. If you're not doing things our way, you're shit-out-of-luck and therefore stupid.

As for artificial intelligence, it makes a huge and heinous assumption: that all intelligence comes from what's come before and works by using precedent and past examples.

In "Primal Intelligence: You Are Smarter Than You Know," author Angus Fletcher argues against the current "AI is Everything" mania. The book was reviewed in the weekend Wall Street Journal, and the subhead of the review speaks volumes.

 


A not-atypical hierarchy of learning is DIKW. D being the lowest. W being the highest.

It stands for Data. Information. Knowledge. Wisdom.


Data, sorry person who said the quotation at the start of this post, is a commodity. Working that data into something that provides wisdom is rare. And the province of a few practitioners of the advertising arts. (Having been fired, such practitioners are usually self-employed.)


As Brandy Schillace's review points out:


"Data, by its definition, is information that is known. It already exists. You can mix and recycle to make new tweaks, but it cannot lead to innovation—and it can’t tell you what to do when the future is unknown.

“To handle the unstable dark of worldly existence,” Mr. Fletcher writes, “our brain had to develop mechanisms for acting smart with little, even no, information.” The brain, we are told, has nonlogical intelligence, four “primal” powers: intuition, for perceiving the world’s hidden rules; imagination, which makes the future; emotion, which aids in personal growth; and common sense, which acts wisely in uncertainty. This list may seem unimpressively simple, but when we look at our modern system of education, we can see that they have been relegated to the fringes—often to humanities programs that are underfunded or cut altogether. In their place, schools drill students to think like computers at the expense of the “practical smarts” that made us human in the first place."

Whenever I see a statement like the Ogildrivel above, on AI or nearly anything else, I go through some steps:

1. What's the predictive track record of the speaker?
2. Who's making money on the prediction?
3. Do I consider the speaker intelligent and successful?

I have a feeling that about 20 trillionaires are behind the AI onslaught. Thousands more people are acting as their willing executioners. They're tulip salesmen in 17th-century Holland. They're bridge salesmen in Brooklyn. Crypto salesmen in the trumphouse. When the trillionaires get their AI sold, they'll be quadrillionaires, and then they'll work on the next thing after that, eventually to become undecillionaires.

Despite all that money, AI doesn't have:

1. Intuition.
2. Imagination.
3. Emotion. And
4. Common sense.

That's why we've seen “college students doing better at standardized tests while having greater difficulty with real-world tasks.”

In advertising we'll see brilliant prompt engineers making millions and insipid work costing brands billions.

In the real world, Fletcher says, "there are many paths to a goal, and those who succeed tap into something beyond black-and-white rule-following. They have creativity and flexibility" and a “natural cleverness that AI can’t replicate.”

I'll end with this: "AI can steal datasets from stories and recycle them into plots but the computer is still not reading the story. We are. We are the ones taking the word scrambles and lacing meaning into them."

I suppose the same people who told us how great open-plan offices were are now selling us their next gallon of snake oil, this batch made with AI-derived serpent juice.

Ask for proof.

