Volume 175: Culture & The Algorithm.


tl;dr: Is AI an unstoppable juggernaut of cultural homogenization?

Way back when, my former employer launched the logo for the 2012 Olympics in London to much furor. At the time, Jon Stewart described it as a “stop sign going down on a vending machine,” a clip we then used in every pitch for years. (Sorry, I can’t find it now.)

Afterward, I remember talking to my friend Patrick, who’d been the ECD on the job, about why it looked the way it looked. His response was enlightening. He believed it represented one of the last opportunities to “open the aperture of acceptable taste.” In his view, a steady drift toward cultural homogenization desperately needed an injection of difference, and what better vehicle than the Olympic identity, where years of tame, riskless design had already lowered expectations to the trite and predictable.

Fast forward a decade to the Brand New conference in Austin a couple of years ago, and I remember being struck by how many speakers were seemingly anti-Patrick, describing their design practice and work in strikingly homogeneous terms. To summarize, they were all, to some degree or another, in the business of “designing for culture.”

I took note and, as the years passed, noticed this same theme popping up all over the creative services landscape - design agencies claiming it, advertising agencies claiming it, social agencies claiming it…you get the gist.

I even lost a pitch to an unknown competitor during this period because it had “a proprietary methodology for connecting the brand to the culture,” which was odd considering the client was in the enterprise software business ¯\_(ツ)_/¯.

You’d be forgiven for thinking that if the crème de la crème of the creative services world were so laser-focused on “designing for culture,” we’d be living in a vibrant and explosively creative moment. Yet we are not. Instead, our cultural landscape has become just as homogenized as the agencies claiming the phrase as their own.

Why?

Why are we seeing a global homogenizing of culture across every dimension that counts? Why has Hollywood become so creatively bankrupt that nobody bothers watching that oh-so-predictable 52nd sequel to a superhero movie? Why have pop songs become so objectively similar? Why is everyone following the same formula for their posts on Instagram and humble-brag stories 🤮 on LinkedIn? Why are the outrage merchants masquerading as news organizations so predictably apocalyptic? Why have ads become so unmemorable? And why is the aperture for creativity in design at its narrowest in a century, as branding continues to plumb the depths of monotony and digital design calcifies around what Dieter Rams was doing in 1962 because Jony Ive thought it was cool in 1998?

The answer is simple. We’re not designing for culture at all; we’re not even designing for people. We’re designing for the machines that act as our contemporary cultural gatekeepers. We’re designing for black-box algorithms built by massive, monopolistic tech firms whose sole goal is to keep us on their platforms for as long as possible, repeatedly doom-scrolling twenty-second dopamine hits while watching the ads they monetize go round and round.

In sum, culture is homogenizing because it’s economically advantageous to a select few, very large corporations for it to do so. The more culture homogenizes, the more predictable we become, and the more predictable we become, the easier it is to monetize our attention.

But is this really what we want?

There’s much bullshit spoken about how quickly culture is fragmenting and how much faster culture and the overall economy are moving, and yet there’s scant evidence to support this view. The economy changed faster during Victorian-era industrialization than it does today, and if we zoom out beyond the ad-agency trend decks for a second, there’s plenty of objective evidence that the cultural past was richer and more diverse.

Don’t get me wrong; this isn’t an exercise in self-indulgent shitstalgia. Even though I’m a child of the ‘90s, I’m not saying that everything was better “in my day.” It’s just that when you take a cold, hard look at the cultural world around us, it’s objectively more monotonous than it was, even while we like to pretend that the opposite has happened.

This matters because the age of generative AI, or, as Cory Doctorow calls it, “planet-destroying autocomplete,” is now upon us, and unless we act proactively, things will almost certainly get worse before they get better.

To borrow a bad military analogy, the algorithmic mediation of culture carpet-bombed commercial creativity into submission in preparation for a full-scale invasion, with Generative AI being the invading army that’s now entering the field to little or no opposition.

It’s fascinating to me that when people discuss the risks of AI, they tend to focus on two extremes while avoiding the obvious elephant in the room. On the one hand, we have the tech-geek intelligentsia exercising their outsize egos around how dangerous AI could be - “Look at how powerful and amazing we are; we’ve created something that could destroy humanity!” On the other, we have the more pedestrian, yet perhaps more destructive, concern that GenAI is about to wipe out huge chunks of the employed population.

Yet, the true concern should be that because AI is so unbelievably expensive to develop, the winners in this arms race can be predicted as we speak. Unlike pretty much every historically transformative technology that came before, we’re not about to witness a swathe of creative destruction and disruption where new winners arise from the ashes of the old. Nope, the really big AI winners will be the same few tech firms that already homogenized culture for their own gain. It’s just that this time, they’re going to finish the job in the interest of feeding an insatiable appetite for scale.

Don’t let names like OpenAI or Anthropic fool you. These aren’t independent corporations in any meaningful sense. Instead, they’re funded to the tune of billions of dollars, and thus controlled, by the largest tech corporations on earth, in a thinly veiled attempt to deflect antitrust scrutiny.

So what to do? Ignore AI completely and do everything by hand?

No, of course not. The tsunami is already barreling toward the shore, so the only route to success is to surf that wave rather than let it pound you into the sand.

But even if we can’t avoid the wave, sunlight remains the best disinfectant. In other words, because we can observe the entropic effects of algorithms on our cultural systems, and because we’re already seeing the culturally enshittifying impact of early AI use cases, we can seek out ways to row in the other direction.

This is because the one thing that isn’t being mentioned amid all the drooling over ChatGPT 4o and the myriad things Google just announced is the power of human agency.

We don’t have to fall down the hole of cultural homogenization if we choose not to. In fact, we should do everything in our power to avoid it, for who could ever compete against a machine that never sleeps at something it’s proving to be exceptional at?

Instead, we must place our faith in the human condition and remind ourselves that no matter what the algorithms and big-data manipulations may say, people don’t actually like cultural homogenization all that much. The fact that people didn’t bother going to that 52nd superhero sequel, and that they’re historically unhappy, only proves the point.

In 1903, George Bernard Shaw wrote the following in Man and Superman (apologies for the sexism; it was a different time):

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

If we are to successfully navigate the shift from the algorithmic mediation of culture toward the Generative AI manufacture of culture, we will need to become a lot more unreasonable.

Those of us in the creative industries will need the bravery to buck the algorithms, use AI tools to do things they weren’t intended to do, get subjectively closer to real human beings and ignore the ‘objective’ data from time to time, and see that in a future where everything has been relentlessly homogenized, true value will lie in the constant pursuit of excellence in the development of the interesting, the novel, the new, and the different.

So, as you play with these new tools, as they help you with ideas and execution, and as they become an ever more solid part of the firmament, never forget that it is our hands on their controls, not the other way round. We can mess with them as much as we like, we can break what they were intended to do, we can find new and original use cases, we can combine ideas in new and interesting ways, and we can ensure that whatever the future holds, the power of human agency to drive creativity and cultural interest stays to the fore.

And, above all else, remember that it is our unique job as human beings to be unreasonable in the face of the machines as they relentlessly homogenize our cultural lives for tech founders’ gain.
