The Empire of Light, René Magritte, 1954

Every two weeks, our advertising agency convenes its regular AI committee. We spend about an hour listening to our very diligent and well-informed committee chair run through important new developments in the world of AI and how they might pertain to our day-to-day work: photo-to-video animation, text-to-video animation, bountiful new libraries of AI stock imagery, machine-generated podcasts (irritating hosts and all), and much more.
Often, I will steer these meetings off-course by pointing out that AI sucks, which is a valid thing to do because AI sucks. I'm not talking about the kind used in medical research or to talk to whales or whatever else the whitecoats with fancy degrees use it for, which is probably, for the most part, fine. I'm talking about creative work. Herein, I will outline some of the levels on which "creative" AI sucks.

Level 1: It's Not Interesting

Many people seem to believe that 1) AI can be used as "a tool" to produce interesting work, or 2) eventually, it will be able to produce interesting work autonomously. Creative work has two characteristics: it demonstrates intention (sometimes the intention is to demonstrate a conspicuous absence of intention) and, through that intention, attempts to differentiate itself from the bulky mass of other work. By definition, AI cannot do these things on its own, but more importantly, it's not designed to help you do them either. The AI we use today is an averaging machine. It swallows vast swathes of data to churn out aggregates of those inputs. At its best, these are high-fidelity averages — exquisite replicas. That's why we evaluate generative AI today not on how interesting the outputs are but on how accurately they resemble work done by a human being with a pulse. This is the whole point of something like a Turing test, which is now being bandied about as proof that AI is creative. It proves no such thing. Creative work is, at its foundation, a human thing. This is built into the way we interact with it. That's why "robotic" or "this looks/sounds like AI" tends not to be something you want to hear said about your work. AI is designed to take you out of the creative process — less a tool than a collaborator to which you outsource your creative decisions. Interesting work produced with AI can only result from first vetting, then circumventing, overriding, or approving the choices AI has made on our behalf.
Level 2: It's Displacing People Who Are Interesting

Producing uninteresting or derivative work is not the biggest problem (you can skip to Level 4 for that). People do it all the time, but they (we) are at least trying not to do it. But people (we, us) are slow and expensive, and even after a lot of time there's no guarantee they (we) are going to reliably produce interesting stuff. Not to mention the intangibles: they might be difficult to work with, or use company time to write lengthy and probably pointless blogs. This is where you begin to see the business case for AI. Instead of hiring someone new or contracting a freelancer, management might keep on a handful of trusted idea people to sift through the slop that comes out of the machine. The ideas will suffer, but perhaps not enough (or not quickly enough) for most people to notice. More importantly, though, the pipeline of aspiring creatives who want to set out on the long, arduous path of getting better at producing interesting work begins to buckle and break. Creative work was precarious to begin with. It's going to get less viable, and sustained interest — the kind required to develop the skills and the tastes and the motivation to do the job well — will wane.

Level 3: It's Degrading Our Sense of What's Interesting

AI gets a bad rap because its systems are built on theft. But stealing does not disqualify AI as creative. You can elevate ripping someone off to an art form; AI can't. With AI, even theft is boring. Given that creative AI exists mostly as a shortcut, we can safely assume that people who use AI are going to become even more hands-off as AI gets "better." As this filter disappears, more AI-generated content will proliferate. This matters because good taste is as much about your diet as it is about your palette (so to speak). And when there's more junk food out there, appetites begin to change. This feedback loop will also very likely affect the capabilities of future AI models.
In 2023, one researcher raised the alarm about what has become known as Habsburg AI, a phenomenon wherein AI-generated outputs begin to form a greater portion of AI training data. It is named after the European royal family that was famously deformed and driven insane by inbreeding.

Level 4: It's Making Me Less Interesting

I would be arguing in bad faith if I had not used AI myself in an honest, open-minded attempt to keep abreast of the latest trends or whatever. I've found it useful as an early-stage search engine substitute and for a handful of boilerplate copywriting tasks. Those results typically require enough follow-up and editing that they aren't much of a timesaver, nor do I believe the final product was meaningfully improved by these new inputs. But in a different world — one where I was perhaps less motivated to find fault with this technology and cared less about doing my job well — I could imagine myself being lulled into complacency by some of these results. I could see my faculties eroding as I learned to accept all the results that are "good enough," kind of the way my forebears forgot how to forage and read paper maps. Will I change my mind about this one day? Perhaps. I am aware that I sound like your grandfather. Some of my colleagues have found AI useful for generating animations and stylized stock imagery, and for certain video editing applications. I am sure these applications will become more capable of generating better-quality ripoffs of human work. Perhaps there will be new applications and novel models that will make me wrong about all this. DJs make music. Illustrators don't need to know how to draw. Despite all I've written, perhaps there will be more thoughtful creatives who use AI in ways that force similar changes. Perhaps. But that doesn't seem to be the direction in which we're headed. Whatever your definition of creative work, there is almost always an effort to communicate. AI undermines that.
As Ted Chiang pretty convincingly argued, even clichéd, unclear, or bad writing backed by a real effort to communicate contains something that is missing from coherent machine-generated writing. I think that's true of creative work generally. Every day I see more writing that no one has bothered to write; images that the "creators" have barely looked at; content that exists for no discernible reason except to farm clicks, to mislead, to automate a creative process that can't be automated. AI is already an unstoppable force in our line of work. But frankly, the time I've spent enumerating the reasons AI sucks could have been spent creating stuff that doesn't. So, that's enough. Back to work.