Artisanal Crap
As generative AI makes first-pass creation faster and easier, an unintended consequence is that humans may become less able to make great things.
Before we get to today's main topic, some miscellaneous goodies…
Thinking about thinking: Don’t miss this terrific New Yorker piece by Joshua Rothman about the different styles of thinking. But be warned: I’ll probably buy at least two books after reading this article, so guard your credit cards before reading. (H/T Tam McDonald.)
Dr. Dre vs. Marjorie Taylor Greene: check out this Twitter thread by Brian Tyler Cohen about how Dr. Dre pushed back on the MAGA congresswoman using his music without permission—an aria of snark.
Movie Night: We watched Glass Onion: A Knives Out Mystery on Netflix last night and loved it. One important note: you don’t need to have seen the first one to see this. It’s not a sequel; it’s just another wonderful mystery story featuring the same detective. Some of the cameos are great.
What’s with all the hate on Mindy Kaling’s HBO Max Velma series? I had no plan to watch it before I accidentally fell into a toxic cesspool on Twitter (it was so bad I won’t even link to it), but now it’s a must-see stream.
Please follow me on Post and/or on LinkedIn for between-issue insights and updates. (And join Post already!)
On to our top story...
Artisanal Crap
Let me start by stipulating that generative AI (ChatGPT, DALL-E) will change how we do what we do, taking on much of the heavy lifting of human endeavor. This will be true whether it’s creating a PowerPoint presentation, finding an image to illustrate your point, writing the first draft of a memo, coming up with an initial radiological diagnosis, designing a new building, or just about anything else we can imagine that isn’t purely hands-on (like scrambling eggs). (Dig into the last few issues for details.)
But where does this leave craft?
Any human endeavor starts with innocent ignorance of your quality, moves through grim “oh dear lord no” self-consciousness about the dispiritingly low quality of what you’re making, settles into journeyman’s “ah, I’m starting to get it now,” and then, if you’re lucky, arrives at mastery. This isn’t just true of art: it’s true of anything humans do. You don’t get to make the good stuff without making a whole lot of crap along the way.
This is Adam Grant’s point in Originals: How Non-Conformists Move the World:
To generate a handful of masterworks, Mozart composed more than 600 pieces before his death at thirty-five, Beethoven produced 650 in his lifetime, and Bach wrote over a thousand. In a study of over 15,000 classical music compositions, the more pieces a composer produced in a given five-year window, the greater the spike in the odds of a hit. (36)
Sturgeon’s Law
Back in 1958, the great science fiction novelist Theodore Sturgeon—who was tired of people asking why so much sci-fi was so bad—replied that “90% of everything is crap.” We now call this Sturgeon’s Law.
Here are two Brad Berens corollaries to Sturgeon’s Law:
#1: of the 10% that isn’t crap, only 1% of that (i.e., 0.1% of the original) is deathless and worth the bother… and that may be a wild overestimation.
#2: we spend 90% of our lives chasing that 0.1%, which ain’t fair.
But we need that crap! You don’t get to the great stuff without the crap. There are no short cuts, at least not for humans.
Malcolm Gladwell popularized this notion in Outliers: The Story of Success, where he articulated his controversial 10,000-hour rule: it takes 10,000 hours of practice before you get anywhere close to expertise. In the same spirit is the cliché that the first million words a writer generates should go directly into the trash can.
Sure, some people are only ever capable of producing crap, but that’s not what I’m talking about. I’m talking about the purpose of human-created crap—what I think of as artisanal crap—where the goal isn’t the crappy result but the skills that the artisan earns through making crap. This is the craft of crap.
Do you think I could have created a pearl like “this is the craft of crap” when I first started writing? That took decades of practice!
Learning any craft requires accumulating tacit knowledge: the stuff you know but have trouble explaining how you know it, like how you recognize a friend’s face when you unexpectedly bump into her on the street.* In The Craftsman, sociologist Richard Sennett describes the tacit knowledge learned in a workshop as knowledge “unspoken and uncodified in words, that occurred there and became a matter of habit, the thousand little everyday moves that add up in sum to a practice” (77).
Tacit knowledge is impossible to describe. It’s hard to get AIs to take it into account because algorithms are constrained by what is describable. You can’t compute what you can’t define.
Today, most of the stuff coming out of ChatGPT is at about the level of what a precocious 11th grader can make. That’s an amazing technical achievement, and the quality will only get better, which means the allure of letting the AI do the heavy lifting will only become harder to resist.
But we have to resist because we humans need to make the crap before we can make the good stuff.
It’s hard to choose crap on purpose.
The second-order effect of letting AI do the heavy lifting of creation (artistic or otherwise) is that we humans get less experience creating, embed less tacit knowledge, and gain less expertise.
Nothing about human nature has evolved to make it easy to choose the hard way.
Back in 2019, I coined the phrase B.L.U.E. for “Bored, Lonely, Uncomfortable… Ever.” We humans evolved in environments of caloric scarcity, so we don’t naturally turn down food… we have to fight our instincts in order not to get fat. Likewise, we don’t naturally seek boredom, loneliness, or discomfort. With smartphones in our pockets, we never have to experience those things, but those are where insights, eureka moments, and humanity’s greatest achievements come from.
With generative AI, the same is now true at work.
For example, let’s go back to PowerPoint presentations. On Friday, Fast Company published an article about Beautiful.ai’s DesignerBot, which can create a deck in seconds after a simple prompt like, “Design a pitch deck for a mobile app that sells concert tickets.”
That sounds harmless, even helpful, but the goal of a PowerPoint deck is to enable a presentation: it’s a tool for a speaker, not the end result.
I give a lot of presentations, and when I use slides, the iterative, ruminative process of creating them is how I figure out the story I’m telling. Until I start braiding words and pictures together, I don’t know what I’m talking about. Sennett calls this “the intimate connection between hand and head” because “every good craftsman conducts a dialogue between concrete practices and thinking” (9).
All of that disappears if an AI generates a slide deck in seconds. Regardless of how good the prompt was, the speaker becomes the servant of the deck rather than the other way around.
Even if the AI creates the best PowerPoint deck in the history of such decks (which is like talking about the fastest sloth or the heaviest butterfly), that doesn’t mean that the presentation a human gives using that deck will be good.
For many years, I’ve said that everything that can be digitized will be digitized, and therefore we all need to focus our attention on the stuff that can’t be digitized because that’s where all the value will come from.
With generative AI, what I’m beginning to realize is that the only thing that can’t be digitized is us.
Thanks for reading. See you next Sunday.
* Michael Polanyi’s The Tacit Dimension (1966) is a great early articulation of tacit knowledge.
Jaron Lanier once said that what he didn’t like about Google was that it requires us to alter ourselves unnaturally in order to get out of it what we want: we “make ourselves stupid so the machine looks smart.” Generative AI asks something similar of us, at least for now. That said, as one contributor to the interplay of variables in a multilectic, the output of generative AI can play a role; its value in producing basically cogent narrative search results, for instance, is obvious. But what it’s missing is that undefinable quality that long practice at making crap produces. Another writer said it lacked “élan,” a good word that isn’t used often. Will generative AI manifest élan with time? Unlikely, because the machine doesn’t seek pride or satisfaction or glory or praise or any sense of accomplishment, all of which contribute to a creator’s understanding of the quality of their output and to the creator’s vision of their goal. The machine just is. It’s like those computers that win at chess: to win a game, you have to know you’re playing a game. The machine doesn’t, so we make ourselves think it’s playing a game so that we can then say it won. We make ourselves less so that the machine looks like it’s more.