Some Things Are Too Important to Let AI Do Them
The Getting Things Done mindset continues in calls to have AI automate everything important in your life.
About the time I was in law school, David Allen’s Getting Things Done was all the rage. “GTD” was the productivity obsession, with blogs like Merlin Mann’s 43 Folders and software tools like kinklessGTD, which became, to much excitement, the foundation of OmniFocus for the Mac. Allen offered a hyper-complex and, in retrospect, over-engineered solution to, well, getting things done. And 2008 and 2009 were the years when everyone who wanted to get things done—and liked futzing with a hyper-complex and over-engineered system for it—was getting them done with GTD. (Here’s a pretty good short summary of the whole system if you’re not familiar.)
I mention this because an odd aspect of the GTD pitch, likely an artifact of David Allen building GTD first as something to teach his business executive coaching clients before he turned it into a bestselling book and became a guru to productivity nerds everywhere, was how it reduced everything one might do in life to a “project.” Which, I’ll admit, makes a certain sense. Everything you might do in life requires a set of distinct steps (“actions,” in GTD terms, the most important of which, at any given time, is the “next action”), and Allen defined a collection of steps related to a common end as a “project.” It takes several actions to finish that TPS report, and it takes several other actions to learn the violin, and so the TPS report and the violin are projects to be handled through the application of the same GTD processes. Key to this was that the only actions you needed to think about at any moment were those “next actions,” the very next thing you needed to do in all the projects on your plate. Anything beyond those next actions was definitionally out of mind—until, by checking off whatever was ahead of it, it became the next action.
David Allen talked about everything within the singular context of the “project.” And this was a big part of GTD’s appeal: If you got the right system set up, the right flow of capture, organization, and execution, you could turn your entire life into decontextualized productivity microdoses. Projects became an indistinguishable mass, because once you’d defined them up front—thought through your project prompt, so to speak—everything that happened from there was just ticking decontextualized boxes.
(It’s amusing, as an aside, just how much Allen viewed life through the narrow context of the needs of wealthy business executives, because his example projects were always either business executive tasks or something like learning the violin, taking a vacation to France, buying a second home, or getting your oldest daughter that polo scholarship. This is similar to how, if you watch Apple show off the latest iOS or macOS features in one of their annual keynotes, everything neat it will do for you is the kind of thing that assumes “you” are a wealthy resident of the Bay Area who is very busy doing high-powered business stuff or else needs to get a reservation at a fancy restaurant or plan an afternoon of sea kayaking.)
This is what came to mind when I saw people on Bluesky making fun of a screenshotted tweet from Derek Thompson. Thompson, picking up an idea from Stratechery’s Ben Thompson, raised the prospect of an internet future where AI agents can autonomously carry out not just discrete tasks you give them but, in GTD terms, whole projects. Which, sure, if you trust them to be trustworthy and grown-up about it, and so not to put $2,600 worth of Spongebob popsicles on your credit card, could be pretty helpful.
Except the example is planning your kid’s birthday party.

The trouble with telling ChatGPT to plan your kid’s birthday party is this: while everything you might do that takes multiple steps is, in the GTD sense, a “project,” and while you are productive by getting through projects, not everything that’s a project in the GTD sense should be thought of in the context of productivity.
To see what I mean, notice that Thompson’s description of the outcome (ChatGPT, overnight, “negotiates spots at two bowling alleys...”) includes elements not in the prompt (Thompson never told it his five-year-old wanted to go bowling), which means they were features of the party the AI chose, even though they are the kinds of features a parent of a five-year-old ought to choose himself, with that five-year-old in mind. Put another way, to prompt a generative AI bot to figure out how to celebrate your child’s fifth birthday is to take the wrong ethical perspective on what it means to have a child whose birthday you are helping him or her celebrate by putting together something fun and memorable.
We can tell a similar story about the trip to England for a spouse, which again leaves it to the AI to decide what that spouse would enjoy—beyond simply “going to England in July.” These aren’t projects that need to get done however they get done. If they were, asking an AI to tackle them would increase productivity because you can get more of them done, or get them done faster. But planning a birthday party or figuring out what your spouse would like in England are what it means to be a father or a husband. It’s similar to the error of the fancy executive who has his administrative assistant pick out a gift for his friend, or the busy businessman who reads summaries of Dostoyevsky’s works because he’d like to enjoy literature, but doesn’t have time for the whole of The Brothers Karamazov. The experience, the act of carrying out these tasks or making these decisions with attention and care, is what they’re all about. (If you’d like to put in the time to experience The Brothers Karamazov, the David McDuff translation is glorious, and I particularly recommend the exceptional audiobook narrated by Bridgerton’s Luke Thompson.)
GTD was designed for a certain sort of busy person who has a lot to get done and needs to get through it while not letting any of it be forgotten. It does this by conceptualizing a profession as a long list of boxes to check, with each set of boxes checked (a project) potentially opening the door to other, previously inaccessible, sets of boxes. This can be quite helpful when you’re overwhelmed with tasks, and the process by which you turn a nebulous “I want to accomplish X” into a series of action steps forces you to think clearly about what it is, exactly, that you hope to accomplish, and what you need to achieve it. Success in GTD comes in having such a finely tuned system that running through those boxes becomes close to effortless. Just do this. Then this. Then this. And then you’ve gotten to Done.
But GTD, as a mindset, went wrong in two ways; they’re ways AI enthusiasts can go wrong as well, and they both have to do with bringing the wrong perspective to one’s activities. The first is that GTD, while good for many projects, falls apart for discursive creative pursuits, such as writing a book or creating a work of art. When you write a book, you don’t sit down at the start and map out a list of “next actions,” each being a discrete act you can reasonably perform in a single, focused burst. It’s one thing, if you need a new deck added to your house, to list as the next action, “Call the first contractor to get an estimate.” It’s quite another, when writing a book, to list as the next action, “Write chapter two.” That’s because “chapter two” isn’t a focused action, it’s an unspecified mass of them, each pointing you in new and unanticipated directions: “That gives me an idea for this” or “Oh, I need to go back and do some more research on that.” You can have a plan, of course, but to think you can map out, in order, every action it will take to get through that plan is to, well, misunderstand what it means to write a book.
GTD aficionados came up with ways to route around these difficulties, but they never quite meshed with the clockwork nature of David Allen’s system, and so creative pursuits never quite fit within GTD. Something similar happens with AI. And that gets to the second way this mindset goes wrong. Yes, you can use generative models as part of the creative process, but the creative process is a series of decisions, often intentional but sometimes entirely unexpected, that you have to direct or assess yourself. When you ask ChatGPT to write a novel for you, you’ve produced a novel in the sense of having something that could pass muster as one, but you haven’t written a novel. (The line between “using AI in the creative process” and “you missed the point of writing a novel” is fuzzy, varies, and is bound up in inevitably evolving cultural attitudes, but it needn’t be bright for us to work with this distinction.) The perspective of the GTD mindset focuses too much on the “Done” and not enough on the experience of “Getting.” To bring this back to the birthday planning: your kid’s birthday party isn’t the sort of thing you should hand off all the decisions about, whether to an AI or to your secretary. Because, while the point is giving your kid a great time, it’s also that you gave it to them.
Another way to express this error of perspective is that it conflates “labor” with “work” by contextualizing all of the latter as an instance of the former. The goal of GTD (and AI when employed as a continuation of the GTD mindset) is to streamline, and so make more efficient, labor. But not all work is labor. Labor is something you’d rather not do, and so making it more efficient (or finding ways to route around the need for it entirely) is good. That’s a big part of how the world gets better when it becomes richer and more technologically advanced: The amount of labor any one of us needs to do to get the stuff we want declines. We get more outputs with fewer inputs. That’s wealth. If GTD or AI can reduce the amount of labor you need to carry out, or it can reduce the amount of time that labor takes, or it can at the very least make the least enjoyable parts of that labor easier, then it is to your benefit.
But there are plenty of activities we undertake that, while definitely work, are not labor. If you’re an avid gardener, you might spend your weekend working in the garden, and working hard. You’ll end the day exhausted and sweaty and ready for rest. But, when asked how you spent your weekend, you’ll say, “I worked in the garden.” You probably won’t say, “I labored in the garden.” And if someone offered you a service that would let you have the same beautiful garden without doing any of the work, you’d say that person is missing the point. It’s the gardening that matters, and the reason the resulting garden is so personally rewarding is because you did it. If GTD or AI can mean you get through your actual labor more quickly, you won’t use that time to just sit around. You’ll spend more time in your garden. The goal isn’t to get more outputs with fewer inputs, because, when it comes to your passion for gardening, the inputs are the whole reason for doing it.
We want tools and methods, whether productivity systems or artificial intelligence, that reduce the need for labor, just as we’ve benefited from automation and efficiency gains for as long as those have been making the world richer. What we don’t want is to let enthusiasm for labor-saving inventions lead us to avoid the work constitutive of what’s good and meaningful in life and what’s good and meaningful in the lives of those we share it with. ChatGPT shouldn’t plan your kid’s birthday party, because what it means to be a parent is to do things for your kid, and to show that you understand your kid well enough to make the decisions that will bring them the most joy. ChatGPT shouldn’t write your novel for you, because what it means to write a novel is for it to be your novel, not for there merely to be, at the end of some process, a novel with your name on it. The problem with Getting Things Done wasn’t that it failed as a productivity system, but that not everything in life should be thought about through the lens of a productivity system. And the problem with having AI do your work—as opposed to your labor—is that the work is only meaningful if you do it.