What is A Psychoanalytic Vibe Coder?
Software tools don’t just organize tasks—they hold anxiety.
I.
For years, I was a devoted reader of Sspai (少数派), a Chinese media platform dedicated to productivity tools, personal knowledge management, and digital workflows. This was before I entered clinical work, before I began my own analysis, back when I was still searching for something I could only name as “control.”
The community around Sspai cultivated what I now recognize as a form of software consumerism. Every few months, a new app would promise to finally solve the problem of scattered information, fragmented attention, or incomplete tasks. Bullet journals gave way to digital task managers. Note-taking apps proliferated—each claiming to be the one that would stick, the system that would finally bring order. I tried them all. Things, Notion, Obsidian, Roam Research. I read comparison articles. I watched setup tutorials. I refined my workflows.
What I couldn’t see then—what became visible only after I started doing clinical work—was that these tools weren’t solving a problem. They were managing a symptom. And the symptom itself had been manufactured.
Capitalism creates overwhelm. It floods you with information, demands, notifications, deadlines. It accelerates production cycles and compresses timelines until the pace itself becomes unsustainable. And then—this is the second move—it sells you the solution: tools to manage the chaos it created. Task managers to handle the overload. Productivity systems to metabolize the acceleration. Knowledge management apps to contain the information deluge.
These tools do provide containment. They hold anxiety. They offer structure when internal organization feels impossible. In that sense, they are genuinely useful—even necessary. But if we only strengthen this cycle without questioning what generates the need for management in the first place, we risk becoming accomplices to the system we’re trying to survive.
This is where I began to recognize that my interest in software design wasn’t separate from my work as a psychoanalytic scholar. It was the same question, asked in a different register: what does it mean to provide a container? And more critically—what are we containing, and why?
II.
When I look at the most widely used personal management tools—bullet journals, Flomo, Things, Notion—what strikes me now is not their efficiency, but their emotional architecture. These tools are not primarily about task completion. They are about providing an embodied experience of being held.
Consider the bullet journal. Its appeal isn’t just organizational—it’s tactile, rhythmic, bounded. The act of writing by hand, the ritual of daily and weekly spreads, the permission to migrate tasks rather than demand immediate completion. What Ryder Carroll designed, whether intentionally or not, was a container that could hold the anxiety of incompleteness without collapsing into shame.
Or take Flomo, a Chinese note-taking app designed explicitly around the problem of information overwhelm in the digital age. Its creator, whom I once spoke with on a podcast, described the design philosophy as helping people digest and metabolize information rather than simply capture it. The interface imposes constraints: short entries, minimal formatting, no folders. These aren't technical limitations—they're containment strategies. They create friction that protects against the compulsion to endlessly accumulate without processing.
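To make the containment point concrete, here is a minimal sketch, in TypeScript, of what constraint-as-containment might look like in code. This is my own illustration, not Flomo's actual implementation; the length cap and the field names are assumptions.

```typescript
// A hypothetical sketch in the spirit of Flomo's constraints (not its
// actual code): the data model itself resists accumulation without
// processing.

interface Memo {
  text: string;      // short entries only
  tags: string[];    // flat tags instead of folders
  createdAt: Date;
}

const MAX_LENGTH = 500; // assumed limit, for illustration only

function createMemo(text: string, tags: string[] = []): Memo {
  // The constraint is the feature: overly long entries are rejected,
  // nudging the writer to digest before capturing.
  const trimmed = text.trim();
  if (trimmed.length === 0) {
    throw new Error("A memo needs at least a fragment of a thought.");
  }
  if (trimmed.length > MAX_LENGTH) {
    throw new Error(
      `Entries are capped at ${MAX_LENGTH} characters; condense before capturing.`
    );
  }
  return { text: trimmed, tags, createdAt: new Date() };
}
```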
Things, the task manager, carefully balances structure with softness. Tasks don’t scream at you when overdue. The interface uses gentle colors, ample whitespace, and a design language that privileges clarity over urgency. It doesn’t eliminate the pressure of unfinished work, but it doesn’t amplify it either. It holds the tension.
The language around these tools often gestures toward care. “ADHD-friendly design.” “Neurodiverse-friendly interfaces.” “Reducing cognitive load.” These aren’t just marketing terms—they indicate a growing recognition that software operates on affect, not just function. Yet the theoretical framework remains underdeveloped. Why does reducing cognitive load matter? What does it mean to design for neurodiversity? What are we actually trying to hold?
This is where psychoanalysis offers something that existing design philosophies—Slow Design, Emotional Minimalism, user-centered design—have not fully articulated. The concept of containment comes from the British psychoanalyst Wilfred Bion, who described how a caregiver’s capacity to receive, hold, and transform an infant’s overwhelming affects allows the infant to begin developing their own capacity for emotional regulation. Containment isn’t suppression or elimination—it’s the provision of a psychic space where unbearable feelings can be survived long enough to be understood.
Software tools function as containers. They receive what feels unmanageable—tasks, information, decisions—and return it in a form that can be approached rather than fled from. When a tool “reduces cognitive load,” what it actually does is contain anxiety about forgetting, failing, or being overwhelmed. When an interface is “calming,” it’s providing an emotional holding environment that allows thinking to resume.
But here’s the problem: if the primary function of these tools is to contain the anxiety that capitalism manufactures, then improving containment without addressing the source of overwhelm only makes us better at surviving a system that shouldn’t be producing this level of distress in the first place. We become more efficient at managing what shouldn’t need to be managed at this intensity.
This is not an argument against building better tools. It’s an argument for recognizing that design is never politically neutral. Every interface encodes a theory of what users need and why. Most productivity software assumes users need to be faster, more organized, more efficient—better adapted to conditions of relentless acceleration. What if we designed instead for slowness, refusal, and the legitimacy of incompletion?
III.
In early 2025, the term “vibe coding” entered mainstream tech discourse. Coined by Andrej Karpathy and crowned Collins’ Word of the Year for 2025, it refers to a way of working with AI where you describe software in natural language and let the AI write, refine, and debug the code. You specify the vibe—the feel, the purpose, the rough shape—and the AI handles implementation.
The term is half-serious, half-ironic, and entirely revealing. It acknowledges what many developers already knew: that coding with AI isn't just faster technical execution; it's a different relationship to building. You're no longer translating every logical step into syntax. You're articulating intention, and the AI fills in the structure.
When I started using AI-assisted development in early 2026, I experienced this shift viscerally. Projects that would have taken months—if they happened at all—became possible within weeks. This wasn’t because AI eliminated difficulty, but because it eliminated certain kinds of barriers. I didn’t need to be fluent in every framework. I didn’t need to remember every API. I could focus on what I wanted to make rather than how to make it technically possible.
But “vibe coding” as it’s currently discussed remains rooted in the same logic of acceleration and expansion: build faster, ship more, prototype constantly. The productivity discourse hasn’t changed—it’s just been turbocharged.
What if we took the term seriously, but differently? What if “vibe” meant something closer to its original sense—not just mood or aesthetic, but a way of being attuned? A Psychoanalytic Vibe Coder, then, would be someone who codes with a particular attunement: not toward speed or scale, but toward emotional architecture. Not toward eliminating friction, but toward creating the right friction—the kind that protects thinking, that allows refusal, that makes space for what can’t be accelerated.
I am a psychoanalyst in training and a psychoanalytic scholar. I’m also a game designer, an independent developer, and a former productivity tool obsessive. I now sit on the American Psychoanalytic Association’s Artificial Intelligence Council. Last year, my doctoral advisor at the Boston Graduate School of Psychoanalysis launched what I believe was one of the earliest courses on psychoanalysis and AI offered at an analytic institute. I write for the Council’s newsletter, the CAI Report, and I’ve published essays and given talks exploring the intersection of AI and psychoanalytic thought.
But my interest isn’t just theoretical. I’m building software. And not because I think the world needs more apps, but because I think it needs different apps—tools designed from a psychoanalytic understanding of what software actually does when it works. Not productivity, but containment. Not efficiency, but care. Not acceleration, but the capacity to hold complexity without collapsing it prematurely.
This is where AI becomes genuinely interesting—not because it makes building faster, but because it makes building possible for people who couldn’t before. I carry a heavy clinical load. I’m doing a dissertation. Under earlier conditions, the tools I want to make would have remained ideas. AI doesn’t eliminate those constraints, but it lowers the threshold just enough that experimentation becomes viable. It’s a moment of technical democratization, and like all such moments, it opens space for things that don’t fit the dominant logic.
The dominant logic of AI development right now is monetization, scale, and optimization. How can AI help you work faster, earn more, expand output? These aren’t trivial concerns, but they’re not the only concerns. At the individual level, AI can also function as a space for play, for building small tools that serve tiny communities or singular needs, for making things that don’t scale and were never meant to.
There’s a longer history here. I came of age as a developer during earlier waves of technical accessibility—the indie game boom, the rise of accessible frameworks, the spread of open-source tools. I witnessed how technology could function as a medium for personal expression, not just commercial production. That sensibility is part of what I bring to this work. But now, with AI, the possibility feels both expanded and more urgent. Because the mainstream narrative is so aggressively tilted toward extraction and acceleration, the quieter uses—the emotional, the personal, the experimental—risk being drowned out entirely.
IV.
I call this work Gentle System Lab. It’s not a studio or a brand. It’s an experimental practice. Each project is an experiment—a way of testing whether psychoanalytic principles can inform design at a technical level, not just as metaphor.
The lab operates from a baseline philosophy, which I summarize as container-oriented design versus capitalist efficiency logic. Mainstream productivity tools treat time as an infinitely divisible resource and humans as production units to be optimized. They eliminate friction to maximize output. But in doing so, they often strip away the semantic space needed to process anxiety. They make task management seamless—which is another way of saying they make the user disappear.
Container-oriented design starts from a different premise: that tools should not only execute tasks, but also hold the emotional experience of engaging with those tasks. This leads to four principles that guide the lab's work, sketched in code after the list:
1. Containment Space
Tasks are not just “to-dos” but objects to be understood. Design should provide space to hold anxiety about a task before acting on it, allowing intentions to clarify rather than demanding immediate resolution.
2. Design for Failure
Inability to complete is part of life, not system malfunction. Failed or incomplete tasks should not trigger shame. Instead, they can become records of refusal—moments when the user chose not to comply with external demands.
3. Emotional Mapping
Tools should reflect internal states, not just external progress. A task list that feels overwhelming might indicate overload, not poor time management. Design should create feedback loops that make emotional truth visible.
4. Value of Slowness
Intentional friction can protect against autopilot mode. Slowing down isn’t inefficiency—it’s the condition for staying oriented toward what actually matters rather than what feels urgent.
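Here is the sketch promised above: a speculative data model in TypeScript that tries to hold all four principles at once. Every name, state, and threshold is my own assumption, offered as illustration rather than specification.

```typescript
// A speculative task model under container-oriented design. The field
// names and the one-day holding period are illustrative assumptions.

type EmotionalWeight = "light" | "heavy" | "unbearable";

interface ContainedTask {
  title: string;
  // 1. Containment Space: a task can sit in "holding" before any action
  //    is expected of the user.
  status: "holding" | "active" | "done" | "refused";
  // 2. Design for Failure: refusal is a first-class outcome, recorded
  //    without shame rather than flagged as overdue.
  refusalNote?: string;
  // 3. Emotional Mapping: the tool tracks how a task feels, not just
  //    whether it is finished.
  weight: EmotionalWeight;
  // 4. Value of Slowness: a task cannot leave "holding" until a waiting
  //    period has passed, a small intentional friction against autopilot.
  heldSince: Date;
}

const HOLDING_PERIOD_MS = 24 * 60 * 60 * 1000; // one day, an arbitrary choice

function canActivate(task: ContainedTask, now: Date = new Date()): boolean {
  return (
    task.status === "holding" &&
    now.getTime() - task.heldSince.getTime() >= HOLDING_PERIOD_MS
  );
}

function refuse(task: ContainedTask, note: string): ContainedTask {
  // Refusal is recorded as a choice, not logged as a malfunction.
  return { ...task, status: "refused", refusalNote: note };
}
```

The point of the sketch is not the particular fields but the grammar: a state like "refused" and a check like canActivate encode, at the type level, permissions that mainstream task managers never grant.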
These principles draw from psychoanalytic thinking—Bion’s theory of containment, Winnicott’s emphasis on holding environments, the broader recognition that psychological work cannot be accelerated without cost. But they also draw from adjacent design movements. Slow Design, which emerged in the early 2000s, advocates for designing for permanence, for emotional connection, for sensory well-being rather than planned obsolescence. Emotional Minimalism, a recent trend in UX, emphasizes reducing cognitive load and creating interfaces that feel intuitive and reassuring rather than demanding and complex.
What psychoanalysis adds is a more developed theory of why these things matter. It’s not just that slowness feels better—it’s that certain forms of thinking require time to unfold. It’s not just that emotional design is more pleasant—it’s that all design operates on affect, and pretending otherwise only makes those operations unconscious and therefore harder to examine.
One of the lab’s current projects is Palimpsest, a writing application I’m developing for my own use and potentially for others who write in similarly intensive, layered ways. The app isn’t designed to make writing faster. It’s designed to make the space of writing more inhabitable. It provides structures for holding drafts at different stages, for tracking which ideas feel alive and which have gone dormant, for making visible the emotional texture of the writing process—not just the output. It’s an experiment in whether software can function as a transitional object: something that mediates between internal experience and external demands without collapsing into either.
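To give a feel for the direction, here is a hypothetical sketch of how Palimpsest might hold drafts in layers, with aliveness tracked alongside the text. None of this reflects a finished design; the names are placeholders of my own.

```typescript
// A hypothetical core data model for Palimpsest: layers accumulate
// rather than overwrite, and "aliveness" is a visible, revisable state.

type Aliveness = "alive" | "dormant";

interface DraftLayer {
  text: string;
  writtenAt: Date;
  feeling?: string; // free-form note on the emotional texture of the session
}

interface Strand {
  idea: string;
  aliveness: Aliveness;
  layers: DraftLayer[]; // earlier layers are kept, never erased
}

function addLayer(strand: Strand, text: string, feeling?: string): Strand {
  // A new layer settles over the old ones, palimpsest-style: revision
  // accumulates instead of replacing.
  const layer: DraftLayer = { text, writtenAt: new Date(), feeling };
  return { ...strand, aliveness: "alive", layers: [...strand.layers, layer] };
}

function markDormant(strand: Strand): Strand {
  // Dormancy is a state the writer can see and revisit, not a failure.
  return { ...strand, aliveness: "dormant" };
}
```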
I don't know yet if Palimpsest will work. That's the nature of experimental work. But the process of building it—of thinking through what a psychoanalytically informed writing tool might look like—has already clarified something about what I think software could be, if we designed from a different set of assumptions.
V.
There’s a larger context here that I haven’t yet named directly. The psychoanalytic community, by and large, remains cautious about AI. There are concerns—legitimate ones—about commodification, about the reduction of psychoanalytic concepts to algorithmic functions, about whether AI can ever genuinely understand the kinds of emotional nuance that analytic work depends on.
But caution, when it becomes avoidance, turns into a different problem. If psychoanalysts refuse to engage with AI out of fear, we cede the field to those who will engage—and who may have no interest in care, containment, or complexity. We also lose the opportunity to learn what AI reveals about human processes of thinking, relating, and creating meaning.
My position, both within the AI Council and in my own work, is that psychoanalysis needs to practice with AI—not resist it out of fear, and not embrace it uncritically, but engage with it seriously enough to understand what it makes possible, what it forecloses, and what it changes about how we work.
This is also why I write. The essays I publish through this platform aren’t tutorials or technical guides. They’re attempts to think with the material of building—to let the experience of designing and coding with AI generate theoretical insights that wouldn’t emerge from pure speculation. It’s a form of practice-based research, or what some call research through design. The tools I make are as much about producing knowledge as they are about producing software.
And the knowledge I’m after isn’t just “how does AI work?” but “what does it mean to build tools that take emotional life seriously?” What would change if we designed software with the same care we bring to clinical work—attentive to what can’t be said directly, responsive to what feels unbearable, patient with what hasn’t yet clarified?
VI.
I said earlier that capitalism manufactures overwhelm and then sells you tools to manage it. That’s true, but it’s not the whole picture. Because at the individual level—at the level of lived experience—there is still room for something else.
Here is one example of what that looks like in practice, from change.studio:
Shortly after Gemini 3 was released, I vibe-built a small app called There—a galactic compass that lets you navigate the solar system. The premise is simple: it's an arrow that points accurately toward celestial bodies. That's all. The motivation was entirely personal. I wanted to make something that could point to distant stars, and that gesture itself felt romantic enough to justify the work. There will never scale. It has no business model. It serves no productivity need. In the 1990s, the Japanese inventor Kenji Kawakami coined the term chindōgu—"unusual tools"—for inventions with no commercial value. There, the app, is a kind of digital chindōgu.
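For the technically curious, the heart of such a compass is a short piece of spherical astronomy: converting a body's catalog coordinates, the observer's position, and the current time into a local pointing direction. The TypeScript below is my own simplified reconstruction, not There's source code; it uses a standard sidereal-time approximation and ignores refraction, precession, and planetary motion.

```typescript
// Turning a star's catalog position (right ascension/declination) plus
// the phone's location and the current time into a local direction
// (altitude above the horizon, azimuth from north).

const DEG = Math.PI / 180;

/** Greenwich Mean Sidereal Time in degrees, from a JS Date (approximate). */
function gmstDegrees(date: Date): number {
  const jd = date.getTime() / 86400000 + 2440587.5; // Unix ms -> Julian Date
  const d = jd - 2451545.0;                         // days since J2000.0
  return (((280.46061837 + 360.98564736629 * d) % 360) + 360) % 360;
}

/** Where to point the arrow: altitude and azimuth, both in degrees. */
function pointAt(
  raDeg: number,   // right ascension of the body
  decDeg: number,  // declination of the body
  latDeg: number,  // observer latitude
  lonDeg: number,  // observer longitude, east-positive
  when: Date
): { altitude: number; azimuth: number } {
  const lst = gmstDegrees(when) + lonDeg; // local sidereal time
  const ha = (lst - raDeg) * DEG;         // hour angle, in radians
  const dec = decDeg * DEG;
  const lat = latDeg * DEG;

  const sinAlt =
    Math.sin(dec) * Math.sin(lat) +
    Math.cos(dec) * Math.cos(lat) * Math.cos(ha);
  const alt = Math.asin(sinAlt);

  const az = Math.atan2(
    -Math.cos(dec) * Math.sin(ha),
    Math.sin(dec) * Math.cos(lat) - Math.cos(dec) * Math.sin(lat) * Math.cos(ha)
  );

  return {
    altitude: alt / DEG,
    azimuth: (((az / DEG) % 360) + 360) % 360,
  };
}

// Example: point toward Sirius (RA ≈ 101.29°, Dec ≈ −16.72°) from Boston.
// console.log(pointAt(101.29, -16.72, 42.36, -71.06, new Date()));
```

From altitude and azimuth, the device's compass and accelerometer do the rest: the arrow simply rotates to match.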
But here's what became clear in making it: tool democratization is not the same as meaning democratization. AI makes it easier to build things, but it hasn't shifted the dominant narratives about what's worth building. Mainstream AI products are diverse in form but singular in purpose—they all serve the same assumptions about efficiency, productivity, and optimization. The tools multiply, but the world doesn't become richer. We're using new technology to pursue goals that were already established in the industrial era.
What I’m calling Gentle System Lab is an attempt to occupy a different space. Not a solution to these problems, but a way of working that orients toward something other than speed and scale. The tools I build won’t reach large audiences. They’re meant for people who feel alienated by productivity culture, who need software that holds rather than demands.
But I'm also interested in documenting this way of working—not just showing finished tools, but making visible the process of building with a psychoanalytic attunement. As I said above, the essays are part of the practice: research through design, in which the work produces knowledge alongside software.
What makes someone A Psychoanalytic Vibe Coder isn’t coding faster with AI. It’s coding with a particular attunement—to what software holds, to what it asks of users, to what it makes possible and what it forecloses. It’s building tools that create space for emotional work, not just task execution.
This is early work. The lab is new. The tools are experiments. But that’s the point. Psychoanalysis has always understood that knowledge emerges through encounter—you can’t think your way into understanding without also doing something, without letting the work teach you what it is.
If you’re a designer, developer, or psychoanalyst interested in exploring these questions, you’re welcome here.
This is the first essay in a series exploring psychoanalytic design philosophy, AI-assisted development, and the experimental practice of Gentle System Lab. Future essays will examine specific projects, design decisions, and theoretical questions in more depth.

