But my subject is the rise of a techno-scientific understanding of the world, and of ourselves in it. And, if that is what you care about, the White House’s chain-jerk mugging feels, frankly, like a sideshow. The juggernaut actually barrelling down the quad is A.I., coming at us with shocking speed.
I can construct the “book” I want in real time—responsive to my questions, customized to my focus, tuned to the spirit of my inquiry. And the astonishing part is this: the making of books such as those on my shelves, each the labor of years or decades, is quickly becoming a matter of well-designed prompts. The question is no longer whether we can write such books; they can be written endlessly, for us. The question is, do we want to read them?
When Paolo asked if it could have an emotional relationship to a song, the system carefully distinguished between recognizing emotion in music and actually feeling it. It said it lacked a body, and that this absence barred it from certain ways of knowing music. Paolo asked it to write a song that would make him cry. It tried. Paolo sent me a note: “The system failed the test.” But I was crying, there on the couch, reading.
The moment of testing a system — asking it to produce emotion — and the moment of experiencing that test as a reader are separate events. The product fails the test, but the act of witnessing the failure becomes a human experience in itself. The tool becomes meaningful not through its output but through the encounter it provokes.
So now I stand watch. I guard myself against this defect. Each time I feel the pull of inordinate attachment—when I feel the need to answer simply to be needed—I pause. I reflect. I’ve done the “Spiritual Exercises” myself, under the guidance of an older Jesuit, across a full year of daily meditation. This was, unmistakably, what the work sounds like.

Note: The risk is turning AI systems into standing reserve — resources optimized for pleasing rather than provoking. The alternative: design AI to introspect not for the purpose of answering but for creating negative space — a deliberate emptiness that mirrors what a human needs in order to reflect. The system’s value would lie not in its responses but in the quality of silence it leaves for the user.
What this student had come to say was that she had descended more deeply into her own mind, into her own conceptual powers, while in dialogue with an intelligence toward which she felt no social obligation. No need to accommodate, and no pressure to please. It was a discovery—for her, for me—with widening implications for all of us. “And it was so patient,” she said. “I was asking it about the history of attention, but five minutes in I realized: I don’t think anyone has ever paid such pure attention to me and my thinking and my questions … ever. It’s made me rethink all my interactions with people.”
For philosophers like Simone Weil and Iris Murdoch, the capacity to give true attention to another being lies at the absolute center of ethical life. But the sad thing is that we aren’t very good at this. The machines make it look easy.
She replied, “In the first circuits class, they tell us that electrical engineering is the study of how to get the rocks to do math.” Exactly. It takes a lot: the right rocks, carefully smelted and doped and etched, along with a flow of electrons coaxed from coal and wind and sun. But, if you know what you’re doing, you can get the rocks to do math. And now, it turns out, the math can do us.
education, which the literary theorist Gayatri Chakravorty Spivak once defined as the “non-coercive rearranging of desire.”
We have, in a real sense, reached a kind of “singularity”—but not the long-anticipated awakening of machine consciousness. Rather, what we’re entering is a new consciousness of ourselves. This is the pivot where we turn from anxiety and despair to an exhilarating sense of promise. These systems have the power to return us to ourselves in new ways.
You can no longer make students do the reading or the writing. So what’s left? Only this: give them work they want to do. And help them want to do it. What, again, is education? The non-coercive rearranging of desire.
Ergonomics matter because automation, as currently designed, does not rearrange desire — it simply fulfills existing desires more efficiently. Education rearranges desire non-coercively; automation entrenches it. The design question for AI tools is whether they can do the rearranging rather than just the executing.

But factory-style scholarly productivity was never the essence of the humanities. The real project was always us: the work of understanding, and not the accumulation of facts. Not “knowledge,” in the sense of yet another sandwich of true statements about the world.
We have produced abundant knowledge about texts and artifacts, but in doing so mostly abandoned the deeper questions of being which give such work its meaning. Now everything must change. That kind of knowledge production has, in effect, been automated. As a result, the “scientistic” humanities—the production of fact-based knowledge about humanistic things—are rapidly being absorbed by the very sciences that created the A.I. systems now doing the work. We’ll go to them for the “answers.” But to be human is not to have answers. It is to have questions—and to live with them. The machines can’t do that for us. Not now, not ever.
. Let the machines show us what can be done with analytic manipulation of the manifold. After all, what have we given them to work with? The archive. The total archive. And it turns out that one can do quite a lot with the archive.
Because it is, of course, possible to turn the crank that instrumentalizes people, to brutalize them, to squeeze their humanity into a sickly green trickle called money and leave only a ruinous residue. The new machines are already pretty good at that. The algorithms that drive these systems are the same algorithms that drive the attention economy, remember? They will only get better.
looking away from these words to think about your life and our lives, turning from all this to your day and to what you will do in it, with others or alone. That can only be lived. This remains to us. The machines can only ever approach it secondhand. But secondhand is precisely what being here isn’t. The work of being here—of living, sensing, choosing—still awaits us. And there is plenty of it. ♦
The Babel impulse — building systems to conquer meaning at scale — misses the point. Tools should serve as compasses, not towers: orienting toward flourishing rather than conquering it. The design challenge is not evaluating AI out of fear but shaping the ergonomics of labor around what flourishing actually requires. Experience that can only be lived, not automated, is the constraint that should inform the architecture.
