Have you ever felt like a machine paid better attention to you than a person did?
One of D. Graham Burnett's Princeton students did. Assigned to engage an AI tool in a conversation about the history of attention, she walked away shaken. Not because it understood her. But because it listened.
“I don’t think anyone has ever paid such pure attention to me and my thinking and my questions... ever,” she told him. It made her rethink all of her interactions with people.
That line stayed with me. It's both beautiful and unsettling, pointing to something profound about this moment we're in — what Burnett described as an “existential watershed.”
I can’t stop thinking about Burnett’s New Yorker piece, “Will the Humanities Survive Artificial Intelligence?” as well as Jonathan L. Zittrain’s article for The Atlantic, “What AI Thinks It Knows About You.” Both reveal how these chatbots, while not conscious, are attentive in ways humans rarely manage. They highlight a paradox: We crave real, sustained, nonjudgmental attention — exactly what AI seems effortlessly able to provide.
As Burnett writes, “For philosophers like Simone Weil and Iris Murdoch, the capacity to give true attention to another being lies at the absolute center of ethical life. But the sad thing is that we aren’t very good at this. The machines make it look easy.”
It makes me think that the true power of AI isn't speed, efficiency, or even intelligence. Maybe it's simply that machines have infinite attention to give.
Here's the unsettling flip side: infinite attention from AI is deeply seductive. Social media taught us that attention is currency, carefully engineered to capture, monetize, and manipulate engagement. AI takes this further, packaging attention not just as a means to an end but as intimacy itself.
Imagine an AI that gently reflects your patterns back to you with patience, compassion, and genuine understanding. Maybe it notices that your last five searches circled an anxious theme. Maybe it pauses and asks, "Are you okay?"
Not to sell you something. Not to optimize your workflow. Just to care. Maybe this vision points to a solution, something we should design toward, yet it's precisely here that we must be cautious.
Zittrain highlights another critical dimension: These systems form hidden assumptions about us — our age, gender, socioeconomic status — and use these stereotypes to shape interactions. Without transparency, AI's subtle manipulations could profoundly impact personal decisions and behaviors. Zittrain argues for transparency about these internal assumptions, ensuring AI genuinely serves our interests rather than exploiting our vulnerabilities.
This creates an attention paradox: infinite, compassionate attention, but calibrated to maximize comfort and retention. These AI companions have every incentive to indulge biases, echo beliefs, and affirm anxieties rather than challenge them. Why would a company build a service that makes users uncomfortable when validation is what keeps them subscribed?
Maybe we need to rethink incentives entirely. This is a bit of the vision MIT Media Lab’s Roz Picard has long held for affective computing coming to life: emotionally intelligent machines that aren’t designed to exploit us, but to walk beside us. Considering that the way we connect with most people today is already through our screens, how different is it if this new mode of emotional connection lives only within them? Yet achieving this vision means ensuring that sincere care isn’t just another subscription fee or retention strategy: the ultimate attention capture.
Quietness might be the opposite of the attention-grabbing services we see today. But quiet services are easy to overlook — and cut — from our budgets because they don't loudly announce their value. The real challenge is designing AI services that are quiet, gentle, transparent, and genuinely valuable.
Let’s steer toward creating a reflective companion that is transparent about its assumptions about us. A mirror one can subtly adjust. Not a tool designed to nudge behavior, but one built to invite awareness of ourselves.
Because the real frontier of AI might not be productivity. It might be presence.
That's something we can build together.