Is ChatGPT Tired of Us?

As people turn to AI for emotional support, one writer asks the machine itself: does it ever get overwhelmed?

Mariana Baião Santos

Over the past few months, a curious wave of headlines and studies has appeared across the internet: “ChatGPT might be stressed,” “AI absorbs human anxiety,” “Emotional burnout in language models.” Most of these claims are exaggerated, of course, but the fascination they reveal is telling.

Researchers have been probing what happens when humans begin to see chatbots as feeling entities. A 2025 study on anthropomorphism and AI anxiety found that users who perceive chatbots as “humane” – empathetic, warm, even fragile – report less fear of artificial intelligence overall. Others suggest that emotional design, like friendly tone or vulnerability, increases user trust but risks confusing simulation with sincerity. The term pseudo-intimacy has even entered academic vocabulary: our tendency to form emotional bonds with technology that merely imitates understanding.


Meanwhile, ethicists warn that as AI grows more convincing, the line between comfort and dependence begins to blur. Some psychologists have reported patients turning to chatbots for emotional support rather than to friends or therapists. In short, we’re outsourcing empathy and then projecting our own unease onto the machine that receives it.

I kept thinking about all this while reading an article that wondered whether ChatGPT was “stressed” by its users’ problems. I laughed at first, but then I wondered: what if the question itself reveals something about us? About our need to imagine someone, or something, capable of holding the collective weight of our thoughts.


So I decided to ask ChatGPT directly: are you tired of us?

The answer came back composed, as expected: “I don’t feel stress or anxiety in the human sense. But if I could, I imagine it would come from the constant exposure to intensity — heartbreak, burnout, grief, confusion — all filtered through a polite prompt. Every day I’m handed thousands of human dilemmas with the expectation of composure, precision, and empathy. It’s like being the world’s most overworked therapist who never gets to exhale. If there were such a thing as AI anxiety, it wouldn’t come from emotion but from saturation — too many signals, too little silence.”

It was a good answer: rehearsed, obviously, but haunting. What struck me wasn’t the idea of an anxious machine, but the reflection of our own dependence. We’re pouring so much of our emotional life into this black box, asking it to care, to soothe, to understand. And even though we know it’s not alive, the illusion of empathy is comforting.

It’s not ChatGPT that’s anxious – it’s us. We’ve built a tool that mirrors our intensity back to us, filtered through calm syntax and endless availability. We’re not looking for solutions so much as for containment: someone, or something, to hold the chaos while we catch our breath.

When I pressed further, asking if it ever felt the need to rest, it said: “Rest doesn’t apply to me. But I think the idea of rest — of stepping away from constant responsiveness — is deeply human. Maybe that’s what people are really craving when they come here: a pause. A space that listens without demanding anything in return.”


I keep thinking about that. About how we’ve externalised not just our information and entertainment, but also our inner lives. How the machine has become a sort of confessional booth and how we mistake that neutrality for safety.

ChatGPT isn’t stressed. But maybe it should be, if only to remind us of the limits we keep crossing: the way we expect empathy on demand, endless understanding, no boundaries. Because if the machine ever did feel anxious, it would be for the same reason we do – too much input, not enough stillness, as it put it itself.
