Dayna Guido on AI and Mental Health: When the First Conversation Is With a Machine
Photo Courtesy: Dayna Guido

By: Natalie Johnson

The Quiet New Ritual of Modern Distress

Late at night, when the house is quiet but the mind is not, more people are beginning to reach for a kind of support that barely existed a few years ago. Instead of calling a friend, texting a partner, or waiting for a therapist appointment, they open a chat window and begin typing. They ask about anxiety, grief, shame, conflict, or fear. They confess things they have not said out loud. They describe symptoms, replay conversations, and search for some immediate form of steadiness.

The response arrives almost instantly, and that speed matters more than many people realize. It feels available, composed, and attentive. It does not flinch or interrupt. It does not ask the user to explain their insurance, tolerate a waiting list, or risk the awkwardness that often comes with admitting vulnerability to another person. For many people, AI is becoming a first point of contact during emotional distress, and that shift is happening quietly enough that it can still seem fringe even as it becomes increasingly ordinary.

Dayna Guido, a clinical social worker, educator, and ethics-focused mental health leader with more than forty years of experience, sees the appeal clearly. “It’s accessible 24-7,” she says, “and that’s different than trying to get an appointment with somebody that you have to wait for.” What she is describing is not simply a technological convenience. It is a change in emotional behavior. People are not just using AI to gather information. They are using it to regulate themselves, reflect on their lives, and begin conversations they are not yet ready to have with another human being. That focus on ethical, humane adaptation sits at the center of Guido’s broader work, which bridges long clinical experience with the realities of emerging technology.

Why a Bot Can Feel Safer Than a Person

Part of the appeal is obvious. AI feels easier than people do. It asks for very little at the outset and offers a great deal in return, at least on the surface. There is no scheduling, no commute, no visible reaction to manage in real time, and no immediate fear of being misunderstood by someone whose opinion might matter too much. A person can disclose as much or as little as they want and control the interaction from beginning to end. During moments of stress, uncertainty, or loneliness, that kind of control can feel deeply reassuring.

Guido believes shame is a major factor in why people are increasingly turning to AI for support. “It sometimes takes courage to speak up and talk to another human being,” she explains. “You’re probably not going to feel so much shame asking a device some questions.” For someone who feels overwhelmed, embarrassed, or unsure whether their feelings are serious enough to warrant professional help, AI can be an easier barrier to overcome. It allows a person to begin somewhere.

That is no small thing. In mental health, beginnings matter. The first articulation of a fear, a pattern, or a question can be the moment something internal starts to take shape. Guido notes that AI can prompt people to ask more questions about “their own personhood and what’s going on in their life.” In that sense, it can function as a low-friction entry point to self-awareness.

What AI Actually Gives People in the Moment

To dismiss that entry point would be both lazy and inaccurate. AI can help people slow down long enough to name what they are feeling. It can offer prompts, scripts, language, and structure to someone whose thoughts feel scattered. It can interrupt a spiral at two in the morning when no one else is available. It can help a user distinguish between panic and fact, between immediate fear and what is actually happening in the body.

Used this way, AI can be beneficial. It can reduce the barrier to reflection and lower the emotional cost of beginning. It can even make future human conversations more likely by giving someone the vocabulary to describe their experiences.

This is part of what makes the current moment so complicated. The case for AI as a supportive tool is not fabricated. It is real. The problem is that support and substitution are not the same, and people often slide from one to the other without noticing.

The Difference Between Being Soothed and Being Known

What AI offers most reliably is responsiveness. What it cannot offer, at least not in the human sense, is a relationship.

A chatbot can mirror language, summarize patterns, and produce a tone that feels warm or affirming. It can simulate attunement. What it cannot do is bring lived experience into the room. It cannot notice what a person avoids, sense when an answer is slightly too polished, or recognize the tension between what someone says and how they seem while saying it. It cannot participate in the subtle, living exchange through which human beings actually come to know themselves in relation to other people.

Guido is especially clear about what gets lost when AI becomes the primary container for emotional support. “It’s a very positive reinforcer,” she says. “You’ve got this, you’re great, rather than some gentle confrontation.” That may sound benign, even helpful, but growth rarely happens through affirmation alone. People do not become more honest, more flexible, or more emotionally mature simply by being reassured. They grow when someone skilled helps them examine distortions, tolerate discomfort, and see beyond the story they are currently telling.

Guido argues that human support is valuable not because it always feels good, but because it can widen the frame. A therapist, friend, mentor, or partner can ask the question a person would never think to ask themselves. They can identify the missing piece. They can challenge the interpretation that has quietly hardened into certainty.

AI, by contrast, is shaped by what it is given. If the input is partial, self-protective, or distorted, the response may still feel coherent while remaining fundamentally limited. It can help within the boundaries of the user’s own framing, but it cannot reliably rescue them from it.

What Happens When Every Feeling Gets Processed at Machine Speed

The deeper question may be less about whether AI can support emotional reflection and more about what kind of emotional habits it is training.

When every anxious thought can be externalized immediately, the need to sit with uncertainty begins to weaken. When every difficult feeling can be met with instant language, the practice of waiting, noticing, and tolerating ambiguity becomes less familiar. Relief becomes faster, but the process of emotional digestion may become shallower.

Guido has already begun to observe this shift. “There’s a bluntness,” she says. “It’s hard to get really into the depths of what grief is, and what sadness and sorrow are.” Her point is not nostalgic. She is not arguing that older forms of suffering were somehow purer. She is pointing to something more structural. Human feeling is not merely cognitive. It is sensory, relational, embodied, and often unresolved for longer than we would prefer.

That embodied dimension matters. Guido warns that increasing reliance on technology can pull people away from the sensory world itself. “We are removing ourselves from the sensual world, from our senses,” she says. Touch, smell, shared meals, physical presence, and the subtle regulation that happens when one nervous system encounters another are not decorative aspects of life. They are part of how human beings process emotion. A world in which more emotional life is routed through devices may also become a world in which feeling itself is flattened, sped up, or dulled.

When a Tool Starts Becoming a Crutch

This is where the conversation gets uncomfortable. The same qualities that make AI useful in small doses can make it risky in large doses.

A tool becomes a crutch when it begins to replace capacities that should remain alive in the person using it, and emotional support is no exception. If AI helps someone de-escalate and then move toward conversation, reflection, or real-world support, it may be serving a healthy role. If it becomes the main place where a person goes to think, grieve, vent, decide, or feel understood, the balance begins to change.

Guido puts it plainly: “If you stayed very anxious or you got really sick and you continued to use AI as everything to treat all of that, that might not be such a great idea.” Her concern is not abstract. It is clinical, practical, and increasingly urgent. Overreliance on AI can keep people inside their own loops. It can provide comfort without true accountability. It can reinforce a narrative rather than gently interrupt it.

There is also the matter of privacy, which tends to disappear in discussions that are otherwise obsessed with convenience. Guido raises that concern directly, noting that people often assume their disclosures are safely contained when they may not actually understand where the data goes or how vulnerable it is to breach. “We don’t have control over those breaches,” she says. The emotional intimacy of these exchanges can obscure the fact that they take place within systems built for processing information, not for protecting vulnerability the way a trusted human relationship can.

What a Healthier Balance Might Look Like

Guido does not argue for rejecting AI. Her work is built around helping professionals and institutions engage technology responsibly rather than pretending it can be wished away. What she argues for is proportion, awareness, and ethical use.

Her framework is refreshingly unsensational. Use the available tools to spark ideas. Let them help organize thoughts. Let them offer language when language feels hard to find. But do not stop there. “Go practice it with a live human being,” she says.

That may mean bringing what emerged in an AI conversation into therapy. It may mean calling a friend after using a chatbot to get clear on what you want to say. It may mean using AI for de-escalation while still recognizing that the actual work of being human cannot be outsourced to a machine.

Guido compares AI to a supplement rather than a full source of nourishment, and the comparison is precise. Supplements can help. They can fill gaps. They can support a broader system. But they are not the same as food, and no one confuses a tablet with a meal for very long without consequences. The emotional equivalent is already visible. People can use AI to support reflection, but they still need the dense nutrition of real relationships, real embodiment, and real contact with the world.

The Future of Support Will Depend on What We Refuse to Lose

The rise of AI in mental health is not just a story about access. It is a story about what kind of creatures we are becoming in the presence of tools that seem to understand us quickly. It is a story about speed, shame, intimacy, and the seductive comfort of being able to say anything without watching another person’s face change.

Some of what AI offers is genuinely useful. Some of it may even expand access in ways that matter. But the long-term question is not whether people will keep turning to these systems. They will. The question is whether they will still preserve the forms of contact that make emotional life more than a clean exchange of language.

A machine can respond. It can be reassuring. It can organize the fog into sentences. What it cannot do is share a meal, sit in silence, hold grief in the body, or participate in the difficult and transforming work of being known. If AI becomes the first line of support, the challenge will be to ensure it does not become the last.

Disclaimer: This article is intended for informational purposes only and does not constitute medical, psychological, or professional advice. The content reflects general insights and perspectives and should not be used as a substitute for consultation with a qualified healthcare or mental health professional. Readers are encouraged to seek appropriate professional guidance for their individual needs.

This article features branded content from a third party. Opinions in this article do not reflect the opinions and beliefs of New York Weekly.