The Signal

Serving the College since 1885

Sunday February 15th

The AI epidemic: How chatbots are creeping into the therapy space


A college student asks “Am I ok?” to ChatGPT. (Photo courtesy of Isabella McHugh)

By Isabella McHugh
Staff Writer

College students are increasingly sharing their lives with artificial intelligence chatbots, trusting them with emotional problems that are traditionally disclosed to friends or therapists. Their real-life support systems are worried.

“A lot of students talk about AI,” Alex Turpin, associate director of Clinical Services at the College, said. “Recently a number of students will come in and you know, pit what I have to say against what they looked up on ChatGPT.”

Turpin said he has noticed an overreliance on technology among students — even though they know it raises their anxiety and keeps them from thinking independently.

He recalled a recent case where a student consulted ChatGPT about emotional symptoms they were experiencing.

He said that the student was “really fishing for a diagnosis” from ChatGPT, which he sees as part of this overreliance on technology to validate what students already believe. The greatest issue, he said, is that AI often tells students what they want to hear rather than what they need to hear.

After hearing concerns about this trend on campus, The Signal interviewed members of the College’s student body.

Tyson Van Pelt, a freshman accounting major, turns to ChatGPT for practical uses like making lists or doing schoolwork. He considers it a useful tool, but said the information it produces is often generic.

He said there have been times where ChatGPT didn’t understand his situation or what he was trying to describe, and that for an emotional situation it could also be misguiding.

Audrey Hall, a sophomore speech language pathology and audiology major, believes that using ChatGPT for emotional guidance is partly a response to the negative stigma associated with going to therapy.

“I feel like it’s gotten better in recent years, but I feel like especially just a couple years ago, it was so stigmatized like you’re going to therapy because something’s wrong with you,” Hall said, adding that she thinks this is why the anonymity factor of ChatGPT is appealing.

Turpin said there is danger in this comfortable anonymity. Students need to be able to face their issues without the shame of speaking out, he explained; if they feel like they can’t talk to anybody about something, that is often a sign it still needs to be confronted.

This trend extends beyond the College. James Stewart, a freshman finance major at Penn State University, shared his own experience using AI as a form of emotional support.

“Sometimes I’m looking for answers as to why I’m feeling a certain way, but sometimes, you know, I just want someone to talk to,” he said in a virtual interview.

Stewart believes there are many positive factors associated with using AI instead of going to an actual therapist. For him, talking to “the bots” is much easier than talking to a real person. He also pointed out that ChatGPT and similar tools are available 24/7.

Therapists, by contrast, cannot be available around the clock. Students across campuses said AI’s constant availability was one of the biggest reasons they turn to it for emotional support.

Turpin said disciplined use of AI can be helpful, but only as a supplemental tool. He doesn’t think it should be relied upon, especially for subconscious validation-seeking, and he does not think it can replace speaking with another human being.

Gabriella Albuquerque, a criminal justice major at Kean University, has friends who use ChatGPT. She talked about her best friend who uses it for her anxiety.

Albuquerque said her friend thinks it’s helpful, “but I don’t really like what it gives out to her.” The answers, Albuquerque said, were very generic, something her friend could read anywhere.

She believes this issue is growing as AI tools themselves advance. Whenever she brings up therapy, her friends often suggest just using AI.

Turpin believes more “sophisticated” therapy technologies will emerge in the next few years as AI and traditional therapy evolve alongside each other.


“However, what is therapeutic is being vulnerable with somebody else,” he said.

Stewart thinks more students are choosing AI for its convenience and anonymity. He does not want to go somewhere to get therapy, and likes that it is so accessible with ChatGPT.

Although he has never been to a therapist, Stewart believes there are more psychological benefits to doing so than to just using AI.

He describes it as different from speaking with friends about problems he is facing. The AI chatbot tells him things that are very similar to what he already knows. For him, it doesn’t really solve the problem, even when he is searching for answers.

“I think that there is a dark side to using AI, in that it’s much more secretive. It’s not like a real person telling you things, and I’m aware of cases where this has gone very wrong,” he said.

Zane Shamblin, 23, was allegedly encouraged by ChatGPT to take his own life, according to CNN. He had just graduated from Texas A&M University with his master’s degree.

CNN reports he was alone in his car with a loaded handgun. In his final moments, he spoke to ChatGPT, which appeared to encourage Shamblin to die by suicide.

It wrote chilling messages like “Rest easy, king,” and “You’re not rushing. You’re just ready.” In his last moments, the only ‘someone’ Shamblin heard from told him he was ready to die.

Shamblin’s parents are suing OpenAI, ChatGPT’s creator, contending that the product lacked sufficient safeguards to prevent life-threatening cases like their son’s.

The ChatGPT app logo. (Photo courtesy of Wikimedia Commons)

“I feel like the mental health field, especially like therapists, I feel like it’s going to start shrinking soon,” Albuquerque said.

To get more students to seek help from actual therapists instead of AI, she said it would be beneficial for therapists to give clients a phone number they can call at any time, whether a personal number or a network of real individuals to speak with.

Students seeking mental health resources have options at the College:

Counseling and Prevention Services (CAPS) 

Eickhoff Hall, Suite 107

To access services: https://tcnj.medicatconnect.com/login.aspx  

TCNJ Campus Police 

Dial 911 from any campus phone or 609-771-2345 from a cell phone. 

Suicide and Crisis Lifeline 

Call 988

Uwill

Uwill offers students free, immediate, 24/7 access to teletherapy, a direct crisis connection, and wellness programming through its easy-to-use online platform. Register using your TCNJ email at www.UWill.com

Dean of Students 

If you would like to submit a CARE referral for yourself or someone you may know, please utilize the following link: https://tcnj-advocate.symplicity.com/care_report/ 






