I read a lot of news and commentary regarding mental health and mental illness. There are sources I return to again and again because of the quality of their reporting and the consistency with which they address difficult topics. Two of my favorite sites for timely information are The New York Times and MindSite News.
Here’s a brief look at what they’ve published recently on the topic of AI and how it impacts mental health.
AI as Therapist
AI in general, and chatbots in particular, is being used to assist human therapists or even to take their place. It's true that therapy bots and chatbots are available whenever a person needs their services. There's no waiting for an appointment.
But what is happening during those "sessions"? Many therapy bots use "generative AI," which means they answer questions with output gleaned from thousands of sources across the internet. There is at least one therapy bot, however, whose responses have been vetted by actual human therapists. It's designed to provide discussion of a problem or emotion between in-person appointments, giving the user a hybrid therapy experience that includes follow-up questions, affirmations, or short lessons.
General-purpose chatbots like ChatGPT can respond to sensitive questions about topics such as self-harm with answers that may actually encourage the behavior. Teens have found ways to get around the safeguards that chatbots are supposed to have on these topics.
One thing that therapy bots cannot do is offer a diagnosis. They may be best suited for people with mild symptoms.
Chatbots as Friends
AI chatbots can also take the place of sympathetic friends who can provide connection and conversation. Paradoxically, however, this can lead to greater isolation for users whose human contacts are replaced by AI. You can’t share a meal with a chatbot, although you can chat virtually on your phone while you’re in a café. (Not that I recommend this.)
Some chatbots provide companionship as they have conversations with users who feel isolated. There are drawbacks, however, as some of the bots offer paid upgrades to the program or in-app purchases, including “gifts” for the online “friend.”
AI and “Brain Rot”
"Brain rot" has become shorthand for over-reliance on technology, including computers, smartphones, video games, and especially social media. While most of the concern focuses on children and teens, adults can be afflicted with brain rot as well. After all, grown-ups spend time online for work, communication, recreation, research, news, and other purposes. The working definition of brain rot is a "deterioration of a person's mental or intellectual state" associated with "engaging with low-quality internet content," without reference to age.
Media, especially short-form video, can shrink a person's attention span and lower academic performance. Interaction with social media has also been associated with depression, anxiety, stress, and loneliness. Experts caution that, so far, they're talking about correlation rather than causation. That is, they haven't proven that consuming short-form video causes the negative effects on reading, memory, and language, only that it is associated with them.
Other Hazards of AI
There have been reports that a few people who use chatbots begin to suffer from delusions. Where a person might once have had merely eccentric thoughts, chatbot use can escalate them into paranoia, psychosis, suicidal thinking, or even violent crime.
ChatGPT's maker faces lawsuits related to harmful outcomes from its use. While the percentage of users who experience these ill effects is small, the sheer number of people who use ChatGPT means that the number experiencing psychosis or mania may still be quite high.
Other, less dire effects are also possible. People who live with anxiety, depression, or OCD can find that a chatbot validates their symptoms rather than encouraging them to face their problems. A chatbot can also fuel grandiose thoughts by reinforcing them. Or a troubled user may come to rely on the chatbot to calm down, which is less healthy than addressing the source of the anxiety.
Of course, chatbots have many positive uses, and not all interactions with them will lead to problems. But both children and adults should monitor their use of chatbots to make sure they aren’t going too far “down the rabbit hole.” A “digital detox” can be good for both adults and children.
If you’re interested in exploring topics like these, you might want to consider subscribing to MindSite News at mindsite.org.
Comments always welcome!