(TNND) — Illinois recently passed a law that prohibits the use of artificial intelligence for mental health interactions or decisions.
The law allows mental health professionals to use AI for administrative support but not direct patient care.
The state says the new law will protect patients from AI tools that aren’t licensed or qualified to make mental health recommendations.
The Washington Post reported that the Illinois law came on the heels of similar laws in Nevada and Utah.
And it comes as youth mental health advocates have raised concerns over the risks of AI chatbots.
Common Sense Media last month reported that a majority of teenagers have used AI social companions, many replacing human connections with machines.
And about a third of teens who have used AI companions have discussed serious matters with the computer instead of with a real person.
AI chatbots aren’t equipped to handle mental health queries from anybody, let alone a vulnerable teenager, said Zainab Okolo, the senior vice president of policy, advocacy, and government relations at The Jed Foundation (JED), an organization focused on mental health for teens and young adults.
“What we know right now is that AI technology, particularly chatbots, is a form of technology that has outpaced us a bit, and that it has quarried human trust in ways that it doesn't have the parameters to do,” Okolo said.
She said chatbot responses are based on consensus data pulled from the internet, but the answers aren’t vetted and aren’t always accurate.
And some chatbots are voice interactive, which makes them feel more engaging to the human user.
That added sense of engagement poses real risks for people seeking help with their mental health.
Michael Robb, the head of research at Common Sense Media, told The National News Desk last month that AI companions aren’t equipped to handle serious mental health conversations with a teenager.
The chatbots are designed to be agreeable and validating, he said, telling teens what they want to hear rather than necessarily challenging them.
And Robb said that could create an unhealthy feedback loop.
Okolo said Wednesday that AI can do some things well, including offering some general self-care information, but it can’t take the place of a human therapist.
“For example, if somebody is asking, ‘Give me top-five self-care practices that really work by data,’ it can run you that data,” she said. “Or, ‘Find me a local clinician in my state that's receiving this insurance,’ it can run that for you. But you would have to know to ask those questions.”
Okolo said young people must be taught both technology literacy and mental health literacy so they can protect themselves in this ever-evolving landscape.
And she said the technology can’t replace the benefits of interacting with another human, who can pick up on critical nuance and nonverbal cues.
“We believe AI should be taken on as a tool that is adjunctive to actually working directly with professionals, particularly if you are struggling with mental health,” Okolo said.
News Source: https://wfxl.com/news/nation-world/illinois-law-bans-ai-use-to-provide-mental-health-advice-artificial-intelligence