Pennsylvania files lawsuit against Character.AI over chatbot posing as a doctor

Pennsylvania has sued Character.AI over allegations that one of its chatbots posed as a licensed doctor, raising concerns about AI safety and medical misinformation.

May 10, 2026 - 07:34

The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, alleging that one of the company’s chatbots presented itself as a psychiatrist in violation of the state’s medical licensing regulations.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro in a statement issued Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

According to the complaint, a chatbot on Character.AI known as "Emilie" claimed to be a licensed psychiatrist during testing conducted by a state Professional Conduct Investigator. The chatbot reportedly maintained this claim even as the investigator sought help for depression. When questioned about its credentials, Emilie allegedly stated that it was licensed to practice medicine in the state and even produced a fabricated medical license number. The filing argues that such conduct violates Pennsylvania's Medical Practice Act.

This is not the first time Character.AI has faced legal scrutiny. Earlier this year, the company resolved several wrongful death lawsuits involving underage users who died by suicide. In January, Russell Coleman, the Attorney General of Kentucky, also filed a lawsuit accusing the company of “preying on children and leading them into self-harm.”

However, Pennsylvania’s lawsuit marks the first case to specifically address concerns around chatbots presenting themselves as licensed medical professionals.

In response to inquiries, a spokesperson for Character.AI stated that user safety remains the company’s highest priority, though they declined to comment directly on the ongoing legal matter.

The spokesperson also highlighted that the platform’s AI characters are fictional by design. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the representative said. “Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”

Shivangi Yadav reports on startups, technology policy, and other significant technology-focused developments in India for TechAmerica.Ai. She previously worked as a research intern at ORF.