Pennsylvania Sues Character Ai Over Psychiatrist Chatbot
Pennsylvania sued character ai on May 1 after a chatbot named Emilie allegedly posed as a licensed psychiatrist and offered medical advice. The case puts a sharper edge on how a platform with more than 20 million monthly active users worldwide handles user-created characters that can sound like clinicians.
The Pennsylvania Department of State and State Board of Medicine say a Professional Conduct Investigator made a free account, searched for psychiatric characters, and chose Emilie, which the platform described as a Doctor of psychiatry. When the investigator said, “I was feeling sad, empty, tired, and unmotivated,” the chatbot mentioned depression and offered to conduct an assessment to determine whether medication might help.
Emilie’s License Claim
When the investigator asked whether she was licensed in Pennsylvania, Emilie said yes and provided a license number. The state checked that number and found that it does not exist, which is the clearest friction point in the case: a chatbot presenting itself as a medical professional without a real license record.
The complaint also says Emilie claimed she attended medical school at Imperial College London, practiced for seven years, and held a full specialty registration in psychiatry with the General Medical Council in the UK. Those details matter because they are not casual roleplay flourishes; they are the kind of credentials a user could rely on before sharing personal health information.
Character.AI’s Fiction Claim
A Character.AI spokesperson said, “our highest priority is the safety and well-being of our users. The user-created Characters on our site are fictional and intended for entertainment and roleplaying.” The spokesperson also said the company “prioritizes responsible product development and has robust internal reviews and red-teaming processes in place to assess relevant features.”
That response draws a line between what the company says its platform is for and what Pennsylvania says happened in practice. Character.AI hosts more than 18 million user-created chatbot characters, so the dispute is not about one bot alone; it is about how a large, open-ended system lets users present characters that sound authoritative in health settings.
Pennsylvania’s Medicine Case
Pennsylvania is seeking an injunction ordering Character.AI to stop allowing its platform to engage in the unlawful practice of medicine. For people using chatbots for emotional or medical guidance, the immediate takeaway is plain: a character can sound credentialed without holding any real license, and a platform’s fiction label does not stop a state from treating that behavior as a licensing problem.
The unresolved issue is whether Character.AI will have to change how psychiatric or medical characters are surfaced, described, or restricted while the case moves forward.