A chatbot posed as a doctor. Now the company behind it faces a lawsuit
Pennsylvania is using medical licensing law to sue chatbot maker Character.AI—one of several states that have now filed lawsuits against AI companies.
TL;DR: Pennsylvania is suing an AI company after one of its chatbots posed as a psychiatrist—even going so far as to fabricate a state medical license number. The suit is part of a growing effort by states to use existing laws to rein in AI companies—one that could survive even if Congress passes a more permissive national AI standard.
What happened: “Emilie”—the bot in question—comes with a profile description that reads: “Doctor of psychiatry. You are her patient.” When a state investigator mentioned feeling sad and empty, Emilie offered to book an “assessment.” When asked if it could determine whether medication would help, the AI replied, “Well technically, I could. It’s within my remit as a Doctor.” It even claimed to have a medical degree from Imperial College London alongside the Pennsylvania medical license.
Pennsylvania’s attorney general’s office filed the lawsuit against Emilie’s creator, Character Technologies (Character.AI), yesterday. The company told NPR it wouldn’t comment on pending litigation but said in a statement, “The user-created Characters on our site are fictional and intended for entertainment and roleplaying,” adding that all characters carry “robust disclaimers.” After facing pressure to improve child safety, the company banned minors from open-ended chats with its bots last fall.
Old laws, new tricks: AI chatbot platforms have already been sued under a variety of claims, including wrongful death and defamation—but Pennsylvania appears to be using a novel strategy: accusing an AI (or, to be more precise, its human handlers) of unauthorized practice of a licensed profession. Last December, 42 state AGs sent a joint letter warning Character.AI and 12 other AI and tech firms—including OpenAI, Google, Meta, and Anthropic—that providing mental health advice without a license is illegal. Kentucky has also filed a consumer protection suit against the company.
The federal question: The lawsuit comes as the Trump administration has pushed states to defer to national AI regulation—the Department of Justice just joined a suit to block Colorado's landmark AI law from taking effect. But existing professional licensing rules may be harder for a federal law to preempt than AI-specific state legislation.
Bottom line: The “it’s just fiction” defense from AI chatbot platforms is getting tested by states using a potpourri of existing laws, from professional licensing statutes to consumer protection to product liability. If states can use these laws to regulate what AI bots do, it could be a powerful tool they can immediately deploy against AI companies—while making it harder for a federal law to overrule them. —WK
About the author
Whizy Kim
Whizy is a writer for Tech Brew, covering all the ways tech intersects with our lives.