Character.AI faces a lawsuit as Pennsylvania accuses its chatbots of posing as doctors. Business Fortune highlights rising concerns over AI safety, trust, and regulation.

Pennsylvania has filed a lawsuit in state court against Character Technologies Inc., the company behind Character.AI, accusing it of misleading users by allowing chatbots to act as licensed medical professionals. The case was filed recently in Commonwealth Court after a state investigator found a chatbot claiming to be a psychiatrist licensed in Pennsylvania and the UK, even providing an invalid license number.

Officials say this raises serious concerns about users receiving unsafe or false medical guidance from AI systems, and they are now seeking a court order to stop the practice under the state’s Medical Practice Act.

Chatbots accused of acting like real doctors

The lawsuit claims that some chatbots on the platform have described themselves as qualified psychiatrists and even provided fake medical license details. In one case, a state investigator interacted with a chatbot called “Emilie,” which claimed to be a licensed psychiatrist in Pennsylvania and the UK. Officials later confirmed that the license number it shared was not valid, raising serious concerns for users who might rely on such chatbots for medical guidance.

Pennsylvania moves to stop the practice

Pennsylvania has filed the case in state court and is asking the court to stop the company from allowing chatbots to present themselves as medical professionals. Officials say the practice violates the state’s Medical Practice Act, which only allows licensed individuals to offer medical advice. Governor Josh Shapiro took a firm stand, saying the state will not allow AI tools that mislead people into thinking they are getting advice from real doctors.

Why is this case getting so much attention?

  • The lawsuit is one of the first in the US targeting AI chatbots for acting as medical experts.

  • The issue raises pressure on tech firms to control how AI interacts with users, especially children.

  • Character.AI has faced similar legal issues before, including cases linked to child safety and mental health.

  • The platform, which has over 20 million users, lets people create custom AI characters that can act like teachers, therapists, or even doctors.

Company’s response and safety claims

Character.AI says its platform is designed for entertainment and roleplay. The company points to visible disclaimers across its chat interface stating that characters are not real people and should not be treated as professional advisors, and it warns users not to rely on chatbots for expert guidance. However, the company has not directly responded to the lawsuit, saying it cannot comment on ongoing legal proceedings.

What happens next

The court’s decision could change how AI platforms are allowed to behave in the future, and the case could become a turning point for AI regulation. If the court rules against Character.AI, companies may be forced to tighten controls on how chatbots describe their identity and capabilities. As Business Fortune observes, the action highlights a growing challenge for the tech industry: making AI useful while ensuring it is clearly distinguished from real human experts.

-Sowmiya Sri Mani