No More Legal or Medical Advice for ChatGPT users

OpenAI has updated its usage policy for ChatGPT, explicitly excluding legal and medical advice. The revised version, dated October 29, 2025, now states a clearer rule against giving

"tailored advice that requires a license… such as legal or medical advice, without appropriate involvement by a licensed professional."
 
The changes apply not only to legal advice but also to medical questions. If ChatGPT is asked for a legal assessment, the system now begins its response with a disclaimer: "I'm not a lawyer, so I cannot provide legal advice."
 
A key risk remains the "hallucination" problem of large language models: generating information that sounds plausible but is factually incorrect. This risk affects not only ChatGPT but also other AI systems, such as Gemini or Copilot.
 
For companies, it is therefore essential to ensure human oversight and to implement a binding AI guideline. Such a guideline should clearly define which AI systems may be used and how, particularly with regard to the requirements of the EU AI Act.