OpenAI Made a Special Version of ChatGPT for Your Doctor

AI tools like chatbots are gaining popularity in healthcare, but tech companies and healthcare professionals have been clear: AI should not be your doctor. That's why ChatGPT will usually tell you to check any health and wellness information it gives you with a human doctor — and that's good advice.
But OpenAI's newest project is aimed at healthcare workers, and it's designed to help those human experts keep up with the latest in real medical science.
The tool, launched on April 22, is called ChatGPT for Clinicians: a distinct experience designed to help healthcare providers with the major tasks they already use ChatGPT for. It is separate from consumer-facing AI tools such as Microsoft's Copilot Health and OpenAI's ChatGPT Health.
Millions of doctors were already visiting ChatGPT every week, Karan Singhal, OpenAI's head of healthcare, told me. So the company decided to build a better version of its popular chatbot to help providers with the tasks they were already using AI for: consultation and care, documentation, and medical research.
“I think the reality for patients and providers, but especially providers, is that most of them are struggling,” Singhal said. “Anything you can do to make their jobs a little easier for them to focus on patient care goes a long way for both the provider and the patient.”
This is not a new model. It runs on OpenAI's GPT-5.4, but ported to a different harness, Singhal said: a set of custom-built tools around the model to better suit healthcare-related tasks, similar to how Codex serves as a harness through which developers access AI for vibe coding.
Doctors, nurses, physician assistants and pharmacists can access ChatGPT for Clinicians for free once their credentials are verified with OpenAI.
One of the biggest challenges of building an AI tool is making sure it gives accurate answers — especially important for medical questions. OpenAI said it worked with thousands of doctors while developing the new tool, including experts at hospitals like Sloan Kettering. The model draws its answers from "peer-reviewed studies, authoritative public health guidance, and clinical guidelines," the website says; this foundation forms the "ground truth" the model works from.
When ChatGPT for Clinicians was tested on OpenAI's HealthBench Professional benchmark, it scored 99.6% for accuracy and safety. In other words, in nearly all of the benchmark's tests, the AI provided answers a doctor would approve of.
On the privacy side, ChatGPT for Clinicians is HIPAA compliant — meaning providers can enter into a business associate agreement with OpenAI to protect patients' personal health information. It also includes OpenAI's enterprise-grade security tools, and information shared with it is not used for model training.
AI has a growing role in healthcare. AI-powered transcription services, like the kind Dr. Al-Hashimi uses on the TV show The Pitt, are among the most common examples. Insurance paperwork, including claims denials, is also increasingly powered by AI. The technology is showing promise in clinical settings, too: a recent study found that a medical AI model matched, and sometimes outperformed, emergency department staff at emergency screening and diagnostic selection.
But AI inherently lacks human judgment, sensitivity and experience, all of which are essential to patient care. There is also the risk that AI models trained on medical literature could replicate the biases that have led to decades of poor care for historically disadvantaged groups.
When it comes to healthcare and AI, both work only when providers can be trusted. The value of integrating AI into healthcare, Singhal said, is that providers can draw on AI's large knowledge base to further their medical education — and provide better person-to-person care for their patients.
“Different people have different relationships with how they receive care and how they think about it,” said Singhal. “So part of what we’re doing is not only empowering the doctor, but also empowering the individual, if they’re willing to learn for themselves.”