DrugGPT: new AI tool could help doctors prescribe medicine in England

Drugs are a cornerstone of medicine, but sometimes doctors make mistakes when prescribing them and patients don’t take them properly.

A new AI tool developed at Oxford University aims to tackle both those problems. DrugGPT offers a safety net for clinicians when they prescribe medicines and gives them information that may help their patients better understand why and how to take their medication.

Doctors and other healthcare professionals who prescribe medicines will be able to get an instant second opinion by entering a patient’s conditions into the chatbot. Prototype versions respond with a list of recommended drugs and flag up possible adverse effects and drug-drug interactions.

“One of the great things is that it then explains why,” said Prof David Clifton, whose team at Oxford’s AI for Healthcare lab led the project.

“It will show you the guidance – the research, flowcharts and references – and why it recommends this particular drug.”

Some doctors already use mainstream generative AI chatbots such as ChatGPT and Google’s Gemini (formerly Bard) to check their diagnoses and write up medical notes or letters. International medical associations have previously advised clinicians not to use those tools, partly because of the risk that the chatbot will give false information, or what technologists refer to as “hallucinations”.

Dr Lucy Mackillop, a consultant obstetric physician at Oxford University Hospitals NHS Foundation Trust who has advised Clifton’s team, said the potential advantage of DrugGPT was that it would give busy doctors more information about the drugs they were prescribing.

“If you discuss it with the patient, they are more likely to understand and be compliant with medication, and the medication is therefore more likely overall to work and do the job it’s meant to do,” she said.

Dr Michael Mulholland, vice-chair of the Royal College of GPs, said that in the vast majority of cases, prescriptions were made correctly.

But “doctors are only human and errors can happen, particularly when doctors are working under intense workload and workforce pressures, as GPs and our teams currently are. This is particularly the case with patients who take lots of medications at once, as there will be many different ways the medications may interact with each other.

“We are always open to introducing more sophisticated safety measures that will support us to minimise human error – we just need to ensure that any new tools and systems are robust and that their use is piloted before wider rollout to avoid any unforeseen and unintended consequences.

“Ultimately, the most effective long-lasting solution to delivering safe patient care is to ensure that general practice is adequately funded and staffed with enough GPs and other healthcare professionals working at safe levels.”
