WEDNESDAY, June 7, 2023 (HealthDay News) -- New York University doctors and hospital executives are using an artificial intelligence (AI) computer program to predict whether a newly discharged patient will soon fall sick enough to be readmitted.
The AI program “NYUTron” reads physicians' notes to estimate a patient’s risk of dying, the potential length of their hospital stay, and other factors important to their care.
Testing showed that NYUTron could predict four out of five patients who would require readmission to the hospital, according to a report published online June 7 in the journal Nature.
NYUTron is what its developers call a “large language model,” which can read and understand the creative and individualized notes frequently taken by doctors.
It’s an improvement over earlier health care computer algorithms that required data to be specially formatted and laid out in neat tables, the researchers said.
“Our findings highlight the potential for using large language models to guide physicians about patient care,” said lead researcher Lavender Jiang, a doctoral student at NYU’s Center for Data Science.
“Programs like NYUTron can alert health care providers in real time about factors that might lead to readmission and other concerns so they can be swiftly addressed or even averted,” Jiang said in a school news release.
Jiang and her colleagues trained NYUTron to scan unaltered text from electronic health records and, from what it learns, to make useful assessments about patient health status.
The study results showed that the program could predict about 80% of those who were readmitted, which was about a 5% improvement over a standard computer program that requires reformatting of medical data.
By automating basic tasks, such technology could provide doctors more time to spend with their patients, Jiang noted.
Large language models work by predicting the best word to fill in a sentence, based on how likely real people are to use a particular term in that context.
The more data fed into a computer to teach it how to recognize such word patterns, the more accurate its guesses become over time, Jiang explained.
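The word-prediction idea Jiang describes can be illustrated with a toy model. This is a minimal sketch of next-word prediction using simple word-pair counts; NYUTron itself is a far larger neural network trained on millions of clinical notes, and the mini-corpus and function names below are illustrative, not from the study:

```python
from collections import Counter, defaultdict

# Toy corpus of clinical-style phrases (illustrative only, not real patient data)
corpus = [
    "patient was discharged home in stable condition",
    "patient was discharged home with instructions",
    "patient was admitted for observation",
]

# Count how often each word follows each preceding word (a bigram model)
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    candidates = follows[word]
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("was"))         # "discharged" (seen twice vs. "admitted" once)
print(predict_next("discharged"))  # "home"
```

As the corpus grows, the counts better reflect real usage and the guesses improve, which is the intuition behind Jiang's point about feeding the model more data.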
The researchers trained NYUTron using millions of clinical notes collected from the electronic health records of 336,000 men and women who received care within the NYU Langone hospital system between January 2011 and May 2020.
This resulted in a 4.1-billion-word language “cloud” that included any record written by a doctor, such as radiology reports, patient progress notes and discharge instructions, the study authors said.
Importantly, the clinical notes did not contain any sort of standardized language, forcing the program to learn to interpret abbreviations and terms unique to a particular writer.
In testing, NYUTron identified 85% of those who died in the hospital (a 7% improvement over standard methods) and correctly estimated the actual length of stay for 79% of patients (a 12% improvement over the standard model), the researchers reported.
The tool also successfully assessed the likelihood that a patient might have additional conditions along with their primary disease, as well as the chances that insurance would deny coverage.
“These results demonstrate that large language models make the development of ‘smart hospitals’ not only a possibility, but a reality,” senior researcher and neurosurgeon Dr. Eric Oermann said. “Since NYUTron reads information taken directly from the electronic health record, its predictive models can be easily built and quickly implemented throughout the health care system.”
Future studies could explore the model’s ability to extract billing codes, predict risk of infection, and identify the right medication to order, among other potential applications, Oermann said.
However, Oermann emphasized that NYUTron is considered a support tool for health care providers, not a replacement for a doctor’s judgment tailored to an individual patient.
Funding for the study was partly provided by the U.S. National Institutes of Health.
The Brookings Institution has more about large language models.
SOURCE: NYU Grossman School of Medicine, news release, June 7, 2023