
Why Doctors Of the Future May Know Code

Or at least the basics, as data and algorithms are unleashed in their service


On any given day, radiologists at UC San Francisco (UCSF) and across the country view thousands of x-rays.

These radiologists, with many years of experience in the medical and research world, have become adept at identifying which images require immediate follow-up or intervention. They can determine, for example, whether the lungs of a trauma patient are collapsed or if there is internal bleeding that requires immediate intervention.

But with all of the talk about making healthcare more efficient, the ability of clinicians to read the images and know what they're seeing isn't what's holding back further progress. The challenge is that radiologists have only two eyes and only so many hours in a day; they can read and evaluate images accurately only so fast. While brilliant, they are, in fact, human.

And while the images wait, so does the patient.


Of the images sent to them for analysis each day, a large number turn out to be normal. The rest reflect a wide variety of conditions that may require immediate treatment, but those images may be anywhere in the queue, waiting quietly for the radiologist to read them. Not only do the radiologists need to find the needles in the haystack of images, but they need to know where to start looking to minimize any delay in providing important treatment to patients.

This "needle in the haystack" problem exists throughout healthcare, whether in identifying high-risk patients in the emergency department, a primary care office, or the ICU. And identifying which findings are important and which patients are at risk is getting harder as patient volumes grow and the number and types of tests and images threaten to overload the system.

This challenge is the focus of a new partnership between UCSF’s Center for Digital Health Innovation (CDHI) and GE Healthcare. Together, the two institutions are developing a deep learning library of algorithms that aims to expedite and aid differential diagnosis and improve clinical workflows, shortening time to treatment and improving patient outcomes.

"There's a lot of hype about Artificial Intelligence and Machine Learning in healthcare," said Dr. Michael Blum, associate vice chancellor for informatics, director of CDHI and professor of medicine at UCSF. "People see smart computers all around them – Apple's Siri, Amazon's Echo, Tesla's™ self-driving car – and they think healthcare should be the same. Obviously, healthcare is far more complex, requires much higher accuracy, and has less margin for error. At this point it is like an autonomous car trying to drive without the maps to support its navigation."

At GE’s annual Minds and Machines gathering of the Industrial Internet community in San Francisco, Dr. Blum touted the combination of the clinical expertise of a major clinical and research institution like UCSF and a global leader in medical software and technology like GE Healthcare. “What’s really needed is a much deeper and more accurate understanding of healthcare globally, of what individuals and providers are facing day in and day out in the office, at home, and in the hospital. Only then can we deploy these advanced technologies to improve patients’ lives and reduce costs,” he said.

Enter the algorithms based on data and analytics, including images from patient scans at UCSF, the very images that the radiologists there are reading every day. In the simplest terms, once unleashed, this 'code in the cloud' will review the images in a radiologist's queue, analyze them against an ever-growing library of similar images for which a diagnosis is known, and push to the front of the queue the images it recognizes as requiring urgent action. The goal is to enable radiologists to review images and recommend treatment faster.
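The reprioritization step described above amounts to a priority queue: studies flagged as likely urgent jump ahead of routine ones, while arrival order is preserved within each tier. The sketch below is purely illustrative and assumes a hypothetical `urgency_score` function standing in for a trained model; the names, threshold, and study IDs are invented, not from the actual GE/UCSF system.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedImage:
    # Lower sort keys are read first; urgent findings get priority 0.
    priority: int
    arrival_order: int
    study_id: str = field(compare=False)

def triage(images, urgency_score, threshold=0.8):
    """Re-order a reading queue: images whose model score meets the
    threshold jump ahead of routine studies, and arrival order is
    preserved within each tier."""
    heap = []
    for order, study_id in enumerate(images):
        urgent = urgency_score(study_id) >= threshold
        heapq.heappush(heap, QueuedImage(0 if urgent else 1, order, study_id))
    return [heapq.heappop(heap).study_id for _ in range(len(heap))]

# Stand-in scores; a real system would call a trained model here.
scores = {"cxr-001": 0.12, "cxr-002": 0.95, "cxr-003": 0.40, "cxr-004": 0.88}
reading_order = triage(list(scores), scores.get)
print(reading_order)  # urgent studies first: ['cxr-002', 'cxr-004', 'cxr-001', 'cxr-003']
```

The two-level sort key (urgency tier, then arrival order) keeps the reprioritization stable, so routine studies are still read in the order they arrived.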

As algorithms are trained and the library of available algorithms expands, the associated applications will have the potential to do everything from predicting patient trajectories, to automating the triage of routine care, to improving process efficiency and enabling the development of more personalized therapies. By rapidly delivering information to clinicians about abnormalities, inefficiencies and personalized interventions, algorithms are designed to help providers improve diagnostic accuracy and patient outcomes, as well as improve clinical workflows and productivity.


“What is so powerful about combining analytics, deep learning and cloud technology is that the solutions will only get smarter and more scalable over time,” said Charles Koontz, Chief Digital Officer of GE Healthcare. “While this partnership begins here in Silicon Valley, it’s the global users of the algorithms who will disrupt the way care is delivered.”

The algorithms will further be used to ensure providers around the world can access new knowledge and insights delivered through deep learning – a method by which machines can rapidly generate new levels of clinical and operational value from large imaging and textual data sets in ways that traditional machine learning methods cannot.

"In medical school, physicians learn to use a stethoscope and to read x-rays to help identify what's happening inside a patient's body," said Dr. Blum. "Now we will add technologies including artificial intelligence and machine learning to our arsenal. We are eager to help develop these transformational tools that will help us more accurately and efficiently treat our patients."

This code – the algorithms that will be embodied in applications on the GE Health Cloud, in service of these clinicians – may be what cracks the code, finally turning digital health from hype into something realistically helpful.


Tesla is a trademark of Tesla Motors, Inc.