Where healthcare challenges find solutions

Care Delivery

Artificial intelligence takes on medical imaging

Radiologists look at a new image every three to four seconds during an eight-hour workday.

That's hardly enough time to find the patterns, abnormalities and other markers essential to making a diagnosis.

Hospitals are hoping to lessen that load by outsourcing some of that work, not to people across the ocean, but to machines.


These computers, running artificial intelligence and machine-learning algorithms, are trained to find patterns in images, identify specific anatomical markers and spot details the human eye can't catch. Early versions of these algorithms, currently in trials, are both accurate and fast.

Though hospitals are welcoming their robotic overlords, radiologists need not worry about their jobs, at least not yet. After all, people are still necessary to read the information the machines produce and make sense of the data.

What's more, it's still the early days for artificial intelligence in imaging, and though the technology is promising—potentially lowering costs, improving quality and making providers more efficient and effective—there are significant hurdles to overcome.

The Takeaway:

Big data analytics in imaging could lower costs and improve efficiency, but first it must get past some roadblocks.

“We'll see our jobs changing slowly,” said Dr. Keith Dryer, vice chairman of radiology at Massachusetts General Hospital, Boston. “If you look 10 or 25 years from now at what a radiologist is doing, it'll probably be dramatically different.”

Indeed, just as the advent of digital imaging and communications in medicine—DICOM—drove transformation in the field decades ago, so could algorithms driven by big data, once the kinks are worked out.

As radiologists do, artificial intelligence learns as it goes. In fact, learning is how it gets started in the first place. To “train” an algorithm to recognize, for instance, a stroke, developers feed the algorithm tons of imaging studies of a brain suffering from an attack, teaching the machine the nuances that make pattern recognition possible. Then, as the algorithm goes into action in the real world, acting on what it's already been trained to do, it can gain new information from new images, learning even more in a perpetual feedback loop.
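The train-then-infer loop described above can be sketched in a few lines of Python. Everything here is a deliberately toy stand-in: the synthetic "images," the threshold model and the retraining step are illustrative assumptions, not any vendor's actual pipeline.

```python
import random

random.seed(0)

# Toy stand-in for labeled imaging studies: each "image" is a list of
# pixel intensities; label 1 means "stroke," 0 means "normal."
# (Purely synthetic data -- a placeholder for real DICOM studies.)
def make_study(has_stroke):
    base = 0.7 if has_stroke else 0.3
    return [base + random.uniform(-0.2, 0.2) for _ in range(16)], int(has_stroke)

training_set = [make_study(i % 2 == 0) for i in range(200)]

# "Training": learn a decision threshold from mean image intensity --
# the simplest possible pattern recognizer, standing in for a deep net.
def train(studies):
    pos = [sum(img) / len(img) for img, y in studies if y == 1]
    neg = [sum(img) / len(img) for img, y in studies if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2  # midpoint threshold

def predict(threshold, image):
    return int(sum(image) / len(image) > threshold)

threshold = train(training_set)

# "Perpetual feedback loop": as new confirmed cases arrive, fold them
# into the training set and retrain, so the model keeps learning.
new_cases = [make_study(i % 2 == 1) for i in range(50)]
threshold = train(training_set + new_cases)

accuracy = sum(predict(threshold, img) == y for img, y in new_cases) / len(new_cases)
```

Real imaging models replace the threshold with millions of learned parameters, but the shape of the loop is the same: train on labeled studies, act on new ones, and fold confirmed cases back into training.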

Images of the heart's ventricle surfaces, shown at right, are traditionally generated manually. Arterys can generate them automatically with a deep learning algorithm that identifies the contours of the ventricle surfaces on each slice of the study, shown at left.

For instance, Arterys' cardiac MRI software automates the most tedious steps in cardiac analysis, drawing on what it has learned from thousands of examples and applying deep learning algorithms. This kind of automation “frees up a lot of physician time and brings a huge amount of consistency to imaging and tracking changes over time in a patient,” said Carla Leibowitz, Arterys' head of strategy and marketing. The browser-based software is in use at 40 sites around the world, including the University of California at San Diego and Fairfax (Va.) Radiological Consultants.

Like Arterys, Zebra Medical Vision relies on a vast supply of medical case data to train its algorithms so radiologists can find what they're looking for—and what they don't yet know they're looking for—more accurately, more quickly and more consistently. “That's a win for everyone,” said Elad Benjamin, co-founder and CEO of Zebra Medical. “Radiologists are able to deliver better care at lower costs, and patients get the benefit of improved diagnoses.”

Zebra Medical's algorithms draw on one of the largest databases of anonymized medical imaging data—millions of patient records and their associated radiology reports. Each of Zebra Medical's algorithms is dedicated to a particular finding, such as emphysema in the lungs. The company has partnered with Intermountain Healthcare, which will use these algorithms for population health. The Salt Lake City-based health system has conducted a preliminary validation of the algorithms and is currently running more assessments. Once the technology is further developed, Intermountain hopes to use it to prevent excess hospitalizations by giving special treatment to those patients most at risk of a health problem.

Using AI for clinical decision-making depends in part on how the information is presented. “AI provides information in discrete answers to questions,” said Dr. Keith White, medical director of imaging services at Intermountain. “It's interesting that AI and this kind of output corresponds to a change that radiologist leaders are already trying to work toward—which is to transform radiology away from being a narrative, prose-based dictation system into being discrete data and answers.”

Getting the technology into hospitals isn't just a matter of having mature, capable technology. There are regulatory roadblocks, clinicians must be trained how to use AI, and it has to be integrated into the workflow.

Getting AI through the Food and Drug Administration's regulatory process is the first order of business, and some developers say the agency hasn't caught up with how AI works. The FDA's draft guidance on software changes, released in 2016, calls for re-approval of some medical devices—including those running algorithms—every time they change significantly. That's particularly burdensome for AI, since changing quickly is at the very heart of what learning algorithms are supposed to do.

FDA regulation “is quite burdensome today,” Leibowitz said, “and sometimes it's confusing and adds huge chunks of time to our development timeline.” But she and others recognize that regulation is necessary if anyone's going to trust these algorithms in the first place.

Getting an algorithm certified is just the first step. It then has to be integrated into existing systems. Because AI usually produces discrete data elements, as Intermountain's White said, it's theoretically possible to bring those data elements smoothly into workflows. Ideally, AI will run on a case automatically, producing discrete assessments that the radiologist can validate and add to, which are then pulled into the electronic health record, where downstream providers can act accordingly.
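That ideal flow—algorithm emits discrete findings, radiologist validates, validated data lands in the EHR—can be sketched as follows. The field names, codes and schema here are hypothetical illustrations, not any vendor's or EHR's actual interface.

```python
# Hypothetical discrete output from an imaging algorithm. The finding
# names and structure are illustrative assumptions only.
ai_findings = [
    {"finding": "left_ventricle_ejection_fraction", "value": 54, "unit": "%"},
    {"finding": "emphysema_detected", "value": False, "unit": None},
]

def radiologist_review(findings, corrections=None):
    """The radiologist validates each discrete finding, optionally overriding it."""
    corrections = corrections or {}
    reviewed = []
    for f in findings:
        item = dict(f)
        if f["finding"] in corrections:
            item["value"] = corrections[f["finding"]]
        item["validated"] = True
        reviewed.append(item)
    return reviewed

def to_ehr_entries(reviewed, patient_id):
    """Map validated findings into flat records an EHR interface could ingest,
    so downstream providers see structured data instead of prose."""
    return [
        {
            "patient_id": patient_id,
            "code": f["finding"],
            "value": f["value"],
            "unit": f["unit"],
            "source": "imaging_ai",
            "validated_by_radiologist": f["validated"],
        }
        for f in reviewed
    ]

reviewed = radiologist_review(
    ai_findings, corrections={"left_ventricle_ejection_fraction": 52}
)
entries = to_ehr_entries(reviewed, patient_id="12345")
```

In practice each step—review, override, EHR hand-off—is where the interoperability work discussed below actually happens; the point of discrete elements is that this mapping is mechanical rather than a matter of parsing prose.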

“If we put more structured information into the EHR, it can follow the patient more consistently, as opposed to how we do it today, where we create a report,” Dryer said.

But theory is often neater than practice, and some worry about how the output from AI will actually fit into the workflow, not to mention the EHRs themselves. “We need to ensure that there's interoperability,” said Dr. Bibb Allen, chief medical officer of the American College of Radiology's Data Science Institute. Much as the industry created the DICOM standard to ensure that medical images were interoperable, it'll need to create standardized use cases with common data elements for AI.

One potential problem is how the algorithms are initially trained. Sometimes, the data they're fed in the learning process come from just one specific model of imaging machine. Because different models have different radiation doses and slightly different technologies, “you've got an inherent bias that's built in,” said Steve Tolle, vice president and chief strategist of IBM's Watson Health Imaging.

To help avoid that bias, IBM is using a collaborative approach, working with 20 health systems to use images from many different sources to develop its Watson cognitive platforms, which one day, Tolle said, will be able to perform image analytics.
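One simple way to counter single-scanner bias of the kind Tolle describes is to sample training batches evenly across scanner models, so no one machine's dose profile or image characteristics dominates. The sketch below is a generic illustration of that idea, with made-up pool sizes; it is not IBM's method.

```python
import random

random.seed(1)

# Hypothetical pools of training studies keyed by scanner model.
# A naive merge would be dominated by scanner_a's 500 studies.
studies_by_scanner = {
    "scanner_a": [f"a_{i}" for i in range(500)],
    "scanner_b": [f"b_{i}" for i in range(120)],
    "scanner_c": [f"c_{i}" for i in range(80)],
}

def balanced_sample(pools, per_source):
    """Draw the same number of studies from every scanner model, capped
    at the smallest pool, so each source contributes equally."""
    per_source = min(per_source, min(len(p) for p in pools.values()))
    batch = []
    for pool in pools.values():
        batch.extend(random.sample(pool, per_source))
    return batch

training_batch = balanced_sample(studies_by_scanner, per_source=100)
# Each scanner contributes exactly 80 studies (the smallest pool's size),
# instead of scanner_a supplying over 70% of the data.
```

Balanced sampling trades away some of the larger pools' volume for representativeness; collaborations like the one described above attack the same problem from the other side, by growing the smaller pools.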

Even when the technology is strong, doctors may still be reluctant to use it. “A real challenge is physician acceptance,” Tolle said. “We believe you must have transparency so a doctor knows how a machine is driving toward a conclusion or recommendation. Doctors need to understand the science.”

Once they do, they'll be more likely to accept the technology as a tool they can use with confidence, and not fear it as something that may replace them.

“We don't think this is going to replace the physician at all,” said Arterys' Leibowitz. “The physician does a lot more than look for patterns and connect the dots.”

Allen sees the technology as a way to allow radiologists to do more.

“It's an opportunity for radiologists to expand their role,” he said. “There could be a way to push patients with emergent disease to the forefront of our work list. Radiologists have an opportunity to become managers of information that might go beyond just what we see in the images.”

To facilitate that, the American College of Radiology established the Data Science Institute to work on the snags that could halt AI in its tracks: verification of algorithms, integration into workflows, FDA regulations and use cases.

It'll be a while before all those areas are figured out, Allen admitted. But the potential—for population health, for precision medicine, for quality in general—points to the even broader potential to use AI not just for imaging but across the industry, making clinicians more effective and efficient, thereby lowering costs and improving quality for patients.
