4th May 2018

AI outperforms humans in diagnosing heart failure and cancers

Deep learning programs at a diagnostic imaging lab at Case Western Reserve University (CWRU) now routinely outperform their human counterparts in diagnosing heart failure, detecting various cancers and predicting their aggressiveness.

 


 

Anant Madabhushi, PhD, Director of the Center for Computational Imaging and Personalized Diagnostics at CWRU, can point to several recent examples of apparent cyber-superiority witnessed in his laboratory. He and his team work on computer vision, image analysis, pattern recognition and machine learning tools for breast, prostate, head and neck cancers, and brain tumours, as well as epilepsy and carotid plaques. But despite the increasingly sophisticated technologies being developed, he dismisses the idea that such machines will one day replace pathologists and radiologists.

“There’s initially always going to be some wincing and anxiety among pathologists and radiologists over this idea – that our computational imaging technology can outperform us or even take our jobs,” says Madabhushi.

He contends that his research is not only creating powerful new diagnostic tools, but also helping to identify those patients with less advanced diseases who may not need more aggressive therapy.

“It’s not so much that we were able to ‘beat’ the pathologist or the radiologist, but rather that the machine was able to add value to what they can offer,” Madabhushi explains. “There is a desperate need for better decision-support tools that allow them to serve patients, especially in places where there are very few pathologists or radiologists.

“By providing them with decision support, we can help them become more efficient. For instance, the tools could help reduce the amount of time spent on cases with no obvious disease or obviously benign conditions and instead help them focus on the more confounding cases.”
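
To make that idea concrete, here is a minimal, purely illustrative Python sketch of how such decision support might triage a worklist. The case names, scores and thresholds are invented for this example and have nothing to do with the CWRU systems; the point is only that a model's disease-probability output can be used to surface ambiguous cases for expert review while deprioritising the obviously benign ones.

```python
# Hypothetical triage helper: given model scores (probability of disease) for a
# worklist of cases, surface the ambiguous ones first so the pathologist's time
# goes to confounding cases rather than obviously benign ones. The scores and
# thresholds below are invented for illustration.
cases = {
    "case_01": 0.02,   # almost certainly benign
    "case_02": 0.55,   # ambiguous, needs expert review
    "case_03": 0.97,   # almost certainly malignant
    "case_04": 0.48,   # ambiguous
}

def triage(scores: dict[str, float], low: float = 0.1, high: float = 0.9) -> list[str]:
    """Order cases so the least certain predictions come first."""
    ambiguous = [c for c, s in scores.items() if low < s < high]
    confident = [c for c, s in scores.items() if s <= low or s >= high]
    # Most uncertain (closest to 0.5) first, then the confidently scored cases.
    ambiguous.sort(key=lambda c: abs(scores[c] - 0.5))
    return ambiguous + confident

print(triage(cases))   # ['case_04', 'case_02', 'case_01', 'case_03']
```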

These new tools have been producing exceptionally good results in Madabhushi’s laboratory:

• A computational imaging system based on deep learning correctly predicted – with 97% accuracy – which patients were already showing evidence of impending heart failure, compared with human pathologists, who were correct only 74% of the time.

• Human radiologists can flag up to half of all nodules that show up on a CAT scan as "suspicious" or "indeterminate", yet about 98% of these eventually turn out to be benign. In a recent study, published in the Journal of Medical Imaging, Madabhushi and his group showed that their computational imaging technique was up to 8% more accurate than human experts at distinguishing benign from malignant lung nodules on CAT scans.

• In a study of prostate cancer scans in the U.S., Finland and Australia, the algorithms outperformed their human counterparts in two ways:
– In over 70% of cases where radiologists missed the presence of cancer on an MRI scan, the machine algorithm caught it.
– In 50% of cases where radiologists mistakenly identified the presence of cancer on an MRI scan, the machine was able to correctly identify that no clinically significant disease was present.
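
As a toy illustration of how those two rates can be computed, the sketch below pairs each case's ground truth with the radiologist's call and the algorithm's call. All of the labels are invented and the cohort is tiny; this is not the study's data or code.

```python
# Toy worked example (all labels invented) of the two rates reported above:
# how often the algorithm catches cancers the radiologist missed, and how often
# it correctly clears cases the radiologist wrongly flagged.
# 1 = cancer present / flagged, 0 = no clinically significant disease.
ground_truth = [1, 1, 1, 0, 0, 0, 1, 0]
radiologist  = [0, 1, 0, 1, 0, 1, 1, 0]   # misses two cancers, raises two false alarms
algorithm    = [1, 1, 0, 0, 0, 1, 1, 0]   # catches one miss, clears one false alarm

missed = [i for i, (t, r) in enumerate(zip(ground_truth, radiologist)) if t == 1 and r == 0]
false_alarms = [i for i, (t, r) in enumerate(zip(ground_truth, radiologist)) if t == 0 and r == 1]

caught_by_algorithm = sum(algorithm[i] == 1 for i in missed) / len(missed)
cleared_by_algorithm = sum(algorithm[i] == 0 for i in false_alarms) / len(false_alarms)

print(f"radiologist misses caught by algorithm: {caught_by_algorithm:.0%}")        # 50%
print(f"radiologist false alarms correctly cleared: {cleared_by_algorithm:.0%}")   # 50%
```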

 


 

"This is all very exciting data for us – but now we need more validation and to demonstrate these results on larger cohorts," says Madabhushi. "But we really believe this is more evidence of what computational imaging of pathology and radiology images can do for cardiovascular and cancer research and practical use among pathologists and radiologists."

So, what exactly is the AI doing that humans can't? The short answer is the same one that applies to virtually all computing advances of the last 50 years: the machines work at far greater speed and volume.

The key difference here is that the diagnostic imaging computers at CWRU can read, log, compare and contrast literally hundreds of slides of tissue samples in the time a pathologist might spend on a single slide. They then rapidly and comprehensively catalogue characteristics like the texture, shape and structure of glands, nuclei and surrounding tissue to determine the aggressiveness and risk associated with diseases. This is where 'deep learning' comes in: from all of that, they create algorithms able to look beyond what human eyes can see when comparing and contrasting those multitudes of images. Finally, they work towards predicting everything – from how aggressive a disease will be, to whether a scanned nodule is even likely to turn cancerous.
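
For readers curious what such a patch-level classifier looks like in code, below is a heavily simplified sketch using PyTorch. The tiny network, the input shapes and the synthetic "tissue patches" are all assumptions chosen for illustration; the systems described above are far more sophisticated, and this is not their code.

```python
# A minimal, illustrative sketch (not the CWRU system): a small convolutional
# network that classifies tissue-image patches as benign vs. malignant.
# The architecture, shapes and synthetic data are assumptions for illustration.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    """Tiny CNN that learns texture/shape cues from image patches."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Synthetic stand-in for scanned tissue patches: 64 RGB patches of 128x128 pixels.
patches = torch.rand(64, 3, 128, 128)
labels = torch.randint(0, 2, (64,))          # 0 = benign, 1 = malignant (made up)

model = PatchClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step; a real system would iterate over many slides.
optimizer.zero_grad()
loss = loss_fn(model(patches), labels)
loss.backward()
optimizer.step()

# At inference time, the softmax output becomes a per-patch malignancy score.
with torch.no_grad():
    scores = torch.softmax(model(patches), dim=1)[:, 1]
print(f"mean malignancy score across patches: {scores.mean().item():.3f}")
```

In a real workflow, the per-patch scores from the hundreds of patches tiled out of a single slide would be aggregated into a slide-level prediction, which is what gives the computer its speed and volume advantage.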

In the end, all of this new information can help pathologists and radiologists with their interpretations of slides and scans, but more critically it can help clinicians make more informed treatment recommendations. It can help a single pathologist work more efficiently by triaging patients more accurately according to their true need for care – or provide hope to an entire nation.

"I always use the example of Botswana, where they have a population of 2 million people – and only one pathologist we are aware of," says Madabhushi. "From that one example alone, you can see that this technology can help that one pathologist be more efficient and help many more people."

 
