Machine Learning and Radiologists: Friends or Foes?
Machine learning and artificial intelligence are looming disruptors in the field of radiology. What are leading health systems doing to tackle this issue?
Machine learning is poised to disrupt healthcare delivery over the coming decade. Among healthcare subspecialties, diagnostic radiology has been widely identified as one of the first likely to experience significant disruption [1].
For the purposes of this exercise, I’ll use Cleveland Clinic’s Imaging Institute as an example of an organization that will have to tackle this issue, though over the coming decade every healthcare delivery organization will face it.
In the US alone, the volume of radiological imaging data generated has grown from gigabytes to petabytes per day [2]. To interpret this data, Cleveland Clinic, ranked the #2 hospital in the US, employs 234 radiologists practicing across 13 subspecialties [3].
The use of machine learning in radiology carries several implications that make it important to the future of healthcare. First, physician shortages present a major obstacle to accessing care in the United States [4]. Experts on machine learning in healthcare, such as Sun Microsystems co-founder Vinod Khosla, estimate that up to 80% of a physician’s work could be completed by a computer in the foreseeable future [5]. The resulting increase in capacity would allow for timelier access to care and a reduction in the personnel costs that have grown for decades as demand for healthcare professionals has outpaced supply [6]. Finally, this technology has the potential to reduce errors. Current estimates of error rates in radiology range from 2% to 5%, and machine-learning-enabled computers have the potential to nearly eliminate these errors when incorporated into the diagnostic process, especially when paired with a radiologist [7].
Cleveland Clinic has taken numerous actions to stay at the forefront of this looming disruption. One of the most prominent is its long-standing partnership with IBM. In 2012, Cleveland Clinic and IBM announced their intent to partner on IBM’s Watson in order to advance machine learning in healthcare [8]. As part of this partnership, Watson “enrolled” in the 5-year curriculum at Cleveland Clinic’s Lerner College of Medicine to learn how physicians are educated [9]. The partnership deepened when Cleveland Clinic sold its data aggregation platform, Explorys, to IBM for an undisclosed sum in 2015 [10]. Because healthcare data tends to reside in several different systems that are not well interfaced with one another, the ability to aggregate data in a meaningful way is a critical hurdle to clear before machine learning in healthcare can advance. This collaborative relationship, leveraging IBM’s strengths in computing and machine learning alongside Cleveland Clinic’s strengths in healthcare delivery, research, and education, culminated in IBM building its newest Watson Health facility on Cleveland Clinic’s campus [11].
Other strategies Cleveland Clinic has undertaken to position itself in a changing healthcare environment are admittedly “lower-tech.” Since 2007, Cleveland Clinic has been relentlessly focused on the patient experience [12]. This focus has made Cleveland Clinic a standout in one of the most important aspects of medicine, and one that will be among the hardest for machine learning to disrupt. With an emphasis on physician-patient communication and an empathetic approach to care, Cleveland Clinic is doubling down on strengths that won’t soon be displaced by technological advances [13].
Through a meaningful partnership with IBM, as well as a focus on the fundamentals of the healthcare delivery value chain, Cleveland Clinic has positioned itself to create, rather than be subject to, these coming changes in healthcare.
Another area Cleveland Clinic should further consider is solutions that are immediately compatible with the US regulatory landscape. The FDA has not yet approved computers alone to review radiological images, though such approval could come in the near future [14]. In the interim, there may still be ways to advance the objectives of improved access, reduced cost, and faster time to treatment. For example, some suggest that computers could be used today to read and then triage radiological cases so that radiologists render an opinion on the most critical cases first, rather than managing their work queue on a first-in, first-out (“FIFO”) basis [15]. Images would still be interpreted entirely by radiologists, but the prioritization would be handled by a computer, as sketched below. This approach would let radiologists become comfortable with the technology, generate data on the concordance or discordance between computer and radiologist that could be used to refine the model, and deliver immediate clinical benefits and operational efficiencies.
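To make the triage idea concrete, below is a minimal sketch (in Python) of how such a prioritized worklist might behave compared with FIFO ordering. The `urgency` score is a hypothetical stand-in for the output of a critical-findings model; this illustrates only the queueing mechanics, not any system Cleveland Clinic or IBM has actually deployed.

```python
import heapq
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical illustration: "urgency" stands in for the output of an ML model
# trained to flag likely-critical findings. The radiologist still interprets
# every image; only the reading order changes.

@dataclass(order=True)
class Study:
    sort_key: tuple = field(init=False, repr=False)
    accession: str = field(compare=False)
    arrived: datetime = field(compare=False)
    urgency: float = field(compare=False)  # 0.0 = routine ... 1.0 = critical

    def __post_init__(self):
        # Higher urgency is read first; ties fall back to arrival time (FIFO).
        self.sort_key = (-self.urgency, self.arrived)

class TriageWorklist:
    """A work queue ordered by model-estimated urgency rather than arrival order."""

    def __init__(self):
        self._heap = []

    def __len__(self):
        return len(self._heap)

    def add(self, study: Study) -> None:
        heapq.heappush(self._heap, study)

    def next_for_radiologist(self) -> Study:
        return heapq.heappop(self._heap)

if __name__ == "__main__":
    worklist = TriageWorklist()
    worklist.add(Study("ACC-001", datetime(2018, 11, 1, 8, 0), urgency=0.12))
    worklist.add(Study("ACC-002", datetime(2018, 11, 1, 8, 5), urgency=0.91))
    worklist.add(Study("ACC-003", datetime(2018, 11, 1, 8, 10), urgency=0.40))

    while worklist:
        study = worklist.next_for_radiologist()
        print(study.accession, study.urgency)
    # FIFO would read ACC-001 first; triage reads ACC-002 (the likely-critical
    # case) first, in the order ACC-002, ACC-003, ACC-001.
```

Every study is still read by a radiologist; the model only changes the order in which studies are read, and the resulting pairs of model scores and final radiologist reads are exactly the concordance/discordance data that could feed back into refining the model.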
As machine learning becomes more involved in healthcare, I would pose the following questions to my HBS classmates: As a consumer, would you be comfortable with a computer alone providing complex medical opinions? If not, how would you want your physician to interact with you and your health information as you consume healthcare?
(Word Count: 795)
References
- Elad Walach, “Which Area Of Medicine Is Most Ripe For AI Disruption?,” Forbes, April 12, 2018, https://www.forbes.com/sites/forbestechcouncil/2018/04/12/which-area-of-medicine-is-most-ripe-for-ai-disruption/#32b93b3d2943, accessed November 2018.
- Rasu B. Shrestha, “Imaging on the Cloud,” Applied Radiology, April 28, 2011, https://www.appliedradiology.com/articles/imaging-on-the-cloud, accessed November 2018.
- Cleveland Clinic Foundation, “Our Doctors,” https://my.clevelandclinic.org/departments/imaging/staff, accessed November 2018.
- Richard A. Cooper, Thomas E. Getzen, Heather J. McKee, and Prakash Laud, “Economic And Demographic Trends Signal An Impending Physician Shortage,” Health Affairs, Vol. 21, No. 1, January 2002, Google Scholar, accessed November 2018.
- Vinod Khosla, “20 Percent Doctor Included & Dr. Algorithm: Speculations and Musings of a Technology Optimist,” September 30, 2016, https://www.khoslaventures.com/20-percent-doctor-included-speculations-and-musings-of-a-technology-optimist, accessed November 2018.
- Richard Dargan, “Radiology Salaries Increase, but so Do Workload and Burnout,” Radiological Society of North America, October 1, 2017, https://www.rsna.org/news/2017/october/radiology-salary-survey, accessed November 2018.
- Mark Graber, Robert Wachter, and Christine K. Cassel, “Bringing Diagnosis Into the Quality and Safety Equations,” Journal of the American Medical Association, Vol. 308, No. 12, December 2012, Google Scholar, accessed November 2018.
- IBM, “Cleveland Clinic and IBM Work to Advance Watson’s Use in the Medical Training Field,” press release, October 20, 2012, IBM website, https://www-03.ibm.com/press/us/en/pressrelease/39243.wss, accessed November 2018.
- Steve Lohr, “IBM’s Watson Goes to Medical School,” New York Times, October 30, 2012, https://bits.blogs.nytimes.com/2012/10/30/i-b-m-s-watson-goes-to-medical-school/?mtrref=www.google.com&gwh=CE8721D947A883BA30E7F689F93F9941&gwt=pay, accessed November 2018.
- Plain Dealer Business Staff, “IBM acquires Cleveland-based Explorys,” Cleveland Plain Dealer, April 16, 2015, https://www.cleveland.com/business/index.ssf/2015/04/ibm_buys_cleveland-based_explo.html, accessed November 2018.
- IBM, “Cleveland Clinic, IBM Continue Their Collaboration to Establish Model for Cognitive Population Health Management and Data-Driven Personalized Healthcare,” press release, December 22, 2016, IBM website, https://www-03.ibm.com/press/us/en/pressrelease/51290.wss, accessed November 2018.
- James Merlino and Ananth Raman, “Health Care’s Service Fanatics,” Harvard Business Review, May 2013, https://hbr.org/2013/05/health-cares-service-fanatics, accessed November 2018.
- Anne Trafton, “Doctors rely on more than just data for medical decision making,” MIT News, July 20, 2018, http://news.mit.edu/2018/doctors-rely-gut-feelings-decision-making-0720, accessed November 2018.
- Mike Miliard, “As FDA signals wider AI approval, hospitals have a role to play,” Healthcare IT News, May 31, 2018, https://www.healthcareitnews.com/news/fda-signals-wider-ai-approval-hospitals-have-role-play, accessed November 2018.
- Jennifer Huber, “Enlisting artificial intelligence to assist radiologists,” Stanford Bio-X, June 22, 2016, https://biox.stanford.edu/highlight/enlisting-artificial-intelligence-assist-radiologists, accessed November 2018.
Feature Image Source: http://blog.cleveland.com/metro/2011/10/watson_computer_downs_clinic_c.html (accessed November 2018)
You bring up some interesting issues around human/machine collaboration and handoff that I think are crucial to address going forward in healthcare. My personal belief is that machines and machine learning, at least in today’s context, are extremely narrow in their training – that is, they are unable to understand the broader context around the radiology feature-classification problem.
As an example, consider the following scenario:
Radiologists of the future not only use ML models to screen for routine conditions, but also to change which features to look for as they manage patients in the hospital, based on what the patient is experiencing at that moment. What the system initially presents as lesions signalling organ failure may actually be tied to some unknown upstream cause (say, a metastatic tumor originating in the brain) that the physician only discovers later, after consulting with fellow physicians. In this way, similar to the priority sequencing mechanism you describe in your article, I also see ML tools in radiology serving as quick “second opinion” checkers that enable physicians to act quickly and deliver interventions more efficiently.
Really nice article and very well written. There is a lot of hype about computers and machine learning in healthcare, particularly in radiology, as you have noted. People do talk in extremes, saying that radiologists will be rendered useless by machine learning in the future. However, radiologists do much more than interpret images. They determine the proper scan parameters given the clinical question at hand, and they administer appropriate contrast agents (which can have serious side effects) taking into account the patient’s clinical history. Furthermore, human anatomy is quite variable from person to person, study to study, and moment to moment (e.g., intestinal movement), so I imagine a machine learning algorithm would require exponentially more images and interpretations to learn from given this dynamic nature. I loved the FIFO comment and agree that prioritization is key in this field. At this point, our only prioritization is really clinical presentation, but it seems like there is room for improvement with AI.
Great article! I would love to learn more about the impact on physician workflow and clinical practice since the IBM Watson Health collaboration began. As a consumer, I’m comfortable with my physician using technology/ML when arriving at a medical opinion. I agree with the comments above that machine learning is more likely to augment the role of physicians than to replace them. For example, I imagine most radiologists see 99% “no tumor” images and 1% “possible/confirmed tumor” images; I’d think machine learning algorithms could readily identify the 99% “no tumor” images and flag the 1% “possible/confirmed tumor” images for the radiologist as a way to prioritize their time.