From gaming to healthcare: Microsoft’s approach to visualizing cancer

Can machine learning help transform radiation therapy for cancer?

The notion that the same underlying technology that currently helps you master the Boot Scootin’ Boogie line dance on Country Dance All Stars for Xbox 360 could soon help your doctor develop your cancer treatment plan may sound absurd, but that is exactly what Microsoft is attempting with Project InnerEye [1].

For over four decades, Microsoft has been at the forefront of consumer technology and has launched a host of products that have dramatically transformed how people interact: MS-DOS and Windows altered how humans interact with machines; Microsoft Office changed how people interact with their colleagues, clients, and customers; and Xbox Kinect transformed how gamers interact with their consoles and, by extension, their friends [2]. The tech giant now has its sights set on disrupting yet another interaction: the delicate relationship between physicians and their patients. Microsoft’s Project InnerEye is leveraging machine learning and computer vision to develop tools that aid radiation oncologists in identifying, targeting, and monitoring cancer in their patients. The question remains, however, whether the team can successfully apply its deep expertise in AI research to the complex, slow-moving, and heavily regulated world of healthcare [3].

Unlike the Kinect, which captures a gamer’s movements by looking at the outside of their body, Project InnerEye does exactly what its name suggests: it looks inside a patient’s body. The innovations that enable both pieces of technology, however, are largely the same: (1) random decision forests, the supervised machine learning technique Microsoft applied to Kinect’s body tracking, and (2) Field-Programmable Gate Array (FPGA) hardware, reconfigurable chips that combine computational speed with programming flexibility, deployed alongside Graphics Processing Units (GPUs) originally developed for gaming applications [4]. With Project InnerEye and the other healthcare-focused programs being pursued under Microsoft’s Healthcare NExT initiative, the company is trying to marry its history of technological prowess with recent advances in machine learning and cloud computing to compete in the deeply challenging but meaningful space of healthcare innovation [5].
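At its core, a random decision forest votes many decision trees together to classify each input, for Kinect, a body pixel; for InnerEye, a voxel of tissue. The toy sketch below illustrates the idea with scikit-learn on a fabricated two-feature dataset; InnerEye’s actual features (3D context over full CT/MRI volumes) are far richer, and the cluster locations here are invented purely for illustration.

```python
# Toy sketch: per-voxel tissue classification with a random decision forest.
# The two "intensity" features and their cluster centers are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Fabricated voxels: healthy tissue clusters near (0, 0), tumor-like near (2, 2).
healthy = rng.normal(loc=0.0, scale=0.6, size=(500, 2))
tumor = rng.normal(loc=2.0, scale=0.6, size=(500, 2))
X = np.vstack([healthy, tumor])
y = np.array([0] * 500 + [1] * 500)  # 0 = healthy, 1 = tumor

# An ensemble of 100 trees; each tree votes, and the forest takes the majority.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Classify two unseen voxels: one clearly healthy, one clearly tumor-like.
print(forest.predict([[0.1, -0.2], [2.1, 1.9]]))
```

The same ensemble-of-trees structure maps naturally onto parallel hardware such as the FPGAs and GPUs mentioned above, since each tree can be evaluated independently.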

Project InnerEye’s approach to machine learning is one its researchers call “Assisted AI,” which focuses on developing tools that supplement rather than substitute the expertise of the human physicians who have traditionally performed the work Project InnerEye is automating [6]. Without the assistance of machine learning, radiation oncologists must spend hours per patient manually analyzing a set of diagnostic images slice by slice to identify and delineate cancerous versus healthy tissue. Only after carefully performing this time-consuming and highly specialized task can the physician begin to develop the radiotherapy treatment plan. Project InnerEye promises to offload this particular task so that the physician can focus on other equally critical but less tedious work. With InnerEye, a physician simply sends anonymized, encrypted scan images directly to the program, which completes the markup and develops a 3D model of the tissue in a matter of minutes [7]. To get to this point, the team at Project InnerEye trained its machine learning algorithm on scans from hundreds of patients spanning a wide range of hospital geographies, imaging modalities (e.g., MRI, CT, and PET), and image resolutions, and then validated its accuracy by comparing the results to those of resident experts [8].
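Comparing an algorithm’s markup against an expert’s is typically done with an overlap metric such as the Dice coefficient (the cited sources do not specify which metrics InnerEye’s team used, so this is a standard stand-in). A minimal sketch on two tiny binary masks:

```python
# Toy sketch: scoring an automated segmentation against an expert's markup
# with the Dice similarity coefficient. The 4x4 masks below are fabricated.
import numpy as np

def dice(auto_mask: np.ndarray, expert_mask: np.ndarray) -> float:
    """Dice similarity: 2*|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    intersection = np.logical_and(auto_mask, expert_mask).sum()
    total = auto_mask.sum() + expert_mask.sum()
    return 2.0 * intersection / total if total else 1.0

expert = np.zeros((4, 4), dtype=bool)
expert[1:3, 1:3] = True   # expert marks a 2x2 tumor region (4 voxels)
auto = expert.copy()
auto[1, 1] = False        # the algorithm misses one of those voxels (3 voxels)

print(round(dice(auto, expert), 3))  # 2*3 / (3 + 4) ≈ 0.857
```

Running such a score over held-out patients from many hospitals and imaging modalities is one way a team could quantify how closely automated markup tracks expert judgment.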

Looking ahead, in addition to offloading tasks currently performed by physicians, the Project InnerEye technology could alter the treatment paradigm for cancer patients. Whereas today imaging and markup are generally performed only once due to the time and expense involved, Project InnerEye could help usher in an era of “adaptive radiotherapy” in which these tasks are performed regularly throughout a treatment series to actively monitor the tumor’s progression and more precisely target the therapy [7]. Interestingly, Project InnerEye will likely face adoption challenges that other AI-enabled technology has not. Whereas AI and automation have already begun to dramatically reshape the workforce for many low-skilled jobs that managers have determined machines can perform more cost-effectively than human labor (e.g., cashiers and call centers), Project InnerEye’s targeted end-users are the very people whose work it intends to perform. This reality requires the team to carefully thread the needle of providing valuable automation without wholly replacing the role of the physician in the process, a balance that demands close, interactive relationships with the physician community to understand their needs and motivations.

While Project InnerEye has shown impressive potential, many questions remain regarding its future adoption. Will physicians ever fully trust a machine to provide patient-critical health information? If the machine learning algorithm makes a mistake that results in a serious adverse event for a patient, who is to blame – Microsoft or the treating physician?

 

(Word Count: 798)

 

[1] Game Mill Entertainment, “Country Dance All Stars,” http://game-mill.com/games/country-dance-all-stars/, accessed November 2018.

[2] Malathi Nayak, “Timeline: Microsoft’s journey: four decades, three CEOs,” Reuters Business News, February 4, 2014, https://www.reuters.com/article/us-microsoft-succession-timeline/timeline-microsofts-journey-four-decades-three-ceos-idUSBREA131R720140204, accessed November 2018.

[3] Allison Linn, “Microsoft looks to healthcare partners for ways to bring AI benefits to cancer patients,” The AI Blog, Microsoft, November 28, 2017, https://blogs.microsoft.com/ai/project-innereye-cancer-treatment/, accessed November 2018.

[4] Gerald Lynch, “From Kinect to InnerEye – How Microsoft is supercharging gaming tech with AI smarts to help diagnose cancer,” Tech Radar, May 9, 2017, https://www.techradar.com/news/from-kinect-to-innereye-how-microsoft-is-supercharging-gaming-tech-with-ai-smarts-to-help-diagnose-cancer, accessed November 2018.

[5] Tom Warren, “Microsoft Healthcare is a new effort to push doctors to the cloud,” The Verge, June 27, 2018, https://www.theverge.com/2018/6/27/17509096/microsoft-healthcare-cloud-systems, accessed November 2018.

[6] Microsoft Research, “Five-minute overview of the InnerEye research project,” YouTube, published September 22, 2016, https://www.youtube.com/watch?v=9IXgVmLxVtQ, accessed November 2018.

[7] Ian Sample, “’It’s going to create a revolution’: how AI is transforming the NHS,” The Guardian, July 4, 2018, https://www.theguardian.com/technology/2018/jul/04/its-going-create-revolution-how-ai-transforming-nhs, accessed November 2018.

[8] Cynthia E. Keen, “AI drives analysis of medical images,” Healthcare-In-Europe.com, February 27, 2018, https://healthcare-in-europe.com/en/news/ai-drives-analysis-of-medical-images.html, accessed November 2018.

[9] Ian Sample, “Joseph Stiglitz on artificial intelligence: ‘We’re going towards a more divided society,’” The Guardian, September 28, 2018, https://www.theguardian.com/technology/2018/sep/08/joseph-stiglitz-on-artificial-intelligence-were-going-towards-a-more-divided-society, accessed November 2018.

