Google Meets Breast Cancer Challenge

Posted on: October 19th, 2018 by admin

Four years ago Google bought a British artificial intelligence (AI) company called DeepMind for half a billion US dollars. The founders studied computational neuroscience together at University College London, and their technology combines deep learning in convolutional neural networks with a form of reinforcement learning. The company gained considerable publicity in 2016 when its AlphaGo program beat the world champion Go player in a five-game series that was featured in a documentary film.

In November 2017 DeepMind initiated a research project with Cancer Research UK at Imperial College in London to apply their technology to improve the reading and assessment of mammograms. They will be using the OPTIMAM database of 30,000 women whose mammograms are archived at the Royal Surrey County Hospital in Guildford, as well as another 30,000 mammograms provided by the Jikei University Hospital in Japan. According to Imperial College, “the team believes it has the potential to increase the accuracy of breast screening interpretation, and improve the ability to detect breast cancers on mammograms.”

In another Google initiative, their AI team based at corporate headquarters in Mountain View, California, has applied deep learning to metastatic breast cancer detection. In collaboration with clinicians at the Naval Medical Center in San Diego, they have just published a paper in Archives of Pathology & Laboratory Medicine entitled “Artificial intelligence-based breast cancer nodal metastasis detection: Insights in the black box for pathologists.” A laborious aspect of a pathologist’s job is to conduct microscopic examinations of biopsy specimen slides and determine whether the malignant cells have spread from the primary site in the breast to the nearby lymph nodes.

The researchers developed a deep learning algorithm called LYNA – short for LYmph Node Assistant – which has reached a level of sophistication enabling it to tell the difference between cancerous and healthy tissue with an impressive 99% accuracy. Seen in the slide on the left (© Google) are lymph nodes with multiple artifacts, including an air bubble, cutting streaks, necrotic tissue and blood. LYNA was able to identify the tumour region (highlighted in red in the slide on the right), and it correctly classified the surrounding region, despite being laden with artifacts, as non-tumour (shown in blue).
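The paper does not spell out LYNA’s internals here, but systems of this kind typically classify small image patches and assemble the per-patch predictions into a tumour-probability heatmap over the whole slide, much like the red/blue overlay described above. A minimal sketch of that idea, where `classify` is a hypothetical stand-in for a trained convolutional network:

```python
import numpy as np

def tumour_heatmap(slide, patch_size, classify):
    """Slide a non-overlapping window over a whole-slide image and
    build a patch-level tumour-probability heatmap (illustrative only)."""
    h, w = slide.shape[:2]
    rows, cols = h // patch_size, w // patch_size
    heatmap = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = slide[i * patch_size:(i + 1) * patch_size,
                          j * patch_size:(j + 1) * patch_size]
            # classify() returns the probability that this patch is tumour
            heatmap[i, j] = classify(patch)
    return heatmap

# Toy example: a 4x4 "slide" whose top-left quadrant is "tumour",
# with mean intensity standing in for a real classifier's output.
toy_slide = np.zeros((4, 4))
toy_slide[:2, :2] = 1.0
hm = tumour_heatmap(toy_slide, 2, lambda p: float(p.mean()))
```

In a real system the patches overlap, the classifier is a deep network trained on labelled pathology slides, and the heatmap is thresholded to produce the highlighted tumour regions a pathologist reviews.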

The partners in the mammography project in the UK stated: “We wanted researchers at both DeepMind and Google involved in this research, to take advantage of the AI expertise in both teams, as well as Google’s supercomputing infrastructure. We hope this will achieve more impactful results for patients, which is everyone’s priority.”

3 Responses

  1. Ed Blignaut says:

    All seems to be well as long as AI websites are open for use/inspection by scientific bodies with a direct interest in mammography e.g. CapeRay
    Ed

  2. The partners in the mammography project in the UK stated: “We wanted researchers at both DeepMind and Google involved in this research, to take advantage of the AI expertise in both teams, as well as Google’s supercomputing infrastructure. We hope this will achieve more impactful results for patients, which is everyone’s priority.”
    Has CapeRay considered getting Google to help in the development of a 3D ABUS combo machine (for which you have the patents), enabling you to identify up to 95% of early breast cancers in women with and without dense breast tissue?

  3. Murray Izzett says:

    Great article again, Kit. What often undermines the outstanding performance of AI in healthcare is the sensitivity and specificity of the human in comparison to the machine. Some use the machine’s 1% inaccuracy to ask, “how would you feel if that was you?” I would hazard a guess that human pathology screening would be in the region of 70-80%.