A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a deep learning model that can predict breast cancer from mammogram images up to five years before doctors could make a diagnosis.
Breast cancer is the most common cancer in women and is responsible for around 500,000 deaths each year worldwide. There are now many effective treatments for breast cancer, but a successful outcome is still dependent on an early diagnosis.
Later diagnoses require more aggressive treatments, which come with considerable side effects and often fail. Identifying patients at risk of developing breast cancer has therefore been a key focus for researchers looking to reduce the number of breast cancer-related deaths.
Screening programs using mammography enable early detection and treatment of breast cancer. However, such screening requires scrutiny of each mammogram for signs of abnormality, which is highly labor-intensive due to the large number of women who must be screened.
In addition, since the images are manually reviewed, there remains an element of subjectivity and the risk of human error. In order to speed up the review of mammograms and enable objective assessment of risk, researchers have been working on developing computer models that can rapidly and reliably screen mammogram images for breast cancer risk.
‘Unique patterns of breast tissue’
Using information from more than 90,000 mammograms taken at Massachusetts General Hospital (MGH), a team at MIT’s Computer Science and Artificial Intelligence Laboratory has developed a new deep-learning model that detects subtle patterns of change in breast tissue that the human eye is unable to detect.
Trained on the mammograms and known outcomes of more than 60,000 patients, the model identifies precursors to malignant tumors and can predict from a mammogram whether a patient is likely to develop breast cancer.
Unlike assessments based on key risk factors, such as age, family history of breast cancer, hormonal status, and breast density, the MIT deep-learning model identifies patterns in the tissue itself that are indicative of breast cancer. The predictions are therefore data-driven and can be made up to five years before the cancer develops. The model thus provides an individual risk assessment that could be used to customize screening and prevention programs on a patient-by-patient basis.
Constance Lehman, professor of radiology at Harvard Medical School and division chief of breast imaging at MGH, commented on the research:
“Since the 1960s radiologists have noticed that women have unique and widely variable patterns of breast tissue visible on the mammogram…These patterns can represent the influence of genetics, hormones, pregnancy, lactation, diet, weight loss, and weight gain. We can now leverage this detailed information to be more precise in our risk assessment at the individual level.”
The results were ‘striking’
The model was used to retrospectively identify women at high risk of developing breast cancer from almost 89,000 consecutive screening mammograms taken between 2009 and 2012. It correctly placed 31% of all the patients who had subsequently developed breast cancer in the top risk decile. The corresponding value achieved using the existing Tyrer-Cuzick model was only 18%.
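The top-decile figure quoted above is a simple ranking metric: the fraction of patients who later developed cancer whose predicted risk fell in the highest 10% of all scores. A minimal sketch of how such a metric could be computed is below; the function name and the toy data are illustrative, not from the study.

```python
# Hypothetical sketch: fraction of future cancer cases captured in the
# top risk decile, given per-patient risk scores and known outcomes.

def top_decile_capture(risk_scores, outcomes):
    """Fraction of positive outcomes whose score is in the top 10% of risks.

    risk_scores: list of floats, one predicted risk per patient.
    outcomes: list of 0/1 flags, 1 = patient later developed breast cancer.
    """
    n = len(risk_scores)
    # Rank patient indices from highest to lowest predicted risk.
    order = sorted(range(n), key=lambda i: risk_scores[i], reverse=True)
    top = set(order[: max(1, n // 10)])  # indices in the top risk decile
    positives = [i for i in range(n) if outcomes[i] == 1]
    if not positives:
        return 0.0
    captured = sum(1 for i in positives if i in top)
    return captured / len(positives)

# Toy example with 10 patients: of the two future cancer cases, only the
# one with the highest score lands in the top decile, so capture is 0.5.
scores = [0.9, 0.1, 0.2, 0.5, 0.05, 0.3, 0.15, 0.4, 0.25, 0.35]
flags  = [1,   0,   0,   1,   0,    0,   0,    0,   0,    0]
print(top_decile_capture(scores, flags))  # → 0.5
```

On this toy data the metric is 0.5; the study's reported values (31% for the deep-learning model versus 18% for Tyrer-Cuzick) are the same quantity computed over nearly 89,000 screening mammograms.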
Allison Kurian, an associate professor of medicine and health research/policy at Stanford University School of Medicine, commented:
“It’s particularly striking that the model performs equally as well for white and black people, which has not been the case with prior tools…If validated and made available for widespread use, this could really improve on our current strategies to estimate risk.”
The team aims to make their model a part of the standard of care. By identifying which individuals are likely to develop cancer in the future, clinicians could tailor management strategies accordingly, helping to prevent breast cancer and save lives.
Source:
Yala A., et al. (2019). A Deep Learning Mammography-based Model for Improved Breast Cancer Risk Prediction. Radiology. doi.org/10.1148/radiol.2019182716.