A clinical research study published in the British Journal of Surgery shows that fluorescence guidance (in which an administered fluorophore re-emits absorbed light at a longer wavelength) can enable a colorectal surgeon to assess cancer tissue visually, in real time and with greater specificity during surgery, by combining the near-infrared (NIR) fluorescence signal with artificial intelligence (AI) methods.
In this study, supported by the Disruptive Technologies and Innovation Fund 2018, videos of surgery from 24 patients (11 with cancer) were studied. Multiple regions of interest (ROIs) covering each area of abnormality were selected from each video for analysis.
NIR intensities were extracted by tracking the ROIs within each video, focusing on the initial wash-in period. The data set used for analysis comprised 435 ROI profiles, each with 12 perfusion-characterizing features and balanced outcome classes. At the patient level, the system correctly diagnosed 19 of 20 cancers (95%).
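For readers curious how a wash-in intensity profile can be turned into features and a classifier, the following is a minimal, illustrative Python sketch. The synthetic curves, the handful of descriptors, and the random-forest classifier are assumptions chosen for demonstration; they are not the study's actual 12 features or its model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def wash_in_features(t, y):
    """Summarize one ROI's NIR intensity wash-in curve with a few simple
    perfusion-style descriptors (illustrative; not the study's 12 features)."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    baseline = y[:5].mean()                       # pre-contrast signal level
    peak_idx = int(np.argmax(y))
    peak = y[peak_idx]
    time_to_peak = t[peak_idx] - t[0]
    half_level = baseline + 0.5 * (peak - baseline)
    time_to_half = t[int(np.argmax(y >= half_level))] - t[0]
    max_slope = float(np.max(np.gradient(y, t)))  # steepest inflow rate
    auc = float(np.sum(y - baseline) * (t[1] - t[0]))  # approximate area under the curve
    return [baseline, peak, time_to_peak, time_to_half, max_slope, auc]

def synthetic_curve(t, fast):
    """Toy wash-in trace; faster, steeper uptake stands in for one tissue class."""
    k = 0.25 if fast else 0.08
    return 10 + 100 * (1 - np.exp(-k * t)) + rng.normal(0, 2, t.size)

t = np.linspace(0, 120, 240)                      # ~2 minutes of video frames
curves = [synthetic_curve(t, fast=(i % 2 == 0)) for i in range(60)]
labels = [i % 2 for i in range(60)]               # toy binary tissue labels

X = np.array([wash_in_features(t, c) for c in curves])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print("training accuracy on the toy data:", clf.score(X, labels))
```

The point of the sketch is simply that differences in how quickly and how strongly a tracked region takes up the dye can be captured by a few curve descriptors and separated by a standard classifier.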
Speaking about the study, Prof Ronan Cahill, Professor of Surgery at University College Dublin (UCD) and the Mater Misericordiae University Hospital (MMUH), said 'Surgery has a substantial role to play in the therapy of over two-thirds of all cancers, and key surgical decisions are traditionally made by human visual judgments, which assume a static biological FOV (Field of View) during the time frame of the observation (which in surgery is moments).'
'The processes for uptake and release of external substances, such as drugs and contrast agents, are unique in cancerous tissues.
As such, we envisaged that an approach combining biophysics-inspired modeling and AI could analyze intraoperative changes in NIR intensities over time in varied tissue, enabling clinically useful lesion classification with high specificity.
To translate this knowledge for the first time into an intraoperative surgical decision support tool, a computer vision-AI real-time tissue-tracking and categorizing prototype has been developed. As the prototype relies only on the NIR fluorescence data stream, it is usable with commercially available imaging systems.'
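As an aside on what biophysics-inspired modeling of a wash-in signal can look like in practice, here is a hedged sketch that fits a single-exponential uptake model to a simulated intensity trace. The model form, the parameter values, and the SciPy-based fitting are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import curve_fit

def uptake_model(t, a, k, b):
    """Single-exponential wash-in: baseline b rising towards plateau b + a at rate k.
    A deliberately simple stand-in for a biophysics-inspired perfusion model."""
    return b + a * (1 - np.exp(-k * t))

# Simulate a noisy intensity trace for one tracked region
t = np.linspace(0, 120, 240)
observed = uptake_model(t, a=100, k=0.12, b=10) + np.random.default_rng(1).normal(0, 2, t.size)

# Fit the model; the recovered rate constant k describes how fast the dye washes in
params, _ = curve_fit(uptake_model, t, observed, p0=[80.0, 0.1, 5.0])
a_hat, k_hat, b_hat = params
print(f"fitted plateau={a_hat:.1f}, rate={k_hat:.3f}/s, baseline={b_hat:.1f}")
```

In a real pipeline, fitted parameters such as the uptake rate could feed the kind of classifier sketched earlier, since cancerous and healthy tissues are expected to differ in how quickly they take up and release the agent.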
Also speaking about the publication of this study in the British Journal of Surgery,
"Targeted agents for cancer imaging currently under trial adhere rigidly to conventional paradigms of fluorescence-guided surgery, and in the main are administered systemically before surgery, with the operation scheduled for when maximum stable contrast between the tumour and other tissues exists. However, this timing is often unpredictable, it can take some days, and false positives can occur. Clinical usefulness is further limited by dosing practicalities, scheduling challenges, and patient-to-patient and cancer-to-cancer differences. This work instead indicates a novel pathway and process for the immediate, perfect realisation of agent information during surgery, which would greatly improve the efficiency and effectiveness of cancer care."
Donal O'Shea, Professor, Department of Chemistry, RCSI University of Medicine and Health Sciences
This report of early experience describes, for the first time, the achievement of such a real-time decision-support tool. Furthermore:
- the findings are relevant to other cancers and metastases and indeed to other dyes used in cancer surgery
- next stages of work include expanding the tissue classification from operator-selected ROIs to the entire FOV
- additional patient-specific surgical guidance is envisaged, including an AI heat-map display of the classification results for the surgical team, along with further fluorescence data-mining via AI.
Journal reference:
Cahill, R. A., et al. (2020) Artificial intelligence indocyanine green (ICG) perfusion for colorectal cancer intra-operative tissue classification. British Journal of Surgery. https://doi.org/10.1093/bjs/znaa004