In brief

The AI-driven diagnostic tool created by researchers could reduce the need for specialised endoscopic equipment in cancer diagnosis.


Cancer diagnosis gets a helping hand

10 Mar 2023

A new diagnostic innovation powered by artificial intelligence could empower surgeons to make more accurate and effective clinical decisions for colon cancer patients.

Oncologists may soon add artificial intelligence (AI) and computational platforms to their toolkits, good news for the fight against some of the most difficult-to-treat tumour types, such as colorectal tumours. For these tumours, the choice between the two resection options, endoscopy-based resection (endoscopic mucosal resection, or EMR, and endoscopic submucosal dissection, or ESD) and conventional surgical resection, hinges on the depth of tumour invasion.

“Early colorectal cancers with an invasion depth of more than one millimetre into the submucosal layer have a higher risk of lymph node metastasis, which makes them contraindicated for ESD and EMR; these cases require the more intensive surgery as a safety precaution,” explained Jiahao Wang from A*STAR’s Institute of Bioengineering and Bioimaging (IBB).
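The depth-based decision rule Wang describes can be sketched as a simple illustrative function; the one-millimetre submucosal-invasion threshold comes from the quote above, while the function name and return labels are hypothetical:

```python
def recommend_resection(submucosal_invasion_mm: float) -> str:
    """Illustrative triage rule: submucosal invasion deeper than 1 mm
    carries a higher risk of lymph node metastasis, so endoscopic
    resection (EMR/ESD) is contraindicated and conventional surgical
    resection is recommended instead."""
    if submucosal_invasion_mm > 1.0:
        return "surgical resection"  # deep invasion: EMR/ESD contraindicated
    return "endoscopic resection (EMR/ESD)"  # superficial invasion

print(recommend_resection(0.4))
print(recommend_resection(2.3))
```

In practice, of course, the hard part is not applying the threshold but estimating the invasion depth in the first place, which is exactly where the team's AI platform comes in.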

However, gauging tumour invasion depth in real time, and thus making the call on which surgery to choose, can be a challenge for oncologists, who don’t always have access to specialised diagnostic endoscopic equipment. Moreover, while early iterations of computer-aided diagnostics (CAD) have been developed, their accuracy remains too low for reliable clinical use.

Wang and IBB Senior Principal Investigator Hanry Yu set out to develop an effective alternative to help surgeons in the operating theatre. The team built an AI-driven CAD platform that analyses images captured using white light colonoscopy (WLC), a powerful technique for classifying and visualising colon disease.

The team assembled a training dataset of over 7,000 WLC images drawn from 657 cancerous lesions annotated by trained cancer histologists. They then fed this data into a deep convolutional neural network built on the GoogLeNet architecture, incorporating contextual information inspired by the diagnostic processes of expert endoscopists. This approach was chosen so that the platform can run efficiently in hospital settings with limited computational resources.
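The study's title mentions "attention guidance", a family of techniques for steering a network towards diagnostically relevant image regions. The sketch below is not the paper's architecture; it illustrates one generic form, spatial attention pooling, in which per-location scores are softmaxed into weights that emphasise some regions of a feature map over others:

```python
import numpy as np

def attention_pool(features: np.ndarray, scores: np.ndarray) -> np.ndarray:
    """Spatial attention pooling: softmax the per-location scores into
    weights, then take the weighted average of the feature vectors.

    features: (H*W, C) array of per-location feature vectors
    scores:   (H*W,)  array of attention logits (higher = more relevant)
    """
    w = np.exp(scores - scores.max())  # numerically stable softmax
    w /= w.sum()
    return w @ features  # (C,) attended image descriptor

rng = np.random.default_rng(0)
feats = rng.normal(size=(49, 8))   # e.g. a 7x7 grid of 8-dim features
logits = rng.normal(size=49)
print(attention_pool(feats, logits).shape)
```

With uniform (all-zero) logits this reduces to ordinary average pooling; non-uniform logits let the classifier weight suspicious regions more heavily, mimicking how an expert focuses on the lesion rather than the whole frame.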

Next, they validated their model on a testing dataset of over 1,600 WLC images, about 10 percent of which depicted early-stage, deeply invasive colorectal cancer. The researchers reported that the new model was more accurate, sensitive and specific than existing technologies. Besides boasting an accuracy of over 91 percent, the model could differentiate between superficial and deeply invasive colorectal tumours as well as experienced endoscopists could.
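The three metrics reported above are standard diagnostic measures computed from a confusion matrix. A minimal sketch in plain Python; the counts below are invented for illustration (they only echo the ~1,600-image, ~10-percent-deep-lesion test-set shape) and are not the study's results:

```python
def binary_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute standard diagnostic metrics from confusion-matrix counts.
    Here 'positive' would mean a deeply invasive lesion."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),  # all correct calls
        "sensitivity": tp / (tp + fn),  # deep lesions correctly flagged
        "specificity": tn / (tn + fp),  # superficial lesions correctly cleared
    }

# Hypothetical counts for a 1,600-image test set with ~10% deep lesions
m = binary_metrics(tp=140, fp=110, tn=1330, fn=20)
print({k: round(v, 3) for k, v in m.items()})
```

Sensitivity matters most clinically here: a missed deep lesion (false negative) risks sending a patient with likely lymph node metastasis into a less intensive endoscopic procedure.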

Wang is hopeful that their innovation could one day help more cancer patients. “Even in situations where experienced doctors are unavailable, the algorithm can flag cancerous regions,” he said.

Nevertheless, some barriers still stand in the way of widespread clinical adoption. The current iteration was trained using still images, but it is uncertain how the algorithm will perform when presented with video feeds. Future studies in collaboration with other industry experts will aim to unlock the full potential of the platform, say the researchers.

The A*STAR-affiliated researchers contributing to this research are from the Institute of Bioengineering and Bioimaging (IBB).



Luo, X., Wang, J., Han, Z., Yu, Y., Chen, Z., et al. Artificial intelligence-enhanced white-light colonoscopy with attention guidance predicts colorectal cancer invasion depth. Gastrointestinal Endoscopy 94(3) (2021).

About the Researchers

Hanry Yu is a Professor of Physiology & Mechanobiology and NUS College at the National University of Singapore; Senior PI at the Institute of Bioengineering and Bioimaging, A*STAR; and Co-lead PI for an MIT research entity in Singapore. He was trained in cell biology but integrates imaging, biomaterials, tissue engineering, drug testing, contextual AI, and mechanobiology to solve problems in GI tract diseases and, recently, in growing cultivated meat as functional ingredients to improve plant-based meat analogues. He is an award-winning serial technopreneur who strives to build integrated teams to equip future graduates with skills relevant in both industrial and future academic settings. He has taught students in leading universities in the US and Asia.
Jiahao Wang works as a research intern under Farah Tasnim at the Institute of Bioengineering and Bioimaging, A*STAR. He obtained his bachelor's degree with honours at the School of Medicine and Chu Kochen Honors College of Zhejiang University. He is currently a PhD candidate at the Mechanobiology Institute Singapore, National University of Singapore. Wang's research interests span bioimaging, image processing and computer-aided diagnosis in medical imaging through machine learning and deep learning.

This article was made for A*STAR Research by Wildtype Media Group