Two years into the COVID-19 pandemic, most of us have settled into a new normal of working from home, attending virtual events and eating delivered meals. But the global vaccine rollout appears to be the light at the end of a long, socially distanced tunnel, teasing the possibility of reviving large, in-person events like concerts, festivals and parades.
To stay safe during mass gatherings in the post-pandemic era, attendees would have to be screened for COVID-19 using rapid tests. One promising candidate is based on loop-mediated isothermal amplification (LAMP), which provides results within half an hour: the reaction turns yellow in the presence of the coronavirus and pink in its absence. Sometimes, however, the tests yield an inconclusive orange color, which can be misclassified by untrained users.
To tackle these ambiguous results, researchers led by Samuel Gan, Senior Principal Investigator at the Antibody and Product Development Lab at A*STAR’s Experimental Drug Development Centre (EDDC) and Bioinformatics Institute (BII), sought to automate and quantify the color changes using smartphones. By combining advanced color detection algorithms with high-definition cameras, Gan and colleagues turned the average smartphone into a more objective way to analyze LAMP results than the eye alone.
With the smartphone app developed under the A*STAR Coronavirus (A*CRUSE) Taskforce Project ARCHER, the team trained the platform to recognize positive and negative cases from images of LAMP results. The app, called the APD LAMP Diagnostic App, was then configured to interpret LAMP result tubes by measuring how far each tube's hue deviates from the yellow positive and pink negative reference colors.
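As a rough illustration of this hue-comparison step, the sketch below classifies a tube color by whichever reference hue it lies closer to, flagging near-midway shades (such as the ambiguous orange) as inconclusive. The reference colors, the decision margin and the function names are illustrative assumptions, not the app's actual implementation.

```python
# A minimal sketch of hue-distance classification; reference colors and the
# margin are assumed values for illustration, not the published method.
import colorsys

POSITIVE_REF = (230, 200, 40)   # assumed yellow "positive" reference (RGB)
NEGATIVE_REF = (235, 120, 160)  # assumed pink "negative" reference (RGB)

def hue_of(rgb):
    """Return the hue (0-1) of an RGB color given as 0-255 values."""
    r, g, b = (c / 255.0 for c in rgb)
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    return h

def hue_distance(h1, h2):
    """Circular distance between two hues on the 0-1 hue wheel."""
    d = abs(h1 - h2)
    return min(d, 1.0 - d)

def classify_tube(sample_rgb, margin=0.02):
    """Label a tube by the nearer reference hue; flag near-ties as inconclusive."""
    h = hue_of(sample_rgb)
    d_pos = hue_distance(h, hue_of(POSITIVE_REF))
    d_neg = hue_distance(h, hue_of(NEGATIVE_REF))
    if abs(d_pos - d_neg) < margin:
        return "inconclusive"
    return "positive" if d_pos < d_neg else "negative"

# Example usage with an orange-leaning sample color.
print(classify_tube((233, 150, 90)))
```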
Images captured on smartphones are typically affected by the amount of ambient lighting and the quality of smartphone cameras. To account for these factors, the user must also capture a photo of the positive and negative control LAMP result tubes under identical conditions. The control image is then used for within-image calibration, boosting the app’s accuracy and reliability and mitigating environmental and lighting effects.
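One simple way such within-image calibration could work, shown in the sketch below, is to fit a per-channel correction from the observed control colors to their expected values and apply it to the sample tube before classification. The expected control colors and the linear correction are assumptions made for illustration; the app's actual calibration procedure may differ.

```python
# A rough sketch of within-image calibration using control tubes photographed
# in the same image; the expected colors and per-channel linear fit are
# illustrative assumptions, not the app's documented algorithm.
import numpy as np

# Assumed reference colors of the positive (yellow) and negative (pink) controls.
EXPECTED = np.array([[230.0, 200.0, 40.0],
                     [235.0, 120.0, 160.0]])

def fit_channel_correction(observed_controls):
    """Fit expected = gain * observed + offset independently for each RGB channel."""
    observed = np.asarray(observed_controls, dtype=float)
    gains, offsets = [], []
    for ch in range(3):
        gain, offset = np.polyfit(observed[:, ch], EXPECTED[:, ch], 1)
        gains.append(gain)
        offsets.append(offset)
    return np.array(gains), np.array(offsets)

def calibrate(sample_rgb, gains, offsets):
    """Apply the fitted correction to a sample tube's measured color."""
    corrected = gains * np.asarray(sample_rgb, dtype=float) + offsets
    return np.clip(corrected, 0, 255)

# Example: controls photographed under dim, warm lighting.
gains, offsets = fit_channel_correction([(200, 170, 60), (205, 105, 150)])
print(calibrate((210, 140, 110), gains, offsets))
```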
“The app serves as a good form of documentation and potentially alleviates validation burden in mass screening,” Gan said. “We also incorporated a barcoding feature for easy tracing, allowing the tubes to be easily traced to individuals.”
While the app has yet to be commercially released, the team hopes that it could help decentralize diagnostics and usher in a future where an accurate diagnosis can be achieved on the go with smartphones. “In countries that adopt LAMP, this app can eventually be used by non-technically trained people for events and public space entrances to help in screening,” concluded Gan.
The A*STAR-affiliated researchers contributing to this research are from the Experimental Drug Development Centre (EDDC) and the Bioinformatics Institute (BII).