In brief

The technology combines 3D computer vision and machine learning to diagnose malnutrition and prevent long-term health complications using facial morphometrics.


Smartphones unmask hidden malnutrition

13 Oct 2023

A new digital healthcare platform allows physicians to remotely monitor elderly patients at risk of malnutrition with a smartphone.

With the abundance of food options around every corner, it is difficult to imagine that there are those among us who are not eating enough. Elderly people in high-income countries like Singapore can, surprisingly, become victims of undiagnosed malnutrition.

Not only do our metabolisms tend to slow down and reduce our appetites as we get older, but many older adults also battle chronic health conditions that reduce their ability to absorb nutrients.

“Malnutrition can remain unaddressed for long periods of time, exacerbating the malnutrition and potentially giving rise to a host of other complications,” explained Christiani Jeyakumar Henry, Deputy Executive Director at A*STAR’s Singapore Institute of Food and Biotechnology Innovation (SIFBI).

Picking up the subtle signs of malnutrition’s tightening grip has proven difficult using conventional monitoring protocols. Patients often need to travel to facilities where trained professionals assess their nutritional status using specialised equipment. Besides being costly and inconvenient, this approach misses fluctuations or gradual changes that occur between sessions.

Henry and colleagues proposed a digital healthcare solution that puts the power of conventional malnutrition diagnostics in patients’ palms. “Patients' smartphones can capture facial features using built-in cameras and sensors, making the process automated and convenient,” said Henry.

Wesley Tay, first author of the study, worked with the team to develop a platform that uses facial morphometrics (technologies that measure and analyse facial attributes) for real-time malnutrition monitoring in patients.

Powered by smartphone imaging, 3D computer vision and machine learning, the platform scans the patient’s face and detects the tell-tale signs of malnutrition, such as drooping skin or protruding bones.
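The morphometric step can be sketched loosely. The following is a minimal Python illustration, not the team’s published pipeline: the landmark names, the two width ratios, and the screening cutoff are all hypothetical assumptions, standing in for whatever features and model the platform actually uses once a face-landmark detector has located points on the patient’s face.

```python
# Hypothetical sketch: landmark names, feature ratios and the cutoff
# below are illustrative assumptions, not the published A*STAR method.
import math


def distance(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])


def morphometric_features(landmarks):
    """Derive simple width ratios that could proxy for soft-tissue loss.

    `landmarks` maps hypothetical landmark names to (x, y) pixel
    coordinates, e.g. as produced by any face-landmark detector.
    """
    cheek_width = distance(landmarks["left_cheek"], landmarks["right_cheek"])
    jaw_width = distance(landmarks["left_jaw"], landmarks["right_jaw"])
    face_height = distance(landmarks["forehead"], landmarks["chin"])
    return {
        # A lower cheek-to-jaw ratio could indicate cheek hollowing.
        "cheek_to_jaw": cheek_width / jaw_width,
        # Width-to-height tends to fall as facial soft tissue is lost.
        "width_to_height": cheek_width / face_height,
    }


def flag_for_review(features, cheek_to_jaw_cutoff=1.05):
    """Illustrative screening rule: flag faces whose cheek width has
    dropped close to (or below) jaw width for clinician follow-up."""
    return features["cheek_to_jaw"] < cheek_to_jaw_cutoff


# Toy landmark coordinates for a single face image.
landmarks = {
    "left_cheek": (40, 120), "right_cheek": (200, 120),
    "left_jaw": (60, 180), "right_jaw": (180, 180),
    "forehead": (120, 20), "chin": (120, 220),
}
feats = morphometric_features(landmarks)
print(feats, flag_for_review(feats))
```

In a real system, ratios like these would be computed from 3D landmarks rather than a 2D image, and a trained machine-learning model, rather than a fixed cutoff, would score them against the patient’s history.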

The system can be used in the comfort of the patient’s home and delivers the results straight to their healthcare teams. This data can not only support the health and wellbeing of individual patients, but also help healthcare institutions and policymakers make better decisions about public health, said the team.

Henry and the team are in the process of commercialising and patenting their technology and have partnered with groups in the healthcare industry to validate their platform in clinical settings.

Henry added that the novel malnutrition diagnostic can also be used for applications in the consumer health and wellness industry. “Fitness applications can incorporate this technology to monitor and track clients' progress towards their goals, recommending interventions or behaviours that best align with their needs,” concluded Henry.

The A*STAR-affiliated researchers contributing to this research are from the Singapore Institute of Food and Biotechnology Innovation (SIFBI).



Tay, W., Quek, R., Kaur, B., Lim, J. and Henry, C.J. Use of facial morphology to determine nutritional status in older adults: Opportunities and challenges. JMIR Public Health and Surveillance 8 (7), 1-16 (2022).

About the Researcher

Christiani Jeyakumar Henry is a Senior Advisor at A*STAR. He obtained a PhD degree in nutrition from the London School of Hygiene and Tropical Medicine. Henry’s research focuses on translating nutrition research into food applications. In 2010, he was awarded the British Nutrition Foundation Prize for his outstanding contributions to nutrition, and in 2019, he was awarded the Kellogg’s International award for food research that led to a global impact.

This article was made for A*STAR Research by Wildtype Media Group