25 August 2021

A baby in Flinders Medical Centre's intensive care neonatal unit, one of seven infants whose vital signs were remotely monitored in the study.

University of South Australia researchers have designed a computer vision system that can automatically detect a tiny baby’s face in a hospital bed and remotely monitor its vital signs from a digital camera with the same accuracy as an electrocardiogram machine.

Using artificial intelligence-based software to detect human faces is now common with adults, but this is the first time researchers have developed software that reliably detects a premature baby’s face and skin while the infant is covered by tubes and clothing and undergoing phototherapy.

Engineering researchers and a neonatal critical care specialist from UniSA remotely monitored heart and respiratory rates of seven infants in the Neonatal Intensive Care Unit (NICU) at Flinders Medical Centre in Adelaide, using a digital camera.

“Babies in neonatal intensive care can be extra difficult for computers to recognise because their faces and bodies are obscured by tubes and other medical equipment,” says UniSA Professor Javaan Chahl, one of the lead researchers.

“Many premature babies are being treated with phototherapy for jaundice, so they are under bright blue lights, which also makes it challenging for computer vision systems.”

The ‘baby detector’ was developed using a dataset of videos of babies in the NICU to reliably detect their skin tone and faces.
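As a rough illustration of how such a detector can be assembled, the sketch below fine-tunes a generic pre-trained object detector on annotated NICU video frames. It is a minimal sketch under assumed choices (a torchvision Faster R-CNN backbone and illustrative ‘infant face’ and ‘exposed skin’ labels), not the authors’ actual model or training data.

```python
# Minimal sketch (assumption): fine-tuning a generic pre-trained detector
# so it learns "infant face" and "exposed skin" classes from NICU frames.
# The study's actual network architecture and dataset are not reproduced here.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background + infant_face + exposed_skin (assumed labels)

# Start from a detector pre-trained on everyday images.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Swap the classification head so it predicts the NICU-specific classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# One illustrative training step on a single annotated video frame.
model.train()
frame = torch.rand(3, 480, 640)  # stand-in for one NICU video frame
target = {
    "boxes": torch.tensor([[120.0, 80.0, 260.0, 220.0]]),  # face bounding box
    "labels": torch.tensor([1]),                            # 1 = infant_face
}
losses = model([frame], [target])     # training mode returns a dict of losses
total_loss = sum(losses.values())
total_loss.backward()                 # an optimiser step would follow in real training
```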

Vital sign readings matched those of an electrocardiogram (ECG) and in some cases appeared to outperform the conventional electrodes, endorsing the value of non-contact monitoring of pre-term babies in intensive care.

The study is part of an ongoing UniSA project to replace contact-based electrical sensors with non-contact video cameras, avoiding skin tearing and potential infections that adhesive pads can cause to babies’ fragile skin.

Infants were filmed with high-resolution cameras at close range, and vital physiological data were extracted using advanced signal-processing techniques that can detect the subtle colour changes from heartbeats, and the body movements, that are invisible to the human eye.
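This extraction step is in the same family as remote photoplethysmography, in which tiny periodic colour fluctuations in the skin are isolated and their dominant frequency is read off as a pulse rate. The snippet below illustrates that general idea only; it is not the study’s pipeline, and the green-channel trace, 30 fps frame rate and infant heart-rate band are assumed values.

```python
# Simplified sketch (assumption): estimating a pulse rate from the average
# green-channel brightness of a skin region across successive video frames.
import numpy as np
from scipy.signal import butter, filtfilt

FPS = 30.0  # assumed camera frame rate

def heart_rate_bpm(green_trace, fps=FPS):
    """green_trace: 1-D array of mean green values of the skin ROI, one per frame."""
    # Remove slow lighting drift and keep frequencies plausible for an infant's
    # heartbeat (roughly 90-210 beats per minute, i.e. about 1.5-3.5 Hz).
    b, a = butter(3, [1.5, 3.5], btype="bandpass", fs=fps)
    pulse = filtfilt(b, a, green_trace - green_trace.mean())

    # Take the strongest frequency component within that band as the pulse.
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    band = (freqs >= 1.5) & (freqs <= 3.5)
    return freqs[band][np.argmax(spectrum[band])] * 60.0  # Hz -> beats per minute

# Example with a synthetic 2.5 Hz (150 bpm) signal buried in noise:
t = np.arange(0, 20, 1.0 / FPS)
trace = 0.02 * np.sin(2 * np.pi * 2.5 * t) + np.random.normal(0, 0.01, t.size)
print(f"{heart_rate_bpm(trace):.0f} bpm")  # ~150
```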

UniSA neonatal critical care specialist Kim Gibson says using neural networks to detect the faces of babies is a significant breakthrough for non-contact monitoring.

“In the NICU setting it is very challenging to record clear videos of premature babies. There are many obstructions, and the lighting can also vary, so getting accurate results can be difficult. However, the detection model has performed beyond our expectations.

“Worldwide, more than 10 per cent of babies are born prematurely and due to their vulnerability, their vital signs need to be monitored continuously. Traditionally, this has been done with adhesive electrodes placed on the skin that can be problematic, and we believe non-contact monitoring is the way forward,” Gibson says.

Professor Chahl says the results are particularly relevant given the COVID-19 pandemic and need for physical distancing.

In 2020, the UniSA team developed world-first technology, now used in commercial products sold by North American company Draganfly, that measures adults’ vital signs to screen for symptoms of COVID-19.

The results have been published in the Journal of Imaging.

Notes for editors

“Non-Contact Automatic Vital Signs Monitoring of Infants in a Neonatal Intensive Care based on Neural Networks” is published in the Journal of Imaging. For a copy of the paper, please email UniSA media officer Candy Gibson at candy.gibson@unisa.edu.au.


Contact for interview:  Professor Javaan Chahl M: +61 429 459 394 E: javaan.chahl@unisa.edu.au
Kim Gibson T: +61 8 8302 2706 E: kim.gibson@unisa.edu.au

Media contact: Candy Gibson M: +61 434 605 142 E: candy.gibson@unisa.edu.au
