Using machine vision and AI for COVID monitoring

Posted 22 Apr

Lisa Bailey

When everything changed in 2020, the research conducted by the Sensor Systems group at UniSA STEM changed too.

Adaptability is really just the ability of a system to change under evolving conditions. When the pandemic struck, Javaan Chahl and his group were able to adapt a machine vision and AI process, initially designed for search and rescue, to provide health monitoring and feedback on COVID-19 symptoms.

The team uses 4K cameras to monitor vital signs like breathing rate and heart rate, and a thermal camera to measure body temperature.
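The article doesn't detail the team's algorithm, but the general idea behind camera-based vital sign monitoring is that breathing and heartbeats produce small periodic changes in a video (chest motion, subtle skin colour shifts). A minimal, illustrative sketch of that principle: average the pixel intensity of a region of interest in each frame, then find the dominant frequency of that trace with an FFT. Everything here (function names, the frequency band, the synthetic signal) is a hypothetical example, not the team's actual pipeline.

```python
import numpy as np

def estimate_rate_bpm(signal, fps, low_hz=0.1, high_hz=0.5):
    """Estimate a periodic rate (per minute) from a 1-D intensity trace.

    signal: mean pixel intensity of a region of interest, one value per frame
    fps: video frame rate
    low_hz/high_hz: plausible band (0.1-0.5 Hz covers 6-30 breaths/min)
    """
    signal = np.asarray(signal, dtype=float)
    signal -= signal.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                      # Hz -> events per minute

# Synthetic example: 0.25 Hz chest motion (15 breaths/min) filmed at 30 fps,
# over a 32 s window, with a little sensor noise added.
fps = 30
t = np.arange(0, 32, 1.0 / fps)
rng = np.random.default_rng(0)
trace = 128 + 5 * np.sin(2 * np.pi * 0.25 * t) + rng.normal(0, 0.5, t.size)
print(estimate_rate_bpm(trace, fps))  # 15.0
```

A real system would first need to detect and track the person so the region of interest stays on the chest or face; heart rate uses the same spectral idea but in a higher band (roughly 0.7 to 3 Hz).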

The idea – COVID monitoring from the sky

This software was licensed to Canadian drone manufacturer Draganfly in March 2020, just as the surge of coronavirus cases worldwide took off. The intention was to pair machine vision health monitoring with drone surveillance, creating a rapid, large-scale population screening tool. Yep, they wanted to fly drones around cities to monitor people for COVID symptoms.

If being scanned from the sky, potentially without you even realising it, makes you uncomfortable, then you’re not alone. Remote surveillance with drones raises many ethical concerns about privacy, including whether individuals are identifiable, how that data is collected and stored, and who it is shared with.

The adaptation – COVID kiosks for rapid screening

These privacy concerns eventually led to an early failed pilot of the drone monitoring program in April. Although the AI was programmed to blur the faces of people captured in drone footage to protect privacy, concerns about the ethics and privacy of this type of large-scale surveillance remained. The project pivoted to building the health monitoring system into kiosks that could be installed in high-traffic areas, scanning vital signs in 15 seconds.

Providing this feedback allows individuals to make choices (should I go to the health care centre to get this checked out?) and, at a population level, allows organisations to make decisions on how to better manage the flow of people through areas that attract a lot of visitors, like airports, schools and universities.

The system had its first real world test at Alabama State University in August 2020 and is now being installed in other locations around the US.

Building a resilient algorithm

Software that needs to recognise these health indicators in all sorts of people needs to learn from a data set that includes a wide variety of people. Facial recognition algorithms have repeatedly been shown to give poorer results for women and people of colour. Titi Ogunwa, a PhD student working on the project, was conscious of this risk and worked with the team to ensure the system was trained on a variety of body shapes, sizes and skin tones.
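One way teams check for this kind of skew is to break evaluation results down by demographic group rather than reporting a single overall score. The sketch below is a hypothetical illustration (the group labels, data, and threshold are invented, not from the UniSA project): compute the mean absolute error of, say, heart-rate estimates separately per group, so a large gap between groups stands out.

```python
from collections import defaultdict

def per_group_mae(records):
    """records: iterable of (group, estimated_bpm, true_bpm) tuples.
    Returns the mean absolute error per group, so accuracy gaps
    between demographic groups are easy to spot."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for group, est, true in records:
        sums[group] += abs(est - true)
        counts[group] += 1
    return {g: sums[g] / counts[g] for g in counts}

# Invented evaluation data for illustration only
records = [
    ("group_a", 72, 70), ("group_a", 68, 68),
    ("group_b", 80, 70), ("group_b", 90, 72),
]
print(per_group_mae(records))
# {'group_a': 1.0, 'group_b': 14.0}  -- a gap this large flags a biased model
```

If one group's error is much higher, the usual remedy is exactly what the team did: add more training examples covering that group.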

The team is now looking at adapting the system further for new applications, like detecting and monitoring exotic animals, or uses in agriculture.
