The thought of robots taking over our jobs—and doing them better than we ever could—isn’t always a welcome one. One major exception is when artificial intelligence software can analyze rapid diagnostic tests with near-perfect accuracy in under-resourced areas, using only a standard smartphone.
Researchers from University College London and the Africa Health Research Institute have designed an app to do just that, using deep learning technology to double-check the results of rapid HIV tests administered outside of clinical settings, based only on photos of the tests.
The app was trained on a library of more than 11,000 images of HIV tests captured throughout the KwaZulu-Natal province of South Africa, by dozens of the research institute’s trained field workers.
Researchers then launched a study comparing the app’s ability to read HIV tests with that of staff members with a range of training and experience.
In the newly published study, five workers—ranging from newly trained community health workers to professional nurses—each used the app to record their own interpretations of 40 HIV tests and to submit images of the results for the algorithm to read.
On average, the workers read the test results with just over 92% accuracy, while the algorithm did so with 98.9%.
Past estimates of field workers’ accuracy have ranged from 80% to 97%. The lateral-flow tests, which produce thin, colored lines, can be especially difficult to interpret by eye, particularly for people who are colorblind or nearsighted.
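The comparison described above amounts to scoring each reader’s calls against the confirmed test results. A minimal sketch of that calculation in Python, using entirely hypothetical readings (the study’s actual data and error patterns are not public in this article):

```python
# Minimal sketch of an accuracy comparison between human readers and a model.
# All readings below are hypothetical stand-ins, not the study's data.

def accuracy(readings):
    """Fraction of interpretations that match the confirmed result."""
    correct = sum(1 for call, truth in readings if call == truth)
    return correct / len(readings)

# Hypothetical scenario: 40 tests, 20 truly positive and 20 truly negative.
# A worker misreads 3 faint-line positives; the model misreads 1.
truth        = ["pos"] * 20 + ["neg"] * 20
worker_calls = ["neg"] * 3 + ["pos"] * 17 + ["neg"] * 20  # 3 false negatives
model_calls  = ["neg"] * 1 + ["pos"] * 19 + ["neg"] * 20  # 1 false negative

print(accuracy(list(zip(worker_calls, truth))))  # 0.925
print(accuracy(list(zip(model_calls, truth))))   # 0.975
```

With only 40 tests per reader, a single misread shifts accuracy by 2.5 percentage points, which is why the study’s 92% vs. 98.9% gap is meaningful but would warrant the larger follow-up study the researchers plan.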
According to the researchers, the app could be used to significantly reduce testing errors and improve treatment for the estimated 100 million HIV tests administered around the world each year, especially in low- and middle-income countries where reliable clinical resources and trained readers are scarce.
The software could also be adapted to read rapid diagnostic tests for many other diseases, including malaria, syphilis, tuberculosis and influenza, they said.
“Having spent some time in KwaZulu-Natal with fieldworkers organizing the collection of data, I’ve seen how difficult it is for people to access basic healthcare services. If these tools can help train people to interpret the images, you can make a big difference in detecting very early-stage HIV, meaning better access to healthcare or avoiding an incorrect diagnosis,” said the study’s first author, Valérian Turbé, of the UCL London Centre for Nanotechnology.
Next up, the researchers will launch an even larger study of the app to ensure it can be effectively used by people of all ages, genders and levels of digital know-how.
They’re also developing a new feature for the app that would automatically transmit test results to nearby healthcare providers and testing labs.
Not only would that feature ensure that people who test positive are immediately linked to proper care resources, but it could also help map outbreak hotspots, showing care providers and labs where to direct their fieldworkers and testing resources.