AI tongue scanner can diagnose diseases with 96 percent accuracy

A new machine learning model can accurately diagnose certain diseases almost every time simply by looking at a patient’s tongue. Although the technology is state-of-the-art, it is inspired by medical approaches that people have used for more than 2,000 years.

When it comes to diagnosing conditions, traditional Chinese medicine and other practices often turn to the tongue for clues. Based on its color, shape and thickness, the muscle can reveal a number of potential health problems – from cancer to diabetes, to even asthma and gastrointestinal problems. Now, after more than two millennia of looking into patients’ mouths for answers, doctors will soon be able to get a second opinion from artificial eyes powered by machine learning.

“Human tongues possess unique features and characteristics associated with the body’s internal organs, which effectively detect diseases and monitor their progress… Of these, tongue color is of utmost importance,” writes a team of engineering researchers working with the University of South Australia (UniSA) and Iraq’s Middle Technical University (MTU) in a recent study published in the journal Technologies.

Ali Al-Naji, senior author of the paper and adjunct associate professor in UniSA’s Department of Medical Instrumentation Techniques Engineering, described a number of these scenarios in an August 13 announcement.

“Normally, people with diabetes have a yellow tongue; cancer patients have a purple tongue with a thick fatty layer; and acute stroke patients have an unusually shaped red tongue,” he explained. A white tongue, meanwhile, can indicate anemia, while an indigo or violet color points to vascular and gastrointestinal problems or asthma. More recently, deep red tongues have been linked to severe cases of COVID-19.
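
Taken together, these associations amount to a simple color-to-condition lookup. As a rough illustration only (paraphrasing the article’s descriptions, not reproducing the team’s system or offering a diagnostic tool), the mapping might be written like this in Python:

```python
# Illustrative lookup only: tongue-color associations as described in the
# article, paraphrased into a dictionary. Not a diagnostic tool.
TONGUE_COLOR_SIGNALS = {
    "yellow": "possible diabetes",
    "purple, with a thick fatty layer": "possible cancer",
    "red, unusually shaped": "possible acute stroke",
    "white": "possible anemia",
    "indigo or violet": "possible vascular or gastrointestinal problems, or asthma",
    "deep red": "possible severe COVID-19",
}


def describe_tongue_color(color: str) -> str:
    """Return the condition associated with a tongue color, if the article lists one."""
    return TONGUE_COLOR_SIGNALS.get(color, "no association listed")


print(describe_tongue_color("yellow"))  # -> possible diabetes
```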

A researcher demonstrates how a camera captures images of the tongue and analyzes them for disease. Credit: Middle Technical University

As with similar machine learning programs, Al-Naji’s team built their system by training it visually on two data sets. First, they fed it 5,260 images spanning seven tongue colors in different saturations and lighting conditions. Of those, 300 “gray” entries represented various unhealthy tongues, alongside 310 “red” selections representing healthy examples. The system was then trained in real time at two Iraqi teaching hospitals, in Dhi Qar and Mosul, using 60 photos of tongues from both healthy people and patients with various conditions, including mycotic infections, asthma, COVID-19, fungiform papillae and anemia.
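
The article doesn’t spell out the team’s exact pipeline, but a color-feature classifier of this general kind can be sketched briefly. In the sketch below, the folder layout, the mean-RGB feature, and the scikit-learn SVM are all illustrative assumptions rather than the study’s actual method:

```python
# Minimal sketch of a color-based image classifier (assumed design, not the
# published system): average RGB color as the feature, an SVM as the model.
from pathlib import Path

import numpy as np
from PIL import Image
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def mean_color_feature(image_path: Path) -> np.ndarray:
    """Return the average RGB value of an image as a 3-element feature vector."""
    with Image.open(image_path) as img:
        rgb = np.asarray(img.convert("RGB"), dtype=np.float32)
    return rgb.reshape(-1, 3).mean(axis=0)


def load_dataset(root: Path) -> tuple[np.ndarray, np.ndarray]:
    """Load images arranged as root/<label>/<image>.jpg into features and labels."""
    features, labels = [], []
    for label_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        for image_path in label_dir.glob("*.jpg"):
            features.append(mean_color_feature(image_path))
            labels.append(label_dir.name)
    return np.array(features), np.array(labels)


if __name__ == "__main__":
    # "tongue_images" is a hypothetical folder with one subfolder per condition.
    X, y = load_dataset(Path("tongue_images"))
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y
    )
    model = SVC(kernel="rbf").fit(X_train, y_train)
    print(f"Held-out accuracy: {model.score(X_test, y_test):.1%}")
```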

Finally, it was time to test the algorithm in person. After connecting the program to a USB webcam, the researchers asked both healthy and sick volunteers to position their tongues 20 cm from the camera for scanning. According to Al-Naji’s team, the results showed “remarkable precision.”
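
What “connecting the program to a USB webcam” might look like in code is sketched below with OpenCV; the camera index, the single-frame capture, and the reuse of a mean-color feature are assumptions for illustration, not the researchers’ software:

```python
# Illustrative webcam capture with OpenCV (assumed workflow, not the study's
# software): grab one frame and reduce it to a mean-RGB feature vector.
import cv2
import numpy as np


def capture_frame(camera_index: int = 0) -> np.ndarray:
    """Capture a single BGR frame from a USB webcam."""
    cap = cv2.VideoCapture(camera_index)
    try:
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("Could not read a frame from the webcam")
        return frame
    finally:
        cap.release()


def frame_to_feature(frame: np.ndarray) -> np.ndarray:
    """Convert a BGR frame to the 3-element mean-RGB feature used for training."""
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB).astype(np.float32)
    return rgb.reshape(-1, 3).mean(axis=0)


if __name__ == "__main__":
    feature = frame_to_feature(capture_frame())
    print("Mean RGB of captured frame:", feature)
    # A trained classifier (like `model` in the earlier sketch) could then
    # score this feature with model.predict(feature.reshape(1, -1)).
```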

“The proposed system could efficiently detect several conditions that exhibit distinct changes in tongue color, with the accuracy of the trained models exceeding 98 percent,” they write in the study’s conclusion. Across the 60 real-world tongue images, the program achieved an overall accuracy of 96.6 percent.

The researchers believe the experiments demonstrate the feasibility of one day integrating similar or improved AI systems into medical facilities to provide a “safe, efficient, easy-to-use, convenient and cost-effective method of disease screening.”