When the doctor tells you to stick out your tongue and say "ahhh," they're usually using a tongue depressor to move it out of the way for a look at your throat. But the tongue itself can tell a physician a lot about a person's overall health, and now thermal imaging and AI are joining a tongue-diagnosis practice that has been around for centuries.
Traditional Chinese Medicine, or TCM, has been using the tongue as a diagnostic tool for at least 3,000 years.
It observes three criteria to reveal our health: the tongue's color, its shape and the type of coating covering its surface. A healthy tongue, for example, is some shade of pink; a dark red tongue might indicate sleep issues or anxiety, and a bluish tinge could point to poor circulation.
While TCM uses the tongue as a main diagnostic tool, Western medicine might observe the tongue’s condition alongside many other indicators, like medical history and lab results.
As technology develops, however, this gap between the two is narrowing, thanks in particular to thermal imaging and AI-powered tools.
A team of researchers recently introduced an AI health-detection tool designed for TCM that uses thermal-radiation image recognition and applies human-computer interaction (HCI) principles to health-care applications.
Infrared thermography captures detailed tongue images and records tongue-heat distribution to create thermal images that represent temperature variations.
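In concept, thermography maps each temperature reading onto an image intensity, so warmer regions render brighter. A minimal Python sketch of that mapping follows; the temperature range and the tiny grid are illustrative assumptions, not the tool's actual parameters:

```python
def temperatures_to_thermal_image(temps, t_min=30.0, t_max=38.0):
    """Map a 2-D grid of surface temperatures (deg C) onto 0-255
    grayscale intensities, mimicking how infrared thermography
    renders temperature variation as an image."""
    def to_pixel(t):
        t = min(max(t, t_min), t_max)  # clip to the expected range
        return int((t - t_min) / (t_max - t_min) * 255)
    return [[to_pixel(t) for t in row] for row in temps]

# A toy 2x3 "tongue surface": warmer spots become brighter pixels.
grid = [[34.0, 35.0, 36.0],
        [33.0, 36.5, 37.0]]
img = temperatures_to_thermal_image(grid)
```

A real thermal camera does this per sensor element at much higher resolution, often with a false-color palette instead of grayscale.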
The team says its portable, hand-held thermal-radiation diagnostic tool, integrated with HCI and created in collaboration with TCM practitioners, sets its research apart.
The dental-mark tongue-recognition model, built on a DenseNet T architecture, achieved an average accuracy 25 percent higher than that of other dentate tongue-recognition models designed to standardize and automate TCM tongue diagnostics.
Another recent advance in tongue diagnosis leans on AI and machine learning for results.
A paper, published in Technologies, presents a new computer vision system that analyzes tongue color changes, offering potential for real-time diagnosis.
These analyses, combined with machine learning, predicted health conditions with an accuracy exceeding 98 percent.
The researchers used a webcam to capture images in real time of both sick and healthy individuals and were able to differentiate between them simply by tongue color.
The system applies six machine learning algorithms to classify tongue images under a variety of lighting conditions.
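The published system's six algorithms and training data aren't reproduced here, but the core idea of assigning a tongue image to a color class can be sketched as a nearest-centroid classifier in Python. The class names and RGB centroids below are hypothetical placeholders, not values from the paper:

```python
# Hypothetical RGB centroids for a few color classes; the study's
# seven classes were learned from thousands of labeled images.
CLASS_CENTROIDS = {
    "pink":   (220, 150, 160),  # typical healthy shade
    "red":    (200,  60,  70),
    "pale":   (235, 205, 205),
    "yellow": (210, 180, 110),
}

def mean_rgb(pixels):
    """Average color of a list of (r, g, b) pixels from the tongue region."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def classify_tongue(pixels):
    """Assign the tongue patch to the class whose centroid is nearest
    (squared Euclidean distance in RGB space)."""
    r, g, b = mean_rgb(pixels)
    def sq_dist(name):
        cr, cg, cb = CLASS_CENTROIDS[name]
        return (r - cr) ** 2 + (g - cg) ** 2 + (b - cb) ** 2
    return min(CLASS_CENTROIDS, key=sq_dist)
```

A real system would first segment the tongue out of the webcam frame and use a trained classifier rather than fixed centroids.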
“There have been studies where people tried to (diagnose via tongue color) without a controlled lighting environment, but the color is very subjective,” says co-author Javaan Chahl of the University of South Australia.
The model was trained on more than 5,000 images spanning seven color classes. The results show that AI systems for tongue diagnosis can be accurate, efficient, cost-effective and non-invasive, which is particularly important in areas with minimal access to health care. The work also addresses the impact of lighting on the perceived color of the tongue, a key challenge for tongue diagnosis.
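One standard way to reduce lighting's effect on perceived color is a gray-world white balance, which rescales each channel so the image's average color becomes neutral. This is a generic computer-vision technique, not necessarily the one the researchers used; a minimal Python sketch:

```python
def gray_world_correct(pixels):
    """Gray-world white balance: scale each RGB channel so the image's
    mean color becomes neutral gray, removing a uniform color cast
    introduced by the lighting."""
    n = len(pixels)
    avg = [sum(p[i] for p in pixels) / n for i in range(3)]
    gray = sum(avg) / 3
    return [
        tuple(min(255, round(p[i] * gray / (avg[i] or 1))) for i in range(3))
        for p in pixels
    ]
```

Applied before classification, a correction like this makes the same tongue photographed under warm and cool light look more alike.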
So, the next time you're looking in the mirror, take a moment to observe the condition of your tongue and note anything that seems out of the ordinary. Sticking out your tongue at yourself might just help you catch health issues early.