The technology itself is not brand new: it is already used to detect early signs of Parkinson’s disease, with doctors comparing voice samples to pick out the speech peculiarities the disease causes.
Patel says the symptoms of the two neurological disorders are quite similar in their early stages.
"Your voice is really important," she said. "You react to [stimuli] with your voice automatically, without even realizing it… now, the question is if we can use [voices] to indicate other neurological disorders as well."
“If patients can be diagnosed in an early stage of disease, treatment and drug therapies can start at that time, possibly slowing the disease progression,” Patel added.
But current Alzheimer’s diagnostic methods often catch the disease too late for treatment to help.
In Patel’s research, volunteers are connected to an electroencephalogram (EEG) machine and a microphone, and are asked to pronounce sounds while simultaneously listening to themselves through headphones.
“They are asked to maintain a steady sound, but we make it tricky by changing the auditory feedback (the sound they hear) slightly, such as pitch or loudness, and measure their neural and voice responses.”
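The perturbation Patel describes is known in speech research as altered auditory feedback: the participant’s own voice is played back with its pitch nudged slightly up or down. As a rough illustration only (the article does not describe Patel’s actual software, and the function names and parameters below are hypothetical), a pitch shift of a recorded sound can be sketched by resampling the signal:

```python
import numpy as np

SAMPLE_RATE = 44_100  # samples per second (a common audio rate; assumed here)

def make_tone(freq_hz, duration_s, rate=SAMPLE_RATE):
    """Generate a steady sine tone, standing in for a sustained vowel."""
    t = np.arange(int(duration_s * rate)) / rate
    return np.sin(2 * np.pi * freq_hz * t)

def pitch_shift(signal, cents, rate=SAMPLE_RATE):
    """Shift pitch by resampling with linear interpolation.

    `cents` is the size of the perturbation (100 cents = one semitone);
    feedback experiments typically use shifts of this order or smaller.
    """
    factor = 2 ** (cents / 1200)              # frequency ratio of the shift
    old_idx = np.arange(len(signal))          # original sample positions
    new_idx = old_idx * factor                # read the signal "faster"
    new_idx = new_idx[new_idx < len(signal) - 1]
    return np.interp(new_idx, old_idx, signal)

# Example: a 220 Hz tone shifted up by one semitone (~233 Hz)
tone = make_tone(220.0, 1.0)
shifted = pitch_shift(tone, 100)
```

In the real experiment, a shift like this would be applied to the live microphone signal with very low latency before it reaches the headphones, while the EEG and microphone record how the brain and voice compensate.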
Patel also hopes to create a smartphone app that people could someday use to screen themselves, without even visiting a doctor.
According to official data, there are over 44 million Alzheimer’s sufferers around the globe, and only one person in four is aware of his or her illness. The world spends some $605 billion on dementia treatment every year, according to Alzheimers.net, equivalent to one percent of global GDP.