Definitive diagnoses
New test will help improve early detection of cognitive diseases
A new technological twist on a standard test for Alzheimer’s and other cognitive diseases promises to deliver more accurate diagnoses much earlier.
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have combined digital pen technology and computer learning algorithms to create a predictive model for early detection.
For a number of years, doctors have used the Clock Drawing Test (CDT) to screen patients for conditions including Parkinson’s and Alzheimer’s. The test asks the patient to draw a clock face with the hands showing 10 minutes past 11 and then to copy a pre-drawn clock face showing the same time.
The CDT, which is typically done with pen and paper, requires the neurologist to rely on a subjective analysis of the final drawing. In the digital version of the CDT, the MIT researchers replaced the ink pen with an Anoto Live Pen, a digitizing ballpoint pen that can measure its position on the paper up to 80 times per second using a small built-in camera. The digital pen can pick up every one of a subject’s hand movements and hesitations, providing a more complete picture for medical analysts.
Using a database of 2,600 tests administered over the past nine years, the MIT team built computer models that show promise not only for early detection of cognitive impairment but also for determining precisely which condition a patient has. In tests against other standardized diagnostic methods, the machine-learning models were significantly more accurate.
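The researchers have not published their modeling pipeline in this article, but the general idea of training a classifier on pen-derived measurements can be sketched roughly as follows. The feature names, synthetic data, and model choice below are assumptions for illustration only, not CSAIL’s actual method.

```python
# Illustrative sketch only: a toy classifier trained on hypothetical pen-stroke
# features (drawing time, pen lifts, average pause length, stroke count).
# The features, data, and model are assumptions, not the CSAIL pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for digitized Clock Drawing Test records:
# columns = [drawing_time_sec, pen_lifts, mean_pause_sec, stroke_count]
healthy = rng.normal([35, 12, 0.4, 15], [8, 3, 0.1, 3], size=(200, 4))
impaired = rng.normal([70, 20, 1.2, 22], [15, 5, 0.4, 5], size=(200, 4))

X = np.vstack([healthy, impaired])
y = np.array([0] * 200 + [1] * 200)  # 0 = unimpaired, 1 = cognitive impairment

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

In practice, the appeal of a digitized test is exactly that such timing and hesitation features are captured automatically rather than inferred from the finished drawing.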
Having demonstrated the effectiveness of the digital CDT, the researchers are developing an interface that would allow nonspecialists as well as neurologists to use the technology.
“With the right equipment,” says CSAIL principal investigator Cynthia Rudin, “you can get results wherever you want, quickly, and with higher accuracy.”
Unspoken speech
A new device being developed in Britain transforms paralysis sufferers’ breath into speech.
The invention is an Augmentative and Alternative Communication (AAC) device that analyzes changes in breathing patterns and converts “breath signals” into words using pattern recognition software. A digital speech synthesizer then translates those words into audible speech.
The device, developed at Loughborough University, uses a nose and mouth breathing mask to capture users’ breath patterns, allowing them to develop a customized set of phrases by training the computer to associate certain breath signals with those words or phrases.
“When it comes to teaching our invention to recognize words and phrases, we have so far recorded a 97.5 percent success rate,” said David Kerr, a senior lecturer in the School of Mechanical and Manufacturing Engineering. “Current AAC devices are slow and range from paper-based tools to expensive, sophisticated electronic devices. Our AAC device uses analogue signals in continuous form, which should give us a greater speed advantage because more information can be collected in a shorter space of time.”
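As a rough illustration of the kind of pattern recognition described, the sketch below matches a captured breath waveform to the closest trained template and returns the phrase associated with it. The signal shapes, phrase set, and matching rule are assumptions for illustration, not the Loughborough software.

```python
# Illustrative sketch only: match an incoming "breath signal" to the closest
# stored template and return the phrase a user trained against it.
import numpy as np

def normalize(signal: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance scaling so templates compare by shape, not amplitude."""
    return (signal - signal.mean()) / (signal.std() + 1e-9)

# Hypothetical training step: each phrase is paired with a recorded breath pattern.
t = np.linspace(0, 1, 100)
templates = {
    "yes":       normalize(np.sin(2 * np.pi * 1 * t)),        # one slow exhalation
    "no":        normalize(np.sin(2 * np.pi * 3 * t)),        # three short puffs
    "thank you": normalize(np.abs(np.sin(2 * np.pi * 2 * t))),
}

def recognize(breath: np.ndarray) -> str:
    """Return the trained phrase whose template best correlates with the input."""
    breath = normalize(breath)
    scores = {phrase: float(np.dot(breath, tmpl)) for phrase, tmpl in templates.items()}
    return max(scores, key=scores.get)

# Simulated noisy capture of the "no" pattern from the breathing mask.
sample = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(recognize(sample))  # expected: "no"
```

A recognized phrase would then be passed to a speech synthesizer, the final step the article describes.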
Well-known users of AAC devices include Stephen Hawking, who communicates by using a single cheek muscle to control a speech-generating device.
Atul Gaur, a consultant anesthetist who is collaborating with the Loughborough team, believes the device could transform the way people with severe muscular weakness or other speech disorders communicate “by allowing patients, including those on ventilators, to communicate effectively for the first time by breathing—an almost effortless act which requires no speech, limb or facial movements.” —M.C.