Epilepsy is the fourth most common neurological disorder. About 1% of Americans currently have some form of epilepsy, and roughly 1 in 26 will develop it during their lifetime. One of the most dangerous aspects of the disease is that seizures strike suddenly, sometimes with extreme severity, and no one can predict when the next one will occur. A wearable that could detect the early signs of a seizure and alert the patient would go a long way toward softening the impact of convulsions. Researchers at IBM claim they have developed a portable chip that can do exactly that.
The system was built on 16 years' worth of data collected from epileptic patients who had electrodes surgically implanted in their heads. The recordings captured their brain activity in full, from the type and timing of seizures to every variety of brain wave.
Scientists at IBM Research Australia then used these data sets to train a range of deep learning algorithms, namely neural networks. The algorithms learned the varieties of brain activity associated with the onset of seizures. The researchers ran the neural networks on a neuromorphic chip they developed called TrueNorth, which consumes extremely little power and is the size of a postage stamp. The chip could be built into a wearable or fitted into a mobile device.
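The article does not publish IBM's model, but the basic pipeline it describes, slicing an EEG trace into windows, extracting features, and training a classifier to separate normal from pre-seizure activity, can be sketched as follows. This is a minimal illustration with synthetic data and a simple logistic-regression stand-in for the deep networks; the window sizes, features, and signal shapes are all assumptions, not IBM's actual design.

```python
import numpy as np

def windows(signal, size, step):
    """Slice a 1-D EEG trace into overlapping windows."""
    return np.array([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def band_power(win, fs, lo, hi):
    """Mean spectral power of a window in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(win), d=1.0 / fs)
    power = np.abs(np.fft.rfft(win)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def features(wins, fs):
    """Two toy features per window: slow-band power and variance."""
    return np.array([[band_power(w, fs, 1, 8), w.var()] for w in wins])

def train_logreg(X, y, lr=0.1, epochs=500):
    """Plain gradient-descent logistic regression -- a stand-in for
    the deep networks described in the article."""
    X = np.c_[np.ones(len(X)), X]           # bias column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict(w, X):
    X = np.c_[np.ones(len(X)), X]
    return (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(int)

# Synthetic demo: "pre-seizure" windows carry stronger slow waves.
rng = np.random.default_rng(0)
fs = 256                                    # assumed sampling rate, Hz
t = np.arange(fs * 2) / fs
normal = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.2, len(t))
pre    = 2.0 * np.sin(2 * np.pi * 3 * t)  + rng.normal(0, 0.2, len(t))

wn = windows(normal, fs, fs // 2)
wp = windows(pre, fs, fs // 2)
X = features(np.vstack([wn, wp]), fs)
X = (X - X.mean(0)) / X.std(0)              # normalise features
y = np.array([0] * len(wn) + [1] * len(wp))

w = train_logreg(X, y)
acc = (predict(w, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

On this toy data the two classes separate cleanly; the real difficulty, as the article goes on to explain, is that pre-seizure patterns vary between patients and drift over time.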
“One-third of epilepsy patients don’t improve with drugs or other treatment, so for those people, a prediction system could be the only tool that gives them some control over their disease,” said Stefan Harrer, the lead researcher on the project. “In fact, the burden of not knowing when a seizure will occur tends to lead people to avoid socializing, playing sports, traveling or doing anything where they don’t want to get surprised with a seizure. That kind of confinement, in turn, often leads to depression. Our motivation is to put control and knowledge back into their lives.”
However, the chip is still at the proof-of-concept stage and has not been tested on humans. “This is a demonstration of the feasibility of building a verifiable seizure prediction system,” said Harrer, whose group tested the chip in a simulation using the collected data.
The sensitivity of the device can be turned up or down depending on the needs of the individual patient. In high-sensitivity mode the system predicted almost 90% of seizures, but it also issued more false warnings. “We need to reduce the false positive rate before this system can be used as a real device on patients,” said Harrer.
Brain-activity patterns that signal an upcoming seizure are extremely hard to identify. They differ from person to person and can change over a person's lifetime. That means the deep learning algorithms need large amounts of data, and they must become efficient enough to recognize the enormous variety of brain activity and adapt to an individual patient's needs.
IBM is one of several groups working on epileptic seizure prediction systems. The 16 years of EEG data were collected by a team of researchers at the University of Melbourne, who also developed algorithms for predicting the onset of seizures.
“But previous restraints in computing capabilities have limited robustness of these algorithms as well as their real-time capabilities,” stated Harrer.
Because the process required hours of labour, the University of Melbourne team focused on selected intervals of brain activity instead of the whole data set to make the analysis manageable. This reduced the robustness of the predictions for some patients, and it also meant the system could not adapt to a patient's illness and brain-activity patterns in real time.
The IBM-developed system, by contrast, examined the University of Melbourne's entire EEG database and did not rely on features picked out by humans. “We let the algorithm plow through it and find patterns of interest by itself,” said Harrer. “The computer constantly trains and re-trains on those changing patterns over time,” he said, making a real-time prediction system feasible.
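The "constantly trains and re-trains" idea is, in machine-learning terms, online learning: the model takes a small gradient step on every new labelled window instead of being trained once and frozen. The sketch below is a toy illustration of that principle, not IBM's method; the feature drift halfway through the stream stands in for a patient's changing seizure signature.

```python
import numpy as np

class OnlinePredictor:
    """Toy online learner: it updates its weights on every new labelled
    EEG window, so it can track a patient's drifting patterns."""
    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features + 1)
        self.lr = lr

    def _score(self, x):
        x = np.r_[1.0, x]                     # bias term
        return 1.0 / (1.0 + np.exp(-x @ self.w))

    def predict(self, x):
        return int(self._score(x) > 0.5)

    def update(self, x, label):
        """Single stochastic-gradient step on one new window."""
        p = self._score(x)
        self.w -= self.lr * (p - label) * np.r_[1.0, x]

# Drift demo: the feature that signals a seizure flips halfway through,
# mimicking a pattern that changes over a patient's lifetime.
rng = np.random.default_rng(2)
model = OnlinePredictor(n_features=2)
correct = 0
for t in range(2000):
    x = rng.normal(0, 1, 2)
    label = int(x[0] > 0) if t < 1000 else int(x[1] > 0)
    correct += model.predict(x) == label      # predict first, then learn
    model.update(x, label)
print(f"online accuracy: {correct / 2000:.2f}")
```

A frozen model trained only on the first half of the stream would fall to chance after the drift; the continually updated one recovers, which is the property Harrer argues makes real-time prediction feasible.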
IBM would like to improve the algorithm's performance further by exploring other neural network architectures and by incorporating additional factors and biomarkers, Harrer commented. He would also like to find a way to train the algorithms on data collected outside the skull, rather than from electrodes implanted in the brain.
You can read the original research paper here.