Scientists Combine Brainwave Technology With Robotics to Create Bionic Limbs

Many technologies that seemed like science fiction only a few years ago are now becoming realities: advanced prosthetics, exoskeletons, and other brain-computer interfaces. While these advances are encouraging, there are still many issues to work out. Interpreting signals from the brain is extremely difficult. The whole brain can usually be scanned noninvasively, but the result is a noisy stream of information. Very precise areas of the brain can be monitored instead, but that usually requires surgically implanting some kind of device.

In recently presented work, a group based in Switzerland may have a solution to some of these issues (via EurekAlert!). They've been developing a range of interfaces that work on a principle of shared control. Shared control is a relatively new concept in this field, and essentially means that the machines themselves take part in the information processing.

Many prosthetics and other devices simply try to interpret raw neural signals, whether from a peripheral nerve or the brain directly. Software built into these devices decodes those signals, but that only gets you so far. The primary limitation of this approach is that the brain on its own handles only part of the work of movement.
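
To make the idea concrete, here is a rough sketch of what "direct" decoding can look like. Everything in it is illustrative: the signal is synthetic, and the single mu-band feature and fixed threshold stand in for the many channels and trained classifiers a real decoder would use.

```python
import numpy as np

# Illustrative sketch of direct decoding: raw EEG is reduced to one
# band-power feature and mapped straight to a command. The signal and
# threshold are made up for demonstration purposes.

rng = np.random.default_rng(0)
fs = 256                                   # sampling rate in Hz
t = np.arange(fs) / fs                     # one second of data

def mu_band_power(eeg):
    """Mean power in the 8-12 Hz mu band, a classic motor-imagery feature."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    return spectrum[(freqs >= 8) & (freqs <= 12)].mean()

def synthetic_eeg(mu_amplitude):
    """White noise plus a 10 Hz mu rhythm; imagining movement suppresses it."""
    return rng.normal(0, 1, fs) + mu_amplitude * np.sin(2 * np.pi * 10 * t)

THRESHOLD = 10.0  # separates the two synthetic conditions below

for label, amp in [("rest", 2.0), ("imagined movement", 0.2)]:
    power = mu_band_power(synthetic_eeg(amp))
    command = "move" if power < THRESHOLD else "hold"
    print(f"{label}: mu power = {power:.1f} -> command: {command}")
```

Even in this toy version, the command hinges on a single noisy number, which is part of why direct decoding demands so much sustained concentration from the user.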

Sure, our brain, especially the cortex, is responsible for our ultimate decision-making. But turning that decision into action takes much more. The brain stem, the spinal cord, and pathways in the muscles themselves all take part in the information processing. This complexity is what allows us to perform certain tasks in a natural or intuitive way, even when our mind is distracted.

Going directly from brain signals to mechanical motion loses some of the nuance of the brain's normal control, and it requires a great deal of concentration from the user. The various technologies the Swiss team has been developing incorporate sensors and onboard processing. Using input from these sensors, as well as noninvasive EEG input from the user, the interfaces can infer from context what the user is trying to do.
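
Here is a rough sketch of what that blending might look like for a brain-controlled wheelchair. The decoded heading, the confidence score, and the blending rule are all assumptions made for illustration, not the Swiss team's actual algorithm.

```python
import numpy as np

# Illustrative shared control: blend a coarse, noisy heading decoded
# from EEG with obstacle information from onboard rangefinders. The
# names and the blending rule are hypothetical.

def repulsion_from_obstacles(ranges, angles, influence=1.5):
    """Steer away from obstacles closer than `influence` meters."""
    repulsion = 0.0
    for r, a in zip(ranges, angles):
        if r < influence:                  # obstacle close enough to matter
            repulsion -= np.sign(a) * (influence - r) / influence
    return repulsion

def shared_control(decoded_heading, confidence, ranges, angles):
    """Blend the user's decoded intent with sensor-derived context."""
    avoidance = repulsion_from_obstacles(ranges, angles)
    # The machine's share grows as decoder confidence drops or as
    # obstacles get close, so the user can stay loosely attentive.
    machine_weight = max(1 - confidence, min(1.0, abs(avoidance)))
    return (1 - machine_weight) * decoded_heading + machine_weight * avoidance

# User vaguely intends to steer right (+0.8 rad) with modest confidence,
# while the sensors report a wall close on the right-hand side.
heading = shared_control(decoded_heading=0.8, confidence=0.6,
                         ranges=[0.7, 3.0], angles=[0.5, -0.5])
print(f"commanded heading: {heading:+.2f} rad")
```

The point of a design like this is that the machine quietly absorbs the low-level corrections, the way the spinal cord and muscles do for a healthy limb, leaving the user to supply only the high-level intent.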

So far this has resulted in better control in things like prosthetic arms, as well as brain-controlled steering for wheelchairs and telepresence devices. Fine motor skills such as writing were possible with prosthetics, and new environments were easily navigated after only nine training sessions with the interface. More testing is necessary, but the team thinks a brain-controlled wheelchair could become a common device in the future. Shared control, in which the device itself helps interpret context, will likely become a growing trend in brain-computer interfaces and other prosthetic devices.
