This brain-machine interface (BMI) software and equipment learns the functional connection between brain activity and intended physical action. Available technologies require the patient to be able to move in order to establish such a connection, yet patients may have limited or no movement for a variety of reasons, including stroke, paralysis, or amputation. Each year, 800,000 Americans suffer strokes, and as many as 20,000 experience traumatic spinal cord injuries; the military efforts in Iraq and Afghanistan have already resulted in more than 1,700 major amputations among American soldiers. Researchers at the University of Florida have created an architecture that provides the learning necessary to use BMI software and equipment for patients who are physically unable to move. The invention addresses an unmet need for paralyzed patients and patients with other motor impairments, such as amputees and stroke patients, who cannot generate the movement trajectories required for conventional BMI training. The mechanism underlying the architecture is learning control policies from feedback provided by the external environment. Beyond enabling control without movement, this creates a bridge to adapting control in a dynamic environment, a key challenge for brain-machine interfaces and adaptive algorithms in general.
Software and equipment that co-adapt to translate neural activity into physical movement for patients with paralysis or prostheses
This BMI architecture translates neural activity into goal-directed behaviors without first having to map the patient’s movements to the control of computers or prosthetic devices. Available technologies require a patient to physically make movements to train the BMI control system. UF researchers have developed a semi-supervised BMI control architecture that uses reinforcement learning to co-adaptively find the neural-state-to-motor mapping in goal-directed tasks. The algorithm learns from a noisy control signal (the patient’s brain) and a changing environment. Instead of imposing rigid and often movement-based mappings, the system can co-adapt to consistently make the most beneficial decisions. This breakthrough could improve quality of life for the large population of patients who could benefit from prosthetic technologies.
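The abstract does not disclose the specific algorithm, but the core idea, a decoder that learns a mapping from neural states to actions using only reward from goal-directed task outcomes rather than recorded movement trajectories, can be illustrated with a minimal tabular Q-learning sketch. Everything here is an assumption for illustration: the one-bit simulated "neural" signal, the cursor task, and all parameter values are hypothetical stand-ins, not the UF system.

```python
import random

random.seed(0)

ACTIONS = (-1, +1)  # hypothetical discrete decoder outputs: move cursor left/right

def neural_observation(intent, noise=0.2):
    """Simulated one-bit neural feature: usually reflects the user's
    intended direction, but is noisy (a stand-in for recorded activity)."""
    return intent if random.random() > noise else -intent

def run_trial(q, alpha=0.3, gamma=0.9, eps=0.1, max_steps=50):
    """One goal-directed trial. The decoder selects actions from noisy
    neural observations and updates Q-values from environment reward
    alone -- no movement trajectories are ever recorded."""
    cursor, goal = 0, random.choice((-5, 5))
    for _ in range(max_steps):
        intent = 1 if goal > cursor else -1        # user's goal-directed intent
        obs = neural_observation(intent)           # noisy "neural state"
        if random.random() < eps:                  # epsilon-greedy exploration
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(obs, x)])
        cursor += a
        done = cursor == goal
        reward = 1.0 if done else -0.01            # reward comes only from the task
        if done:
            target = reward                        # terminal state: no bootstrap
        else:
            next_obs = neural_observation(1 if goal > cursor else -1)
            target = reward + gamma * max(q[(next_obs, x)] for x in ACTIONS)
        q[(obs, a)] += alpha * (target - q[(obs, a)])
        if done:
            return True
    return False

q = {(o, a): 0.0 for o in (-1, 1) for a in ACTIONS}
successes = [run_trial(q) for _ in range(400)]
# the learned greedy policy should come to follow the decoded intent
policy = {o: max(ACTIONS, key=lambda x: q[(o, x)]) for o in (-1, 1)}
print(policy, sum(successes))
```

Because the reward arrives only when the cursor reaches the goal, the mapping is never trained on example movements, which is the property the abstract highlights for patients who cannot move; the noisy observation and shifting goal position stand in for the noisy brain signal and changing environment the architecture is said to co-adapt to.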