Robotic Arm Control Through Algorithmic Neural Decoding and Augmented Reality Object Detection
A brain-machine interface (BMI) is a device that uses neurally implanted sensors to translate the activity of brain neurons into commands capable of controlling external software or hardware, such as a computer or robotic arm. BMIs are often used as assistive devices for individuals with motor, sensory, and/or verbal impairments. Compared with voice-controlled prosthetics, they can also restore greater independence by sparing users the social discomfort of announcing every intended action aloud. The Andersen Lab’s previous BMIs used the brain signals of paralyzed individuals to control only the speed of a robotic arm as it moved a predefined object between predefined locations.
This project’s BMI incorporated Microsoft’s HoloLens 2 augmented reality headset, allowing users to select an object, its start/end locations, and the action a JACO robotic arm should execute on it. The HoloLens 2’s eye-tracking, image recognition, object detection, AR object positioning, spatial awareness, and pixel-to-real-world coordinate mapping features determined the robotic arm’s trajectory. The desired action (grasp, drink, drop) and its execution speed were identified by neurally decoding a paralyzed individual’s thoughts. Analysis of tasks designed to evaluate candidate control signals for action selection showed higher neuronal firing rates during motor imagery than during nonverbal commands.
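The pixel-to-real-world coordinate mapping mentioned above can be illustrated with a standard pinhole-camera back-projection. This is a simplified sketch, not the HoloLens 2 API: the intrinsics (fx, fy, cx, cy) and the depth value d are assumed, illustrative inputs.

```python
# Sketch: back-project a detected object's pixel coordinates (u, v),
# together with a depth estimate d (meters), into 3D camera-frame
# coordinates. Intrinsic values below are placeholders, not actual
# HoloLens 2 calibration parameters.

def pixel_to_camera(u, v, d, fx, fy, cx, cy):
    """Map pixel (u, v) at depth d to (x, y, z) in the camera frame."""
    x = (u - cx) * d / fx   # horizontal offset scaled by depth
    y = (v - cy) * d / fy   # vertical offset scaled by depth
    z = d                   # depth along the optical axis
    return (x, y, z)

# Example: an object detected at the image center lies on the optical axis.
point = pixel_to_camera(u=320, v=240, d=0.8,
                        fx=500.0, fy=500.0, cx=320.0, cy=240.0)
# point == (0.0, 0.0, 0.8)
```

A full pipeline would additionally transform these camera-frame coordinates into the robotic arm's base frame using the headset's pose from its spatial-awareness tracking.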