Decoding grasp movement from monkey premotor cortex for real-time prosthetic hand control

Authors: Hao YaoYao; Zhang QiaoSheng; Zhang ShaoMin; Zhao Ting; Wang YiWen; Chen WeiDong; Zheng XiaoXiang*
Source: Chinese Science Bulletin, 2013, 58(20): 2512-2520.
DOI:10.1007/s11434-013-5840-0

Abstract

Brain–machine interfaces (BMIs) have demonstrated many successful decodings of arm-related reach movements over the past decades, offering new hope for restoring lost motor function to the disabled. The more sophisticated hand grasp movement, however, which is fundamental and crucial to daily life, has received far less attention. The current state of the art has identified several grasp-related brain areas and reported offline decoding results, but online decoding of grasp movement and real-time neuroprosthetic control have not been systematically investigated. In this study, we recorded neural data from the dorsal premotor cortex (PMd) while a monkey reached for and grasped one of four differently shaped objects following visual cues. The four grasp gesture types, together with an additional resting state, were classified asynchronously using a fuzzy k-nearest neighbor model, and an artificial hand was controlled online using a shared control strategy. The results show that most PMd neurons are tuned to reach and grasp movement, from which we obtained a high average offline decoding accuracy of 97.1%. In the online demonstration, the instantaneous status of the monkey's grasp was extracted successfully to control the artificial hand, with an event-wise accuracy of 85.1%. Overall, our results examine neural firing along the time course of grasping and, for the first time, enable asynchronous neural control of a prosthetic hand, underlining the feasibility of a hand neural prosthesis in BMIs.
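For orientation, the sketch below shows a minimal fuzzy k-nearest neighbor classifier in the general form introduced by Keller et al. (1985), applied to binned firing-rate vectors with five class labels (four grasp shapes plus rest). The feature layout, parameter values (k, fuzziness m), and variable names are illustrative assumptions; this is not the paper's actual implementation or preprocessing pipeline.

```python
import numpy as np

def fuzzy_knn_predict(X_train, y_train, X_query, n_classes=5, k=5, m=2.0):
    """Fuzzy k-NN with crisp training labels.

    X_train : (n_samples, n_features) binned firing-rate vectors
    y_train : (n_samples,) integer class labels (e.g. 0-3 grasp shapes, 4 rest)
    Returns predicted labels and per-class membership scores for each query.
    """
    preds, memberships = [], []
    for x in np.atleast_2d(X_query):
        # Euclidean distances from the query to every training sample
        d = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d)[:k]                            # k nearest neighbors
        w = 1.0 / (d[nn] ** (2.0 / (m - 1.0)) + 1e-12)    # distance-based weights
        u = np.zeros(n_classes)
        for weight, idx in zip(w, nn):
            u[y_train[idx]] += weight                     # accumulate weighted membership
        u /= u.sum()                                      # normalize to memberships
        memberships.append(u)
        preds.append(int(np.argmax(u)))
    return np.array(preds), np.array(memberships)
```

In an asynchronous setting of the kind described above, such a classifier would be run on a sliding window of neural features, with the resting-state class acting as the "no movement" output so that grasp commands are issued only when a grasp class wins; the windowing and any confidence thresholding are design choices not specified by the abstract.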

Full text