Abstract

Background: Robotic-assisted surgery allows surgeons to perform many types of complex operations with greater precision than is possible with conventional surgery. Despite these advantages, in current systems the surgeon must communicate with the device directly and manually. To allow the robot to adjust parameters such as camera position automatically, the system needs to recognize which task the surgeon is performing. Methods: A distance-based time series classification framework was developed that measures the dynamic time warping (DTW) distance between temporal trajectory data of the robot arms and classifies surgical tasks and gestures using a k-nearest neighbor (k-NN) algorithm. Results: Results on real robotic surgery data show that the proposed framework outperformed state-of-the-art methods by up to 9% across three tasks and by 8% across gestures. Conclusion: The proposed framework is robust and accurate. It can therefore be used to develop adaptive control systems that respond more readily to surgeons' needs by anticipating the surgeon's next movements.
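The abstract names the two core components of the method: a DTW distance between arm trajectories and a k-NN classifier over those distances. Below is a minimal sketch of that combination in Python; the trajectory shapes, variable names, and toy data are illustrative assumptions and do not reflect the authors' actual implementation or dataset.

```python
# Sketch of DTW-distance-based k-NN classification of trajectories.
# Assumptions (not from the paper): trajectories are arrays of shape (T, d),
# local cost is Euclidean, and ties are broken by simple majority vote.
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two trajectories of shape (T, d)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    return cost[n, m]

def knn_classify(query, train_trajs, train_labels, k=1):
    """Label a query trajectory by majority vote over its k DTW-nearest neighbors."""
    dists = [dtw_distance(query, t) for t in train_trajs]
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy 3-D "arm trajectories" of varying length for two hypothetical gestures.
    labels = ["A", "A", "B", "B"]
    train = [rng.standard_normal((rng.integers(40, 60), 3)) + (0.0 if y == "A" else 2.0)
             for y in labels]
    query = rng.standard_normal((50, 3)) + 2.0
    print(knn_classify(query, train, labels, k=3))
```

Because DTW aligns sequences of different lengths, no resampling of the trajectories is strictly required before comparison, which is one reason it is a natural distance for gesture data of variable duration.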

  • Publication date: 2017-9