Abstract

This paper presents a new haptic rendering method for streaming point cloud data. It provides haptic rendering of moving physical objects using data obtained from RGB-D cameras, so real-time haptic interaction with moving objects can be achieved using noncontact sensors. The method extends "virtual coupling"-based proxy methods in a way that requires no preprocessing of points and tolerates spatial discontinuities in the point cloud. The key ideas of the algorithm are iterative motion of the proxy with respect to the points, and a variable proxy step size that yields better accuracy for short proxy movements and faster convergence for longer ones. The method provides highly accurate haptic interaction for geometries into which the proxy can physically fit. Another advantage is a significant reduction in the risk of "pop through" during haptic interaction with dynamic point clouds, even in the presence of noise. The method is computationally efficient: it runs in real time on available personal computers without downsampling the point clouds from commercially available depth cameras.
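
To make the variable-step-size idea concrete, the sketch below moves a spherical proxy toward the haptic device position through one frame of a point cloud, shrinking the step near contact. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name `move_proxy`, the brute-force nearest-point search, and the parameters `proxy_radius`, `max_step`, `min_step`, and `max_iters` are all hypothetical choices made for clarity.

```python
import numpy as np

def move_proxy(proxy, device_pos, points, proxy_radius,
               max_step=5e-3, min_step=1e-4, max_iters=200):
    """Illustrative sketch (not the paper's algorithm): step a spherical
    proxy toward the device position, rejecting steps that would push
    cloud points inside the proxy sphere.

    proxy, device_pos : (3,) arrays, metres
    points            : (N, 3) array, one frame of the point cloud (N > 0)
    """
    for _ in range(max_iters):
        offset = device_pos - proxy
        dist = np.linalg.norm(offset)
        if dist < min_step:            # proxy has effectively reached the device
            break
        direction = offset / dist
        step = min(dist, max_step)     # far from contact: take large steps
        while step >= min_step:
            candidate = proxy + step * direction
            # Distance from the candidate proxy centre to the nearest point.
            nearest = np.min(np.linalg.norm(points - candidate, axis=1))
            if nearest >= proxy_radius:
                proxy = candidate      # collision-free: accept the move
                break
            step *= 0.5                # near contact: refine with smaller steps
        else:
            break                      # blocked at finest resolution: rest on surface
    return proxy
```

Under a virtual-coupling scheme of the kind the abstract references, the rendered force would then be a spring pull proportional to `device_pos - proxy`, so the iterative proxy position directly determines the force sent to the haptic device.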

  • Publication date: 2013-09