Abstract

In this paper, we propose a new method for sub-sample estimation of the time displacement between two sampled signals. The algorithm is based on sampled auto- and cross-correlation sequences and operates directly on the sampled signals, without the customary interpolation and fitting procedures. The proposed method was evaluated and compared against four other methods on both simulated and real signals: two based on cross-correlation plus fitting, one based on spline fitting of the input signals, and one based on phase demodulation. On simulated signals, the proposed approach showed similar or better performance, in terms of bias and variance, under almost all tested conditions. The exception was signals with very low SNRs (<10 dB), for which the phase-demodulation and spline-fitting methods presented lower variances. Among the cross-correlation-based methods, our approach produced improved results on signals with high and moderate noise levels. On real data, the proposed approach and three of the four comparison methods were robust. The exception was the phase-demodulation method, which may fail on signals collected in real-world scenarios because it is very sensitive to phase changes caused by oscillations unrelated to the main echoes. This paper introduces a new class of methods, demonstrating that sub-sample delay can be estimated from discrete cross-correlation sequences without interpolation or fitting of the original sampled signals. The proposed approach was robust when applied to real-world signals and had moderate computational complexity compared with the other tested algorithms. Although the new method was tested on ultrasound signals, it can be applied to any time series with observable events.

  • Publication date: 2017-1
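
The abstract does not describe the new algorithm itself, only that it replaces the usual interpolation/fitting step. As a point of reference, the sketch below illustrates one of the comparison baselines the abstract names: discrete cross-correlation followed by three-point parabolic fitting around the correlation peak to recover a sub-sample delay. This is a minimal sketch, assuming NumPy; the function name and the Gaussian test echoes are illustrative choices, not the paper's code.

```python
import numpy as np

def delay_xcorr_parabolic(x, y):
    """Estimate the delay of y relative to x, in samples with sub-sample
    resolution, via cross-correlation plus parabolic peak fitting.
    This is the classic baseline, not the paper's fitting-free method."""
    n = len(x)
    # Full discrete cross-correlation; output index i corresponds to
    # lag k = i - (n - 1).
    r = np.correlate(y, x, mode="full")
    k = int(np.argmax(r))
    # Three-point parabolic interpolation around the integer peak gives
    # the fractional offset of the parabola's vertex.
    if 0 < k < len(r) - 1:
        y0, y1, y2 = r[k - 1], r[k], r[k + 1]
        denom = y0 - 2.0 * y1 + y2
        frac = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    else:
        frac = 0.0
    return (k - (n - 1)) + frac

# Illustrative check: a Gaussian echo delayed by 3.4 samples.
t = np.arange(256, dtype=float)
x = np.exp(-0.5 * ((t - 100.0) / 5.0) ** 2)  # reference echo
y = np.exp(-0.5 * ((t - 103.4) / 5.0) ** 2)  # same echo, delayed 3.4 samples
print(delay_xcorr_parabolic(x, y))           # approximately 3.4
```

The parabolic fit is exact only when the correlation peak is locally quadratic, which is one source of the bias the abstract reports for fitting-based methods; the proposed approach is claimed to avoid this step entirely.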