Abstract

In weak GNSS signal environments, extending the integration time is essential for improving a GNSS receiver's sensitivity. Sufficient coherent integration also helps to mitigate multipath and cross-correlation false locks and to avoid squaring loss. However, the integration time is limited by the navigation message data bits, if present. The maximum likelihood (ML) estimation method has been shown to be the most effective way to estimate the navigation bit boundary locations (i.e., bit synchronization) and subsequently the data bit values (i.e., bit decoding) in the presence of noise alone. In this paper, the performance of ML bit synchronization and decoding is systematically assessed as a function of the number of data bits, Doppler error, and received signal power in different tracking modes (i.e., phase-locked mode and frequency-locked mode). In addition, theoretical performance models of ML bit synchronization and decoding are developed based on statistical theory, and experimental validation of these models and analyses is reported. For GPS L1 C/A signals, it is shown that for ML bit synchronization using 100 data bits, the successful synchronization rate (SSR) can reach about 100% with C/N₀ as low as 20 dB-Hz when there is no Doppler error; the degradation caused by Doppler error is not significant if the error is within 5 Hz, and the maximum tolerable error is 25 Hz. For ML bit decoding, the successful decoding rate (SDR) of a 2-bit sequence can reach about 100% with C/N₀ as low as 25 dB-Hz when there is no Doppler error, and the degradation is not significant if the error is within 2 Hz. Both theoretical and simulation results show that the upper bound of tolerable Doppler error for a 2-bit sequence is 12.5 Hz.
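The 12.5 Hz figure is consistent with the sinc-shaped attenuation of coherent integration: a residual Doppler error f_e scales the correlation amplitude by sinc(f_e T), whose first null lies at f_e = 1/T; for a 2-bit sequence T = 40 ms, the null falls at 25 Hz and the practical bound at half that, 12.5 Hz. As a minimal sketch of one common formulation of the ML bit-synchronization metric (not necessarily the paper's exact estimator; it assumes complex 1 ms prompt correlator outputs, i.e., 20 samples per 20 ms GPS L1 C/A data bit, and all names are illustrative):

```python
import numpy as np

SAMPLES_PER_BIT = 20  # GPS L1 C/A: 20 ms data bit, 1 ms prompt correlator outputs

def ml_bit_sync(prompt, phase_locked=True):
    """Estimate the bit-boundary offset (0..19) from complex 1 ms prompt
    correlator outputs by maximizing an ML-style metric over all candidate
    boundary positions.

    phase_locked=True  -> coherent metric using the real (I) channel, where
                          the data bits reside when the carrier phase is locked.
    phase_locked=False -> noncoherent (squared-magnitude) metric for
                          frequency-locked mode, insensitive to carrier phase.
    """
    n_bits = len(prompt) // SAMPLES_PER_BIT - 1  # drop partial bits at the edges
    metrics = np.empty(SAMPLES_PER_BIT)
    for k in range(SAMPLES_PER_BIT):
        # Partition the outputs into candidate 20 ms bits starting at offset k
        bits = prompt[k:k + n_bits * SAMPLES_PER_BIT].reshape(n_bits, SAMPLES_PER_BIT)
        bit_sums = bits.sum(axis=1)  # coherent sum within each candidate bit
        if phase_locked:
            metrics[k] = np.sum(np.abs(bit_sums.real))
        else:
            metrics[k] = np.sum(np.abs(bit_sums) ** 2)
    return int(np.argmax(metrics))
```

At the true offset, every coherent sum spans a whole data bit and accumulates full signal energy; at a wrong offset, bit transitions split the sums and the metric drops. The noncoherent variant removes the dependence on the unknown carrier phase in frequency-locked mode, at the cost of the squaring loss noted in the abstract.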

  • Publication date: 2014-1-14