Abstract

In Very Long Baseline Interferometry (VLBI), signals from distant radio sources are recorded simultaneously at different antennas in order to investigate the sources' physical properties. The recorded signals are generally modeled as realizations of Gaussian processes whose power is dominated by the system noise at the receiving antennas. The actual signal coming from the radio source can be detected only after cross-correlation of the various data streams. The signals received at each antenna are digitized after low-noise amplification and frequency down-conversion, in order to allow subsequent digital post-processing. The applied quantization is coarse, with generally 1 or 2 bits representing the signal amplitude. In modern applications the sampling is typically performed at a high rate, and subchannels are then generated by filtering, followed by decimation and requantization of the signal streams. The requantized streams are then cross-correlated to extract the physical observables. While the classical effect of quantization has been widely studied in the past, the decorrelation induced by the filtering and requantization process has so far been characterized only experimentally, mainly because of its inherent mathematical complexity. In the present work we analyze this problem and provide algorithms and analytical formulas that predict the induced decorrelation for a wide class of quantization schemes, under the sole assumption of weakly correlated signals, which is typically fulfilled in VLBI and radio astronomy applications.
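As a rough illustration of the classical quantization effect mentioned above, the sketch below (not from the paper) generates two weakly correlated, unit-variance Gaussian streams, applies a standard 4-level (2-bit) quantizer with output levels ±1, ±3, and estimates the resulting loss of cross-correlation. The correlation rho, threshold v0, and sample size are illustrative assumptions; for this quantizer the weak-signal quantization efficiency is known to be roughly 0.88.

```python
# Illustrative sketch (not from the paper): decorrelation of two weakly
# correlated Gaussian streams under 2-bit (4-level) quantization.
# rho, v0, the output levels, and n are assumed, illustrative values.
import numpy as np

rng = np.random.default_rng(0)

def quantize_2bit(x, v0=1.0):
    """4-level quantizer with thresholds -v0, 0, +v0 and output levels
    -3, -1, +1, +3 (a common VLBI convention)."""
    return np.where(x < -v0, -3.0,
           np.where(x < 0.0, -1.0,
           np.where(x < v0,  1.0, 3.0)))

def corr(a, b):
    """Normalized zero-lag cross-correlation coefficient."""
    return np.mean(a * b) / np.sqrt(np.mean(a * a) * np.mean(b * b))

# Two unit-variance Gaussian streams sharing a weak common component,
# mimicking antenna signals dominated by independent system noise.
rho, n = 0.05, 4_000_000
common = rng.standard_normal(n)
x = np.sqrt(1 - rho) * rng.standard_normal(n) + np.sqrt(rho) * common
y = np.sqrt(1 - rho) * rng.standard_normal(n) + np.sqrt(rho) * common

r_analog = corr(x, y)
r_quant = corr(quantize_2bit(x), quantize_2bit(y))

# In the weak-correlation regime, r_quant ~= eta * r_analog, where
# eta < 1 is the quantization efficiency (~0.88 for this quantizer).
print(f"analog correlation:    {r_analog:.5f}")
print(f"quantized correlation: {r_quant:.5f}")
print(f"estimated efficiency:  {r_quant / r_analog:.3f}")
```

Note that this sketch covers only the first-stage coarse quantization; the additional decorrelation from subsequent filtering, decimation, and requantization, which is the subject of the paper, is not modeled here.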

  • Publication date: 2013-03

Full text