Abstract

A general theoretical framework is presented for analyzing information transmission over Gaussian channels with memoryless transceiver distortion, which encompasses various nonlinear distortion models including transmit-side clipping, receive-side analog-to-digital conversion, and others. The framework is based on the so-called generalized mutual information (GMI), and the analysis in particular benefits from the setup of a Gaussian codebook ensemble and nearest-neighbor decoding, for which it is established that the GMI takes a general form analogous to the channel capacity of undistorted Gaussian channels, with a reduced "effective" signal-to-noise ratio (SNR) that depends on the nominal SNR and the distortion model. When applied to specific distortion models, an array of results of engineering relevance is obtained. For channels with transmit-side distortion only, it is shown that a conventional approach, which treats the distorted signal as the sum of the original signal part and an uncorrelated distortion part, achieves the GMI. For channels with output quantization, closed-form expressions are obtained for the effective SNR and the GMI, and related optimization problems are formulated and solved for quantizer design. Finally, super-Nyquist sampling is analyzed within the general framework, and it is shown that sampling beyond the Nyquist rate increases the GMI for all SNR values. For example, with binary symmetric output quantization, information rates exceeding one bit per channel use are achievable by sampling the output at four times the Nyquist rate.
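To make the abstract's central quantity concrete, the following is a minimal Monte Carlo sketch (in Python) of how an effective SNR and the resulting GMI could be estimated for one specific distortion model, a one-bit (binary symmetric) output quantizer. The function name `effective_snr`, the real-valued simplification, the unit noise variance, and the Δ/(1 − Δ) form of the effective SNR are illustrative assumptions made here, not a verbatim restatement of the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_snr(snr_db, n_samples=1_000_000):
    """Estimate an effective SNR for y = sign(x + n), with x ~ N(0, P) and n ~ N(0, 1).

    Assumed form: Delta = E[x*y]^2 / (E[x^2] E[y^2]), effective SNR = Delta / (1 - Delta).
    """
    snr = 10.0 ** (snr_db / 10.0)
    x = rng.normal(scale=np.sqrt(snr), size=n_samples)  # Gaussian codebook symbol
    n = rng.normal(size=n_samples)                       # unit-variance channel noise
    y = np.sign(x + n)                                   # memoryless distortion: 1-bit quantizer
    delta = np.mean(x * y) ** 2 / (np.mean(x ** 2) * np.mean(y ** 2))
    return delta / (1.0 - delta)

for snr_db in (0, 10, 20):
    snr_eff = effective_snr(snr_db)
    gmi = 0.5 * np.log2(1.0 + snr_eff)  # bits per real channel use
    print(f"SNR = {snr_db:2d} dB -> effective SNR = {snr_eff:.3f}, GMI ~ {gmi:.3f} bit")
```

Under these assumptions the estimated GMI saturates below one bit per channel use as the nominal SNR grows, which is consistent with the abstract's point that exceeding one bit with binary output quantization requires sampling faster than the Nyquist rate.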

Full Text