Abstract

This paper presents a loop gain optimization technique for integer-N digital phase-locked loops based on a time-to-digital converter. Owing to its noise-filtering properties, a phase-locked loop has an optimal loop gain that yields the best jitter performance when both external and internal noise sources are taken into account. With the proposed loop gain optimization technique, the digital phase-locked loop automatically attains this optimal loop gain in the background, thereby minimizing the jitter. A theoretical analysis is presented, and the stability and the impact of loop latency are also discussed. Finally, the analysis is compared with behavioral simulations, showing good agreement.
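To illustrate the tradeoff behind the optimal loop gain, the following minimal Python sketch models a toy first-order digital phase-locked loop in which the TDC-side noise contribution grows with the loop gain while the DCO-side noise contribution shrinks with it, so the measured jitter has an interior minimum. The behavioral model, the noise levels (tdc_sigma, dco_sigma), and the finite-difference background adaptation are illustrative assumptions only, not the algorithm or parameters proposed in the paper.

```python
import numpy as np

def jitter_rms(gain, n=20000, tdc_sigma=0.1, dco_sigma=0.02, seed=0):
    """RMS phase error of a toy first-order DPLL (arbitrary units).

    Each step, the TDC reads the phase error plus quantization-like noise,
    the loop applies a proportional correction 'gain', and the DCO adds
    one step of accumulated (random-walk) phase noise.
    """
    rng = np.random.default_rng(seed)
    q = rng.normal(0.0, tdc_sigma, n)   # TDC quantization-like noise
    d = rng.normal(0.0, dco_sigma, n)   # DCO phase-noise increment per step
    err, acc = 0.0, 0.0
    for qk, dk in zip(q, d):
        # err[k+1] = (1 - K) * err[k] - K * q[k] + d[k]
        err = (1.0 - gain) * err - gain * qk + dk
        acc += err * err
    return (acc / n) ** 0.5

def background_gain_adaptation(gain=1.0, steps=40, delta=0.05, mu=0.5):
    """Toy background optimizer (assumed, for illustration only):
    evaluate the jitter at gain +/- delta with the same noise realization,
    then step the gain against the estimated gradient."""
    for i in range(steps):
        j_hi = jitter_rms(min(gain + delta, 1.9), seed=i)
        j_lo = jitter_rms(max(gain - delta, 0.01), seed=i)
        grad = (j_hi - j_lo) / (2.0 * delta)
        gain = float(np.clip(gain - mu * grad, 0.01, 1.9))
    return gain

if __name__ == "__main__":
    sweep = {g: jitter_rms(g, seed=123) for g in np.linspace(0.05, 1.0, 20)}
    print("swept optimum gain ~", round(min(sweep, key=sweep.get), 3))
    print("adapted gain       ~", round(background_gain_adaptation(), 3))
```

In this toy model, the background adaptation settles near the gain found by the brute-force sweep, illustrating how a gradient-style update can reach the jitter-optimal loop gain without an explicit sweep; the paper's own adaptation law and noise model are developed in the analysis sections.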