Abstract

Modeling and analysis of low-frequency noise, such as 1/f and burst noise, under time-varying bias conditions is a long-standing open problem in circuit simulation. In this paper, we offer a solution to this problem and present a computational model for low-frequency noise. The merits of our model are twofold. First, it is fully nonstationary: it can represent noise processes with time-varying statistics that are tightly coupled to the circuit variables in a stochastically correct manner. Second, its mathematical structure allows the use of well-established, non-Monte Carlo noise analysis techniques, which are orders of magnitude faster than their Monte Carlo counterparts. We first provide an overview of our noise model along with legacy modeling techniques. We verify that, when used with fast non-Monte Carlo analysis methods, our noise model produces results with the same accuracy as computationally laborious Monte Carlo simulations. We then contrast our nonstationary noise model with the legacy modulated stationary model. As a typical instance where the results produced by these two models differ significantly due to intricate nonstationary behavior, we analyze the switched biasing technique, which was proposed to reduce low-frequency noise. As a case study, we examine the 1/f noise behavior of a previously proposed coupled sawtooth oscillator circuit and show that, whereas simulations with the legacy modulated stationary model suggest that no noise reduction is obtainable with switched biasing, our nonstationary noise model correctly predicts the noise reduction observed in previously published experimental data.

  • Publication date: 2016-11