Abstract

An improved deep sub-micrometer (90 nm) large-signal model for silicon-based MOSFETs that incorporates a DC/AC dispersion model is proposed. The derived DC model accurately predicts the device current-voltage behavior over a wide range of bias points, and the corresponding extraction method for the model parameters is investigated. The improvements also include new equations describing the nonlinear capacitance behavior in the saturation region with only a few fitting parameters, with particular emphasis on the difficult problems associated with DC/RF dispersion. Model verification is carried out by comparing measured and simulated S-parameters of 90 nm gate-length MOSFET devices at frequencies up to 50 GHz. Good agreement is obtained between measured and modeled results, and the scalability of the model is also verified in this paper.