Abstract

In this paper, a new method is introduced and investigated for removing the destabilizing effects of the time-delay parameter in control loops. The concept of the method is drawn from knowledge of the dynamic behaviour of irrational transfer functions (Ir-TFs), which is discussed and investigated elsewhere in the frequency-response domain and is explained briefly here. Ir-TFs, which are well suited to representing the model structure of a wide range of distributed-parameter process systems, are known to have transcendental characteristics in their frequency responses. The main complexity of these systems lies in their phase behaviour, which can represent a complete time-delay characteristic as well as characteristics in which the effect of the time delay is much more limited. The conditions under which these dual phase characteristics appear can guide the synthesis of a control loop in which the non-minimum-phase dynamics of the open-loop transfer function are removed. This concept, when applied to a simple loop through a suitable predictor, affects the robustness features of the loop in a desirable manner and improves its stability characteristics, provided that the required conditions for the predictor are established. In addition to this important robustness property, the proposed time-delay compensator offers further advantages and specific properties in comparison with the conventional Smith predictor, namely the capability to control processes modelled by irrational transfer functions as well as integrating processes that include a time-delay parameter.
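The following minimal sketch is illustrative only and is not the paper's own model: it contrasts the open-loop phase of a pure time delay with that of a representative irrational transfer function, the classical semi-infinite heat-conduction element G(s) = exp(-sqrt(sT)). The delay contributes a phase lag that grows linearly with frequency, whereas the irrational element's lag grows only as the square root of frequency, illustrating the "complete" versus "limited" delay-like phase behaviour described above. All parameter values and the choice of Ir-TF are assumptions for illustration.

```python
import numpy as np

tau = 1.0   # assumed time delay of the pure delay term [s]
T = 1.0     # assumed characteristic time of the irrational element [s]

omega = np.logspace(-2, 2, 5)   # frequency grid [rad/s]

# Unwrapped phase of a pure delay e^{-tau*s}: linear in frequency.
phase_delay = -tau * omega

# Unwrapped phase of the irrational element e^{-sqrt(sT)} at s = j*omega:
# sqrt(j*omega*T) = sqrt(omega*T/2)*(1 + j), so the phase is -sqrt(omega*T/2).
phase_irrational = -np.sqrt(omega * T / 2.0)

for w, pd, pi_ in zip(omega, phase_delay, phase_irrational):
    print(f"w = {w:8.2f} rad/s | pure-delay phase = {pd:9.3f} rad"
          f" | irrational phase = {pi_:7.3f} rad")
```

At high frequencies the pure delay's phase lag dominates, while at low frequencies the two elements are comparable; this frequency-dependent difference is the kind of dual phase behaviour the abstract refers to.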

  • Publication date: 2008