Abstract

This paper investigates delay-dependent H∞ filter design for continuous-time systems with time delay. Attention is focused on the design of linear filters guaranteeing a prescribed noise attenuation level in the H∞ sense. Admissible filters are obtained by solving a convex optimization problem formulated in terms of linear matrix inequalities (LMIs), which can be readily solved with standard software. The crucial ingredient of the proposed design is the delay partitioning idea, which yields conditions less conservative than most existing results; the conservatism can be reduced further by refining the delay partition. Numerical examples demonstrate the effectiveness and advantages of the proposed filter design method.
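The abstract does not reproduce the paper's design LMIs, so the following is only a minimal, generic sketch of the kind of convex feasibility check that underlies such conditions: a Lyapunov inequality AᵀP + PA ≺ 0, P ≻ 0, solved here as an equation via SciPy. The matrix A below is a hypothetical stable example, not a system from the paper; a full filter design would instead pose the paper's delay-partitioned LMIs to a semidefinite-programming solver.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable (Hurwitz) system matrix, chosen only for illustration.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)  # any positive definite right-hand side

# Solve the Lyapunov equation A^T P + P A = -Q for P.
P = solve_continuous_lyapunov(A.T, -Q)

# Feasibility check: A is Hurwitz iff the solution P is positive definite,
# which certifies the Lyapunov inequality A^T P + P A < 0 with P > 0.
eigs = np.linalg.eigvalsh(P)
print("P positive definite:", bool(np.all(eigs > 0)))
```

For the LMI conditions in the paper itself, one would use a dedicated SDP front end (e.g. CVXPY or YALMIP) rather than an equation solver, since the filter matrices enter as decision variables.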