Abstract

Minimax robust hypothesis testing is studied for the cases where the collected data samples are either corrupted by outliers or mismodeled due to modeling errors. For the former case, Huber's clipped likelihood ratio test is introduced and analyzed. For the latter case, first, a robust hypothesis testing scheme based on the Kullback-Leibler divergence is designed; this approach generalizes a previous work by Levy. Second, Dabak and Johnson's asymptotically robust test is introduced, and other possible designs based on f-divergences are investigated. All proposed and analyzed robust tests are extended to fixed-sample-size and sequential probability ratio tests. Simulations are provided to exemplify and evaluate the theoretical derivations.
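As a rough illustration of the clipped likelihood ratio idea mentioned above, the following minimal Python sketch censors the per-sample likelihood ratio between a lower and an upper threshold before accumulating it into a fixed-sample-size decision statistic. The Gaussian nominal densities, the clipping thresholds `c_lower`/`c_upper`, and the decision threshold are illustrative assumptions chosen here for the example; in Huber's framework they would be determined by the contamination model and the design criteria developed in the paper.

```python
import numpy as np
from scipy.stats import norm

def clipped_llr(x, f0, f1, c_lower, c_upper):
    """Clip the per-sample likelihood ratio f1/f0 to [c_lower, c_upper]
    and return the log of the clipped ratio (Huber-style censoring)."""
    lr = f1(x) / f0(x)
    return np.log(np.clip(lr, c_lower, c_upper))

def clipped_lr_test(samples, f0, f1, c_lower, c_upper, threshold):
    """Fixed-sample-size test: decide H1 if the summed clipped
    log-likelihood ratio exceeds the threshold, otherwise decide H0."""
    stat = np.sum(clipped_llr(samples, f0, f1, c_lower, c_upper))
    return int(stat > threshold)  # 1 -> H1, 0 -> H0

# Illustrative example (assumed, not from the paper):
# nominal densities N(0,1) under H0 and N(1,1) under H1.
f0 = lambda x: norm.pdf(x, loc=0.0, scale=1.0)
f1 = lambda x: norm.pdf(x, loc=1.0, scale=1.0)

rng = np.random.default_rng(0)
samples = rng.normal(loc=1.0, scale=1.0, size=100)  # data drawn under H1
print(clipped_lr_test(samples, f0, f1, c_lower=0.5, c_upper=2.0, threshold=0.0))
```

The clipping bounds the influence of any single outlying sample on the test statistic, which is the mechanism that yields the minimax robustness discussed in the abstract; the same censored statistic can also be accumulated sequentially, as in the sequential probability ratio test extensions mentioned above.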

  • Publication date: 2017-9