Abstract

We give an algorithm to learn an intersection of k halfspaces in R^n whose normals span an l-dimensional subspace. For any input distribution with a logconcave density such that the bounding hyperplanes of the k halfspaces pass through its mean, the algorithm (epsilon, delta)-learns with time and sample complexity bounded by (nkl/epsilon)^{O(l)} log(1/(epsilon*delta)). The hypothesis found is an intersection of O(k log(1/epsilon)) halfspaces. This improves on the algorithm of Blum and Kannan for the uniform distribution over a ball, both in time and sample complexity (previously doubly exponential) and in the generality of the input distribution.

  • Publication date: 2010-10