Learning Intersections of Halfspaces
Adam Klivans Toyota Technological Institute at Chicago and University of Texas at Austin
Friday, November 19, 2004 2:30, 1221 CS
Learning a halfspace is one of the oldest and most fundamental tasks in
machine learning. Polynomial-time algorithms for learning
halfspaces are at the heart of many systems used by practitioners from
fields such as artificial intelligence, statistics, and computational
biology to classify data.
Surprisingly, learning the intersection of even two halfspaces in
polynomial time remains a difficult open problem. Intersections of
halfspaces form a powerful concept class: they can express any convex
polytope.
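As a concrete illustration of the concept class (a minimal sketch, not from the talk: the function names `halfspace` and `intersect` are hypothetical), a point is labeled positive by an intersection of halfspaces exactly when every constituent halfspace labels it positive:

```python
def halfspace(w, theta):
    """Classifier for the halfspace {x : w . x >= theta}, returning +/-1."""
    def h(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else -1
    return h

def intersect(classifiers):
    """Label +1 iff every halfspace labels +1, i.e. x lies in the
    convex region bounded by all of them."""
    def f(x):
        return 1 if all(h(x) == 1 for h in classifiers) else -1
    return f

# Example: the intersection of {x0 >= 0} and {x1 >= 0} is positive
# exactly on the first quadrant.
quadrant = intersect([halfspace([1, 0], 0), halfspace([0, 1], 0)])
```

The learning problem is to recover such a classifier from labeled examples; even for two halfspaces, no polynomial-time algorithm is known in general.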
In this talk, we will present two new algorithms for learning
intersections of halfspaces under two natural restrictions. First, we
give an algorithm for learning the intersection of two halfspaces under
any distribution that respects a sufficiently large margin. Our
algorithm combines random projection with polynomial threshold functions
and can tolerate a margin of $\rho = \frac{1}{2^{\sqrt{\log n}}}$, where $n$
is the dimension. Second, we give the first polynomial-time algorithm
for learning any function of a constant number of halfspaces with respect
to the uniform distribution. Here we make use of new properties of the
Fourier spectrum of threshold functions.
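To give a flavor of the Fourier machinery (a minimal sketch, not the talk's algorithm): under the uniform distribution on $\{-1,1\}^n$, every Boolean function has Fourier coefficients $\hat{f}(S) = \mathbf{E}_x[f(x)\prod_{i\in S} x_i]$, and uniform-distribution learning algorithms work by estimating these coefficients. The helper below computes them exactly by enumeration for a small threshold function, the majority of three bits:

```python
from itertools import product

def fourier_coefficient(f, S, n):
    """Exact Fourier coefficient f^(S) = E_x[f(x) * prod_{i in S} x_i]
    under the uniform distribution on {-1,1}^n, by full enumeration."""
    total = 0
    for x in product([-1, 1], repeat=n):
        chi = 1
        for i in S:
            chi *= x[i]  # the parity (character) on the coordinates in S
        total += f(x) * chi
    return total / 2 ** n

def maj3(x):
    """Majority of three +/-1 bits: a simple threshold function."""
    return 1 if sum(x) > 0 else -1
```

For `maj3` each singleton coefficient is 1/2 and the top coefficient is -1/2, so most of the Fourier weight sits on degree-1 sets; concentration of this kind on low-degree coefficients is what makes threshold functions learnable from uniform examples.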
This talk is based on joint work with Ryan O'Donnell and Rocco Servedio.