Geometry and Topology in Dimension Reduction

Monday, March 26, 2012
4:30pm to 5:30pm
In the first part of the talk we describe how learning the gradient of a regression function can be used for supervised dimension reduction (SDR). We present an algorithm for learning gradients in high-dimensional data, give theoretical guarantees for the algorithm, and offer a statistical interpretation. Comparisons to other methods on real and simulated data are presented. In the second part of the talk we present preliminary results on using the Laplacian on forms for dimension reduction. This involves understanding higher-order versions of the isoperimetric inequality for both manifolds and abstract simplicial complexes.
Sayan Mukherjee
Duke University
Event Location: 
Fine Hall 214