Matrix perturbation bounds with random noise and applications

Van Vu, Yale University
Fine Hall 214

Classical matrix perturbation bounds, such as Weyl's bound (for eigenvalues) and the Davis-Kahan bound (for eigenvectors), have been fundamental in many areas: numerical analysis, combinatorics, theoretical computer science, statistics, machine learning, etc.

In this talk, I am going to discuss a popular modern setting where the perturbation is random and the original matrix (the data) is low rank (or approximately low rank). In this case, we can significantly improve the classical bounds mentioned above, replacing parameters depending on the dimension of the matrix by parameters depending only on its rank.

As applications, I will discuss a simple and robust approach to several matrix recovery problems, such as the hidden clique problem, the hidden coloring problem, the clustering problem, the Netflix problem, etc. In various cases, our method leads to results beyond what is currently known.

Joint work with S. O'Rourke (Yale) and K. Wang (IMA).
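As a rough numerical illustration of the setting in the abstract (a sketch only: the dimensions, eigenvalues, and noise model below are hypothetical choices, not taken from the talk), one can perturb a low-rank symmetric matrix by symmetric Gaussian noise and observe that its leading eigenvector barely moves, in the spirit of a Davis-Kahan sin-theta bound:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 500, 3  # hypothetical: large ambient dimension, small rank

# Build a random rank-r symmetric "data" matrix with well-separated
# eigenvalues 300 > 200 > 100 (all other eigenvalues are zero).
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
signal = U @ np.diag([300.0, 200.0, 100.0]) @ U.T

# Symmetric Gaussian noise whose entries have variance of order 1.
G = rng.standard_normal((n, n))
E = (G + G.T) / np.sqrt(2.0)

# Leading eigenvector of the clean matrix and of the noisy matrix
# (np.linalg.eigh returns eigenvalues in ascending order, so the
# last column corresponds to the largest eigenvalue).
v_clean = np.linalg.eigh(signal)[1][:, -1]
v_noisy = np.linalg.eigh(signal + E)[1][:, -1]

# Sine of the angle between the two leading eigenvectors; small
# values mean the eigenvector is stable under the random noise.
cos_theta = min(1.0, abs(v_clean @ v_noisy))
sin_theta = np.sqrt(1.0 - cos_theta**2)
print(sin_theta)
```

Here the noise has operator norm of order sqrt(n), yet the leading eigenvector is recovered to small angular error; quantifying how this error depends on the rank rather than the dimension is the kind of improvement the talk discusses.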