Efficient computational methods for nonconvex optimization

Sheng Xu, Yale University

Online Talk 

Zoom Meeting ID: 956 8252 2304

This talk will discuss two types of nonconvex optimization problems, induced by sparsity and mixture assumptions respectively, and present a computational method that solves each of them efficiently:

  1. We aim to estimate a gradient-sparse parameter on a general graph. We introduce a tree-projected gradient descent algorithm and show that the resulting estimators achieve rate-optimal statistical guarantees (a schematic sketch of the projection-based iteration appears after this list).
  2. Motivated by applications to single-particle cryo-EM, we consider problems of function estimation from multiple independently rotated observations in a low signal-to-noise ratio (SNR) regime. We study the unregularized maximum likelihood estimator (MLE) and characterize the geometry of the Fisher information and of the log-likelihood landscape, showing the existence of spurious local optima under the continuous multi-reference alignment model. We further propose a frequency marching algorithm that recovers the signal without a good initial guess (a simplified sketch follows below).
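
As a rough illustration of the first method, the sketch below (in Python, using NumPy and NetworkX, both assumed here) runs a projected gradient descent loop for a denoising-type squared loss. The tree projection is replaced by a simple heuristic that cuts the k tree edges with the largest signal differences and averages within the resulting pieces; the algorithm presented in the talk uses an exact projection and more general losses, which are not reproduced here.

```python
import numpy as np
import networkx as nx

def heuristic_tree_projection(theta, tree, k):
    """Heuristic stand-in for an exact tree projection: cut the k tree edges
    with the largest signal differences, then average theta within each
    remaining connected piece, producing a signal whose tree-gradient has at
    most k nonzero entries. Nodes are assumed to be labeled 0..n-1."""
    edges = list(tree.edges())
    diffs = np.array([abs(theta[u] - theta[v]) for u, v in edges])
    cut = set(edges[i] for i in np.argsort(diffs)[-k:]) if k > 0 else set()
    pruned = nx.Graph()
    pruned.add_nodes_from(tree.nodes())
    pruned.add_edges_from(e for e in edges if e not in cut)
    proj = theta.copy()
    for comp in nx.connected_components(pruned):
        idx = list(comp)
        proj[idx] = theta[idx].mean()
    return proj

def tree_projected_gradient_descent(y, tree, k, step=0.5, n_iters=50):
    """Projected gradient descent for the squared loss 0.5 * ||theta - y||^2:
    a gradient step followed by the (heuristic) tree projection."""
    theta = np.zeros_like(y)
    for _ in range(n_iters):
        grad = theta - y                      # gradient of the squared loss
        theta = heuristic_tree_projection(theta - step * grad, tree, k)
    return theta
```

One could call `tree_projected_gradient_descent(y, nx.minimum_spanning_tree(graph), k)` with `y` a noisy node signal; the choice of spanning tree and of the sparsity level `k` are tuning decisions left open in this sketch.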
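For the second method, the sketch below gives a highly simplified, one-dimensional version of frequency marching for the continuous multi-reference alignment model: Fourier coefficients are recovered one frequency at a time, aligning each observation against the frequencies already recovered before averaging in the next one. The discretization of the rotation group, the power-spectrum initialization, and the `noise_power` argument are illustrative assumptions, not details taken from the talk.

```python
import numpy as np

def frequency_marching(obs_fft, max_freq, noise_power=0.0, n_shifts=360):
    """Schematic frequency-marching recovery for 1-D multi-reference
    alignment. obs_fft: (n, L) array of FFTs of randomly rotated, noisy
    copies of a periodic signal. Coefficients are recovered one frequency at
    a time, aligning each observation with the frequencies already recovered
    before averaging in the next one."""
    n, L = obs_fft.shape
    est = np.zeros(L, dtype=complex)
    est[0] = obs_fft[:, 0].mean()          # frequency 0 is rotation-invariant
    shifts = 2 * np.pi * np.arange(n_shifts) / n_shifts
    for m in range(1, max_freq + 1):
        if m == 1:
            # The signal is identifiable only up to a global rotation, so fix
            # the phase of the first coefficient to zero and take its
            # magnitude from a (debiased) power spectrum.
            power = np.mean(np.abs(obs_fft[:, 1]) ** 2) - noise_power
            est[1] = np.sqrt(max(power, 0.0))
            continue
        freqs = np.arange(1, m)
        aligned = np.empty(n, dtype=complex)
        for i in range(n):
            # Best circular shift of observation i against the current
            # low-frequency estimate (frequencies 1..m-1).
            scores = np.real(
                np.exp(1j * np.outer(shifts, freqs))
                @ (obs_fft[i, 1:m] * np.conj(est[1:m]))
            )
            s_hat = shifts[np.argmax(scores)]
            aligned[i] = obs_fft[i, m] * np.exp(1j * m * s_hat)
        est[m] = aligned.mean()
    return est
```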