MINDS Symposium on the Foundations of Data Science

February 13, 2020

February 18, 2020 @ 3:00 pm – 4:00 pm
Hodson 310

Kaizheng Wang

Title: Latent variable models: spectral methods and non-convex optimization

Abstract: Latent variable models lay the statistical foundation for data science problems involving unstructured, incomplete, and heterogeneous information. For computational efficiency, heuristic algorithms are often used to extract latent low-dimensional structures for downstream tasks. Despite their great success in practice, theoretical understanding lags far behind, and this gap hinders further advancement. In this talk, I will first present an L_p theory of eigenvector analysis that yields optimal recovery guarantees for spectral methods in many challenging problems. I will then present a general framework for clustering based on non-convex optimization and study its theoretical guarantees under statistical models. The results find applications in dimensionality reduction, mixture models, network analysis, recommendation systems, ranking, and beyond.
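To illustrate the flavor of the spectral methods mentioned in the abstract, the sketch below runs a textbook spectral clustering step on a two-community stochastic block model: the sign pattern of the second eigenvector of the adjacency matrix recovers the hidden communities. This is a minimal, generic illustration under assumed parameters (30 nodes per community, edge probabilities 0.7 within and 0.1 between), not the specific algorithms or guarantees presented in the talk.

```python
# Minimal sketch: spectral recovery of communities in a stochastic block
# model. Illustrative only; parameters below are assumptions, not from
# the talk.
import numpy as np

rng = np.random.default_rng(0)
n = 60  # 30 nodes per community
p, q = 0.7, 0.1  # within- and between-community edge probabilities

# Build a block-structured edge-probability matrix, then sample a
# symmetric adjacency matrix with no self-loops.
labels = np.array([0] * 30 + [1] * 30)
probs = np.where(labels[:, None] == labels[None, :], p, q)
upper = np.triu(rng.random((n, n)) < probs, k=1)
A = (upper | upper.T).astype(float)

# Spectral step: eigenvectors of A reveal the latent low-dimensional
# structure; the second eigenvector separates the two communities.
eigvals, eigvecs = np.linalg.eigh(A)  # eigenvalues in ascending order
v2 = eigvecs[:, -2]  # eigenvector of the second-largest eigenvalue

# Cluster nodes by the sign of the second eigenvector, then score
# against the true labels (up to label swap).
pred = (v2 > 0).astype(int)
accuracy = max(np.mean(pred == labels), np.mean(pred != labels))
```

With this much separation between p and q, the sign of the second eigenvector recovers nearly every node; entrywise (L_infinity-type) eigenvector analysis of the kind the abstract alludes to is what makes such exact-recovery statements rigorous.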

Biography: Kaizheng Wang is a fifth-year PhD student in Operations Research and Financial Engineering at Princeton University. His research interests lie at the intersection of statistics, machine learning, and optimization, with a particular focus on the statistical understanding of efficient algorithms for unsupervised learning. He is a recipient of the honorific Harold W. Dodds Fellowship at Princeton University.
