(Photo by SL, at Maroon Bells, Aspen, September 2016)
I am looking for self-motivated students who are interested in data science, machine learning, optimization, and signal processing. If you are interested in joining my lab, please send me your CV, transcripts, and any supporting materials such as papers or drafts.
I am an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Denver. Previously, I was a Postdoctoral Fellow in the Mathematical Institute for Data Science at The Johns Hopkins University. I obtained my Ph.D. degree in Electrical Engineering from the Colorado School of Mines in 2017. My research focuses on the interplay among signal processing, data analysis, and machine learning, using tools from optimization, approximation theory, and harmonic analysis. I am especially interested in efficient and reliable methods for extracting useful information from large-scale, high-dimensional signals and data. Current projects include deep neural networks for unsupervised learning and inverse problems; landscape analysis of (nonsmooth) nonconvex optimization problems; efficient and provably correct algorithms in offline, stochastic, and distributed settings that exploit geometric properties; and the theoretical understanding of deep learning.
News:
[Nov 2020] Co-organizing the IEEE workshop ‘‘Seeking Low-dimensionality in Deep Neural Networks (SLowDNN)’’, Nov. 23–24, 2020.
[Sep 2020] Our paper characterizing the implicit bias of discrepant learning rates and building connections between over-parameterization, RPCA, and deep neural networks has been accepted to NeurIPS as a spotlight (top 4%).
[Jun 2020] Our proposal (with Jere at JHU) ‘‘Collaborative Research: CIF: Small: Deep Sparse Models: Analysis and Algorithms’’ has been funded by NSF!
[Jun 2020] Two papers on over-parameterization are on arXiv: one studies the benefits of over-realized models in dictionary learning; the other characterizes the implicit bias of discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks.
[Feb 2020] Our paper on robust homography estimation has been accepted to CVPR 2020.
[Jan 2020] Our two-session mini-symposium ‘‘Recent Advances in Optimization Methods for Signal Processing and Machine Learning’’, co-organized with Qing and Shuyang, has been accepted to the inaugural SIAM Conference on Mathematics of Data Science. See you in Cincinnati, Ohio, in May!
[Jan 2020] Our review paper (with Qing, Xiao, Manolis, John, and Rene) ‘‘Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and Applications’’ is on arXiv.
[Jan 2020] Invited talk at Statistics, Optimization and Machine Learning Seminar, University of Colorado Boulder.
[Jan 2020] Invited talk at Colorado School of Mines.
[Dec 2019] Our paper ‘‘Nonconvex Robust Low-rank Matrix Recovery’’ has been accepted by the SIAM Journal on Optimization.
[Dec 2019] Our paper ‘‘Analysis of the Optimization Landscapes for Overcomplete Representation Learning’’ was accepted to ICLR 2020 and selected for oral presentation. In this paper, we show that the optimization landscapes for learning overcomplete/convolutional dictionaries are benign, ensuring that simple gradient descent finds the targeted solutions.
[Dec 2019] Attended NeurIPS 2019 and presented 3 papers.
[Dec 2019] Invited talk at Signal and Information Processing Seminar, Rutgers University.
[Nov 2019] Our paper (with Xiao, Shixiang, Zengde, Qing, and Anthony) ‘‘Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods’’ is on arXiv. This work provides the first explicit convergence rate guarantees for a family of Riemannian subgradient methods used to optimize nonsmooth functions (those that are weakly convex in the Euclidean space) over the Stiefel manifold.
[Oct 2019] Attended the Northrop Grumman University Research Symposium, and presented our work on ‘‘Object Identification with Less Supervision’’.
[Oct 2019] Attended the Computational Imaging workshop at IMA, University of Minnesota, and presented our work on ‘‘A Linearly Convergent Method for Non-smooth Non-convex Optimization on Grassmannian with Applications to Robust Subspace and Dictionary Learning’’.
[Sep 2019] Gave an invited talk on ‘‘Provable Nonconvex Approaches for Low-rank Models’’ at the Workshop on Low-Rank Models and Applications (LRMA), University of Mons, Belgium, Sep 12–13, 2019.
[Sep 2019] 3 papers accepted to NeurIPS 2019.
[Aug 2019] Gave an invited talk at ICCOPT 2019, the Sixth International Conference on Continuous Optimization, Technical University (TU) of Berlin, Aug 3 – 8, 2019.
[Aug 2019] Our paper (with Xiao, Anthony, and Jason) ‘‘Incremental Methods for Weakly Convex Optimization’’ is on arXiv. This work provides the first convergence guarantees for incremental algorithms and their random-shuffling versions (including the incremental subgradient method, a workhorse of deep learning) for solving weakly convex optimization problems, which can be nonconvex and nonsmooth.
[Aug 2019] Our paper (with Qing and Xiao) ‘‘A Nonconvex Approach for Exact and Efficient Multichannel Sparse Blind Deconvolution’’ is on arXiv. This work considers the multichannel sparse blind deconvolution problem and provides efficient first-order methods that solve it exactly at a linear rate.