I am a postdoc in the OPTIM lab at EPFL, working with Nicolas Boumal.
I graduated from Georgia Tech in May 2022 with a Ph.D. in Electrical and Computer Engineering, advised by Mark Davenport.
My doctoral research was in learning theory and high-dimensional statistics. My postdoc work has shifted focus toward the specific optimization problems that arise in these settings. Most of my work falls under the broad umbrella of understanding how problem structure affects
- how many measurements/samples we need to make useful predictions or inferences,
- how much error/corruption we can expect due to noise or other factors, and
- how difficult the associated optimization problems are to solve.
Here is a non-exhaustive list of topics I've worked on recently:
- Nonconvex optimization landscapes arising from statistics and machine learning problems
- Low-rank (noisy) matrix completion via convex optimization
- Convex optimization for nonlinear recovery via lifting
- Reproducing kernel Hilbert space (RKHS) methods
- (Sparse) phase retrieval
- (Sparse) principal component analysis (PCA)
- Interpolation with noise (a.k.a. "Benign overfitting" or "Harmless interpolation")
- Classification theory (and how it differs from regression)
- Regression on a manifold domain