Andrew D. McRae

I am a postdoc in the OPTIM lab at the EPFL Institute of Mathematics, advised by Nicolas Boumal. Prior to that, I graduated from Georgia Tech in May 2022 with a Ph.D. in Electrical and Computer Engineering, advised by Mark Davenport.

I am broadly interested in the theory of high-dimensional statistics, signal processing, and machine learning; my current research focuses on the optimization and computational complexity aspects of such problems. For many interesting estimation problems, the most natural estimator requires solving a nonconvex or otherwise seemingly intractable optimization problem. In practice, however, these optimization problems are often solved well (sometimes exactly) by convex relaxations or even by direct nonconvex approaches (see, e.g., my recent work on synchronization problems, which resemble the famous NP-hard max-cut problem). I study how the data models arising in applications influence the difficulty of the associated optimization problems. I consider both convex and nonconvex formulations and, in particular, the relationship between the two: we can often learn a great deal about a nonconvex problem by studying a convex relaxation of it. For more information on my work, see my publications page or Google Scholar profile.
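
To make the relaxation idea above concrete, here is the textbook example I have in mind (the classical semidefinite relaxation of max-cut due to Goemans and Williamson, not a result from my own papers). Max-cut on a graph with symmetric edge weights asks for a sign vector maximizing the total weight of cut edges; the relaxation replaces the rank-one matrix \(xx^\top\) by any positive semidefinite matrix with unit diagonal:
\[
\max_{x \in \{\pm 1\}^n}\; \frac{1}{4}\sum_{i,j} w_{ij}\,(1 - x_i x_j)
\qquad\leadsto\qquad
\max_{X \succeq 0,\ \operatorname{diag}(X) = \mathbf{1}}\; \frac{1}{4}\sum_{i,j} w_{ij}\,(1 - X_{ij}).
\]
When the relaxation happens to be tight (its optimizer is rank one), solving the convex problem solves the nonconvex one exactly; much of my work asks when the statistical structure of the data makes this kind of tightness, or the good behavior of direct nonconvex methods, typical rather than exceptional.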