Chenyi Zhang
@chenyizhang0802
PhD student @StanfordTheory
ID: 1466972927765753858
https://chenyizhang2000.github.io
04-12-2021 03:29:44
7 Tweets
82 Followers
93 Following
Can we have simple algorithms to escape saddle points in high-dim functions with a better convergence rate? In my #NeurIPS2021 paper with Tongyang Li, we proposed a GD-based algorithm with a polynomial speedup in log n. Join our poster session at spot A3 on Dec 7, 11:30-13:00 EST!
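The tweet only sketches the idea, but the general technique of escaping saddle points with gradient descent can be illustrated by a generic perturbed-GD toy (a textbook-style sketch, NOT the algorithm from the paper; the step size `eta`, perturbation radius `r`, stopping parameters, and the toy objective are all made up for the demo):

```python
import numpy as np

def perturbed_gd(grad, x0, eta=0.05, r=0.01, steps=500, grad_tol=1e-3, seed=0):
    """Generic perturbed gradient descent (illustrative only): when the
    gradient is small -- a candidate stationary point, possibly a saddle --
    add a small random perturbation so the iterate can slide off a strict
    saddle along a negative-curvature direction."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < grad_tol:
            d = rng.normal(size=x.shape)
            x = x + r * d / np.linalg.norm(d)  # random kick off the saddle
        else:
            x = x - eta * g                    # plain gradient step
    return x

# Toy objective f(x, y) = x**4/4 - x**2/2 + y**2:
# strict saddle at (0, 0), local minima at (+/-1, 0).
grad_f = lambda v: np.array([v[0]**3 - v[0], 2.0 * v[1]])

# Starting exactly at the saddle, plain GD would stay stuck forever;
# the random kick lets the iterate escape toward one of the minima.
x_final = perturbed_gd(grad_f, [0.0, 0.0])
```

The design point is that near a strict saddle the gradient carries no escape direction, so some extra mechanism (random perturbation here) is needed; the paper's contribution is a GD-based scheme with a provably better dependence on log n, which this sketch does not capture.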
With Weiyuan Gong and Tongyang Li, we study the robustness of quantum algorithms for d-dim nonconvex optimization with noisy inputs and characterize the domains where they can find an approximate local min with polylog, poly, or exp number of queries in d. arxiv.org/abs/2212.02548
With Tongyang Li, we study quantum lower bounds on finding stationary points of nonconvex functions, and prove that there is no quantum speedup in the following two settings: having access to 1) p-th order derivatives, or 2) stochastic gradients. arxiv.org/abs/2212.03906