Memory-efficient structured convex optimization via extreme point sampling
Sep 14, 2021, 04:00 PM Singapore time (registration will open at 03:50 PM).
Join Zoom Meeting:
Meeting ID: 975 4943 7254
Semidefinite programs (SDPs) are a class of structured convex optimization problems that generalize linear programs. Typically SDPs arise by (approximately) reformulating a low-dimensional (possibly nonconvex) problem as a higher-dimensional convex problem. This “lifting” approach is powerful, leading to the best approximation algorithms known for hard combinatorial optimization problems such as max-cut. However, the resulting SDPs often require much more memory even to store the decision variable than was required to express the original problem. This talk will focus on algorithms that break this memory bottleneck by implicitly representing the decision variable via random samples. In the special case of max-cut on a graph with n vertices, the approach gives an algorithm with O(n) working memory that preserves the approximation guarantees of the full semidefinite programming relaxation.
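To make the lifting concrete: the classical Goemans-Williamson relaxation of max-cut replaces a cut vector x in {-1, +1}^n with an n-by-n positive semidefinite matrix X with unit diagonal, which takes O(n^2) memory to store explicitly. The sketch below (not the speaker's algorithm, just the standard rounding step) illustrates how a factored representation X = V^T V with a small number of rows r brings storage down to O(nr): a random hyperplane through the origin is used to round the columns of V back to a +/-1 cut.

```python
import numpy as np

def hyperplane_round(V, rng=None):
    """Goemans-Williamson hyperplane rounding.

    V is an (r, n) matrix whose unit-norm columns v_i represent a
    feasible SDP solution X = V.T @ V (diag(X) = 1, X >= 0).
    Storing V takes O(n*r) memory versus O(n^2) for X itself.
    Returns a +/-1 cut vector of length n.
    """
    rng = np.random.default_rng() if rng is None else rng
    r, n = V.shape
    g = rng.standard_normal(r)   # random hyperplane normal
    x = np.sign(g @ V)           # side of the hyperplane for each vertex
    x[x == 0] = 1.0              # break exact ties deterministically
    return x

# Toy usage: n = 4 unit vectors embedded in r = 2 dimensions.
V = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
cut = hyperplane_round(V)
```

Here `V` and `hyperplane_round` are illustrative names; the memory-efficient algorithms discussed in the talk avoid even forming such a factorization, instead representing the decision variable implicitly through random samples.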
About the Speaker
James Saunderson is a lecturer in the Department of Electrical and Computer Systems Engineering at Monash University in Melbourne, Australia. He received undergraduate degrees in Mathematics and Electrical Engineering from the University of Melbourne in 2008, and Master's and PhD degrees in Electrical Engineering and Computer Science from MIT in 2011 and 2015, respectively. Before joining Monash, James was a postdoctoral scholar at Caltech and a research fellow at the University of Washington, Seattle. In 2020, James was awarded the SIAM Activity Group on Optimization Best Paper Prize and an Australian Research Council Discovery Early Career Researcher Award. His research interests focus on mathematical optimization and its applications.
For more information about the ESD Seminar, please email email@example.com