ECE Departmental Seminar

Distributed Optimization Methods with Convergence Rate Guarantees over Time-varying Networks

Prof. Jie Lu
ShanghaiTech

Tuesday, 5/28/19, 11:00am
Light Engineering 250

Abstract: To date, only a limited number of distributed optimization algorithms come with guaranteed convergence rates over time-varying networks, and these algorithms restrict the problem to be unconstrained. Motivated by this, we develop a family of distributed Fenchel dual gradient methods for solving constrained convex optimization problems over time-varying undirected networks. The proposed algorithms are constructed by applying weighted gradient methods to (smoothed) Fenchel dual problems, and can be implemented in a fully decentralized fashion. Under a standard network connectivity condition, we provide explicitly evaluable bounds on the rates at which the Fenchel dual gradient methods drive all the nodes to both primal and dual optimality. Compared with the existing algorithms that also have convergence rate guarantees over time-varying networks, the Fenchel dual gradient methods are able to address constrained problems, and their convergence rates scale better with both the network size and the time needed for each node to influence all the others.
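To illustrate the general flavor of dual gradient methods in a decentralized setting (not the talk's specific algorithms), the toy sketch below solves a consensus problem, minimize Σᵢ (xᵢ − aᵢ)²/2 subject to xᵢ = xⱼ on every graph edge, by gradient ascent on the Lagrange (Fenchel) dual. Each node solves a closed-form local subproblem, and each edge updates its multiplier from the disagreement of its two endpoints, so every step uses only neighbor-to-neighbor information. The graph, data, and step size here are illustrative assumptions; the network is also held fixed rather than time-varying.

```python
import numpy as np

# Toy decentralized dual gradient method (illustrative, not the talk's method):
# minimize sum_i (x_i - a_i)^2 / 2  subject to  x_i = x_j  on each edge.
a = np.array([1.0, 4.0, 7.0, 10.0])       # local data held by each node
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # undirected ring graph
lam = {e: 0.0 for e in edges}             # one dual multiplier per edge
alpha = 0.2                               # dual step size (assumed small enough)

for _ in range(500):
    # Local primal step: each node i minimizes f_i(x) + x * (net multiplier),
    # which for quadratic f_i has the closed form x_i = a_i - s_i.
    s = np.zeros_like(a)
    for (i, j), l in lam.items():
        s[i] += l
        s[j] -= l
    x = a - s
    # Dual ascent step: each edge nudges its multiplier toward agreement,
    # using only the two endpoint values -- a fully decentralized update.
    for (i, j) in edges:
        lam[(i, j)] += alpha * (x[i] - x[j])

print(np.round(x, 3))  # nodes converge to consensus at the average of a
```

At optimality all nodes agree on the average of the local data (here 5.5), since the consensus constraint forces a common value and the quadratic objectives are minimized at the mean. Guaranteeing a convergence *rate* for such schemes when the edge set changes over time is the harder question the talk addresses.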

Bio: Jie Lu received her B.S. in Information Engineering from Shanghai Jiao Tong University, China, in 2007 and her Ph.D. in Electrical and Computer Engineering from the University of Oklahoma, USA, in 2011. From 2012 to 2015 she was a postdoctoral researcher with the Automatic Control Lab, ACCESS Linnaeus Centre, at KTH Royal Institute of Technology, Sweden, and with the Department of Signals and Systems at Chalmers University of Technology, Sweden. Since 2015 she has been a tenure-track assistant professor in the School of Information Science and Technology at ShanghaiTech University, China. Her research interests include distributed optimization algorithms, multi-agent decision making, and networked dynamical systems.