Author
Mancino-Ball, Gabriel
Other Contributors
Xu, Yangyang; Mitchell, John; Lai, Rongjie; Chen, Jie
Date Issued
2023-05
Subject
Mathematics
Degree
PhD
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute (RPI), Troy, NY. Copyright of original work retained by author.
Abstract
Distributed optimization has garnered much attention from the machine learning community in recent years due to the ever-growing presence of distributed data and the emphasis on more complex machine learning models, which require vast amounts of data to train. This work focuses on designing and analyzing first-order methods to solve distributed nonconvex optimization problems over a network of N computing devices (e.g., cell phones, GPUs). Specifically, we propose two general algorithmic frameworks: one for handling deterministic (offline) problems and another for handling stochastic (online) problems. In both cases, we rigorously prove that our frameworks achieve optimal (full or sample) gradient complexity, and in the deterministic setting we further achieve optimal communication complexity.
Description
May 2023; School of Science
Department
Dept. of Mathematical Sciences
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection
Access
Users may download and share copies with attribution in accordance with a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 license. No commercial use or derivatives are permitted without the explicit approval of the author.