First-order methods for large-scale distributed nonconvex optimization
Authors
Mancino-Ball, Gabriel
Issue Date
2023-05
Type
Electronic thesis
Thesis
Language
en_US
Keywords
Mathematics
Abstract
Distributed optimization has garnered much attention from the machine learning community in recent years due to the ever-growing presence of distributed data and the emphasis on more complex machine learning models, which require vast amounts of data to train. This work focuses on designing and analyzing first-order methods to solve distributed nonconvex optimization problems over a network of N computing devices (e.g., cell phones, GPUs). Specifically, we propose two general algorithmic frameworks: one for handling deterministic (offline) problems and another for handling stochastic (online) problems. In both cases, we rigorously prove that our frameworks achieve optimal (full or sample) gradient complexity; in the deterministic setting, we further achieve optimal communication complexity.
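As an illustration of the problem setting only (not of the frameworks proposed in the thesis), the sketch below shows one iteration of generic decentralized gradient descent over N agents, where each agent holds a local objective and communicates with neighbors through a mixing matrix. The names W, X, local_grad, and decentralized_gd_step are hypothetical and introduced here purely for exposition.

import numpy as np

def decentralized_gd_step(X, W, local_grad, step=0.01):
    """One round: each agent averages neighbors' iterates via W,
    then takes a gradient step on its own local objective f_i.

    X          : (N, d) array; row i is agent i's current iterate x_i
    W          : (N, N) doubly stochastic mixing matrix; W[i, j] > 0 only
                 if agents i and j are connected in the network
    local_grad : callable(i, x) -> gradient of f_i at x
    """
    mixed = W @ X  # communication step: consensus averaging with neighbors
    grads = np.stack([local_grad(i, X[i]) for i in range(X.shape[0])])
    return mixed - step * grads  # local first-order update

# Toy usage: N = 4 agents on a ring, each minimizing a local quadratic
# f_i(x) = ||x - t_i||^2 / 2, so the network objective's minimizer is the
# mean of the t_i. With a constant step size, the iterates approach a
# neighborhood of that minimizer.
N, d = 4, 3
rng = np.random.default_rng(0)
targets = rng.standard_normal((N, d))
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
X = np.zeros((N, d))
for _ in range(200):
    X = decentralized_gd_step(X, W, lambda i, x: x - targets[i], step=0.1)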
Description
May 2023
School of Science
Publisher
Rensselaer Polytechnic Institute, Troy, NY