dc.rights.license: Restricted to current Rensselaer faculty, staff and students in accordance with the Rensselaer Standard license. Access inquiries may be directed to the Rensselaer Libraries.
dc.contributor: Bennett, Kristin P.
dc.contributor: Lai, Rongjie
dc.contributor: Wang, Meng
dc.contributor.advisor: Mitchell, John E.
dc.contributor.author: Tan, Jun
dc.date.accessioned: 2022-09-15T19:06:13Z
dc.date.available: 2022-09-15T19:06:13Z
dc.date.issued: 2021-12
dc.identifier.uri: https://hdl.handle.net/20.500.13015/6141
dc.description: December 2021
dc.description: School of Science
dc.description.abstract: Complementarity constraints are used to enforce orthogonality between two decision vectors (or matrices). They arise naturally in mathematical programs for many applications, and also in reformulations of certain non-convex optimization problems. Compared with a convex relaxation, a complementarity reformulation of a non-convex model can often yield a better-quality solution. In this thesis, I explore complementarity reformulations of sparse optimization problems that may simultaneously involve $\ell_0$ and rank minimization. Two applications are studied: robust principal component analysis (RPCA) and the low-rank representation (LRR) problem. For RPCA, I introduce complementarity constraints for both the $\ell_0$ term and the rank function. For the LRR problem, I give complementarity reformulations of both the original model and a low-rank matrix-decomposition-based model. In the original model, the complementarity constraints are imposed on the rank function and the column-wise $\ell_0$ term; in the low-rank decomposition model, they are imposed on both low-rank factor matrices. Numerically, I apply the penalty method to each complementarity-reformulated problem, using the block coordinate descent method to solve the resulting sequence of penalty subproblems. Global (whole-sequence) convergence and local convergence (to global optimality) are established. The effectiveness and efficiency of my methods are demonstrated by comparison with convex relaxation methods: my method tends to achieve a higher rate of successful recovery. Moreover, for the LRR problem, solving the low-rank matrix-decomposition-based model saves computing time compared with solving the original model via the complementarity reformulation.
dc.language: ENG
dc.language.iso: en_US
dc.publisher: Rensselaer Polytechnic Institute, Troy, NY
dc.relation.ispartof: Rensselaer Theses and Dissertations Online Collection
dc.subject: Applied mathematics
dc.title: Complementarity formulation for sparse optimization
dc.type: Electronic thesis
dc.type: Thesis
dc.date.updated: 2022-09-15T19:06:16Z
dc.rights.holder: This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute (RPI), Troy, NY. Copyright of original work retained by author.
dc.description.degree: PhD
dc.relation.department: Dept. of Mathematical Sciences
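
The penalty/BCD scheme described in the abstract can be illustrated on the simplest member of this problem family. Below is a minimal sketch (my own illustration, not the thesis code) for the standard complementarity reformulation of an $\ell_0$-constrained least-squares problem: the constraint $\|x\|_0 \le k$ is replaced by an auxiliary $y \in [0,1]^n$ with $x_i y_i = 0$ and $\mathbf{1}^\top y \ge n-k$, and the complementarity condition is penalized by $\rho \sum_i y_i x_i^2$ with an increasing penalty parameter $\rho$. The function name and all parameter choices here are hypothetical.

```python
# Sketch: penalty method + block coordinate descent (BCD) for the
# complementarity reformulation of  min ||Ax - b||^2  s.t.  ||x||_0 <= k.
# Reformulation: introduce y in [0,1]^n with x_i * y_i = 0, sum(y) >= n - k,
# and penalize complementarity violation by rho * sum_i y_i * x_i^2.
import numpy as np

def bcd_l0_penalty(A, b, k, rho=1.0, rho_growth=4.0, iters=20):
    m, n = A.shape
    y = np.ones(n)  # y_i = 1 pushes x_i toward 0
    for _ in range(iters):
        # x-block: ridge-like subproblem, solved in closed form:
        #   (A^T A + rho * diag(y)) x = A^T b
        x = np.linalg.solve(A.T @ A + rho * np.diag(y), A.T @ b)
        # y-block: minimize sum_i y_i * x_i^2 s.t. sum(y) >= n - k, 0 <= y <= 1;
        # optimal: y_i = 1 on the n - k smallest |x_i|, else 0
        y = np.zeros(n)
        y[np.argsort(np.abs(x))[: n - k]] = 1.0
        rho *= rho_growth  # drive the complementarity violation to zero
    x[y == 1.0] = 0.0  # enforce exact complementarity on the final support
    return x

# Usage: recover a 2-sparse signal from noiseless measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 8))
x_true = np.zeros(8)
x_true[[1, 5]] = [3.0, -2.0]
b = A @ x_true
x_hat = bcd_l0_penalty(A, b, k=2)
print(np.nonzero(np.abs(x_hat) > 1e-6)[0])  # support of the recovered x
```

The y-update exploits the fact that, with $x$ fixed, minimizing $\sum_i y_i x_i^2$ over the constraint set simply places $y_i = 1$ on the $n-k$ smallest entries of $|x|$; this closed-form block solve is what makes BCD cheap per iteration.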

