
dc.rights.license: Restricted to current Rensselaer faculty, staff and students. Access inquiries may be directed to the Rensselaer Libraries.
dc.contributor: Xia, Lirong
dc.contributor: Magdon-Ismail, Malik
dc.contributor: Zaki, Mohammed J., 1971-
dc.contributor.author: Piech, Peter D.
dc.date.accessioned: 2021-11-03T08:35:15Z
dc.date.available: 2021-11-03T08:35:15Z
dc.date.created: 2016-06-13T11:14:35Z
dc.date.issued: 2016-05
dc.identifier.uri: https://hdl.handle.net/20.500.13015/1663
dc.description: May 2016
dc.description: School of Science
dc.description.abstract: A primary goal of many voting systems is to generate an aggregate ranking over a set of candidates or alternatives from the preferences of individual agents or voters. The Plackett-Luce model is one of the most studied models for statistically describing discrete-choice ordinal preferences and summarizing rank data in the machine learning subarea of rank aggregation, a problem also widely studied in computational social choice. The machine learning community has developed algorithms to efficiently estimate Plackett-Luce model parameters, with wide-ranging real-world applications of rank data in e-commerce and political science, such as meta-search engines, consumer product rankings, and presidential elections. In machine learning tasks, a mixture of models can sometimes fit the data more closely than a single model alone, and so, naturally, mixtures of Plackett-Luce models can confer the same benefits for rank data.
dc.description.abstract: A major obstacle in learning the parameters of mixture models is the identifiability of the models, which is necessary to make correct, meaningful inferences from the learned parameters. Without identifiability, it becomes impossible even to estimate the parameters in certain cases. Using breakthrough results on the identifiability of mixtures of Plackett-Luce models, we propose an efficient generalized method of moments (GMM) algorithm to learn mixtures of Plackett-Luce models and compare it to an existing expectation-maximization (EM) algorithm. We outline the overall GMM approach and the selection of the moment conditions our algorithm uses to estimate the ground-truth parameters. Next, we describe the design and implementation details of our GMM algorithm and present both theory and experiments showing it to be significantly faster than the EM algorithm while achieving competitive statistical efficiency. Finally, we discuss the implications of the identifiability results and our algorithm for future work on learning mixtures of Plackett-Luce models from big rank data.
dc.language.iso: ENG
dc.publisher: Rensselaer Polytechnic Institute, Troy, NY
dc.relation.ispartof: Rensselaer Theses and Dissertations Online Collection
dc.subject: Computer science
dc.title: Generalized method of moments algorithm for learning mixtures of Plackett-Luce models
dc.type: Electronic thesis
dc.type: Thesis
dc.digitool.pid: 177226
dc.digitool.pid: 177227
dc.digitool.pid: 177228
dc.rights.holder: This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.
dc.description.degree: MS
dc.relation.department: Dept. of Computer Science
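For readers unfamiliar with the Plackett-Luce model named in the abstract, the following is a minimal illustrative sampler of the model's generative process, not code from the thesis; the function name and example weights are hypothetical. A ranking is built position by position, picking each next candidate with probability proportional to its positive weight among the candidates not yet placed.

```python
import random

def sample_ranking(gamma):
    """Draw one ranking from a Plackett-Luce model with positive weights gamma.

    Candidates are indexed 0..len(gamma)-1. At each position, a remaining
    candidate is selected with probability proportional to its weight,
    then removed from the pool (sampling without replacement).
    """
    remaining = list(range(len(gamma)))
    ranking = []
    while remaining:
        weights = [gamma[c] for c in remaining]
        pick = random.choices(remaining, weights=weights, k=1)[0]
        remaining.remove(pick)
        ranking.append(pick)
    return ranking

# Example: candidate 0 has the largest weight, so it tends to rank first.
ranking = sample_ranking([0.5, 0.3, 0.2])
```

A mixture of Plackett-Luce models, as studied in the thesis, would first draw a component according to the mixing proportions and then sample a ranking from that component's weight vector.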

