Neural network architectures for short text

Authors
Elliott, Dylan
Other Contributors
Zaki, Mohammed J., 1971-
Stewart, Charles V.
Gittens, Alex
Issue Date
2017-12
Keywords
Computer science
Degree
MS
Terms of Use
Attribution-NonCommercial-NoDerivs 3.0 United States
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.
Abstract
This thesis presents a comparison of four commonly used neural network models for learning to classify and encode short text sequences. We first evaluate the models on a supervised classification task over three short text datasets. The results suggest that performance depends on a combination of the model architecture and the complexity of the features to be learned. We then train each model on a semi-supervised learning task with a K-means clustering objective for one of the short text datasets, after which we encode the dataset with the trained models and cluster the encoded representations. The clustering results reveal that a model's performance on the classification task does not necessarily correlate positively with its performance on the semi-supervised task, and we relate these observations to data on each model's behavior during learning. Overall, we find that a model that does not separate its feature representations too quickly may have a better chance of clustering well, because it retains a greater ability to correct initial alignment mistakes. These insights provide guidance for future work in which more complex models will be used and knowledge bases will be constructed from raw text scraped from the web.
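The abstract describes an evaluation pipeline in which a trained model encodes short texts and the encodings are clustered with K-means and scored against class labels. The following is a minimal sketch of that pipeline in Python with scikit-learn, assuming a stand-in random-projection encoder and synthetic toy data in place of the thesis's trained neural models and datasets; the function names and parameters here are illustrative, not the author's code.

```python
# Sketch: encode short texts, cluster the encodings with K-means,
# and score cluster quality against gold labels (NMI).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

rng = np.random.default_rng(0)

def encode(bow_vectors, dim=64):
    """Stand-in encoder: project bag-of-words vectors to a dense space.
    In the thesis this role is played by a trained neural network's
    learned representation, not a random projection."""
    projection = rng.standard_normal((bow_vectors.shape[1], dim))
    return bow_vectors @ projection

# Toy data: 200 "documents" as bag-of-words rows with 4 true classes.
n_docs, vocab, n_classes = 200, 500, 4
labels = rng.integers(0, n_classes, size=n_docs)
docs = rng.poisson(0.05, size=(n_docs, vocab)).astype(float)
docs[np.arange(n_docs), labels] += 3.0  # inject a class-dependent signal

# Encode the documents, cluster the encodings, and evaluate the clusters.
encodings = encode(docs)
clusters = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(encodings)
print("NMI:", normalized_mutual_info_score(labels, clusters))
```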
Description
December 2017
School of Science
Department
Dept. of Computer Science
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection
Access
CC BY-NC-ND. Users may download and share copies with attribution in accordance with a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 License. No commercial use or derivatives are permitted without the explicit approval of the author.