Neural network architectures for short text
Authors
Elliott, Dylan
Issue Date
2017-12
Type
Electronic thesis
Thesis
Language
ENG
Keywords
Computer science
Abstract
This thesis presents a comparison of four commonly used neural network models for learning to classify and encode short text sequences. We first evaluate the models on a supervised classification task over three short text datasets. The results suggest that performance depends on both the model architecture and the complexity of the features to be learned. We then train each model on a semi-supervised learning task with a K-means clustering objective for one of the short text datasets, after which we encode the dataset with the trained models and cluster the encoded representations. The clustering results reveal that a model's performance on the classification task does not necessarily correlate positively with its performance on the semi-supervised task, and we relate these observations to data about each model's behavior during learning. Overall, we find that a model which avoids separating its feature representations too sharply early in training may have a better chance at clustering, because it retains the ability to correct initial alignment mistakes. These insights provide guidance for future work in which more complex models will be used and knowledge bases will be constructed from raw text scraped from the web.
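The pipeline the abstract describes (encode each short text as a fixed-length vector with a trained model, then cluster the encodings with K-means) can be sketched as follows. This is an illustrative toy only: the synthetic "encodings" and the deterministic farthest-point initialization are assumptions for the sketch, not the thesis's actual models, data, or training objective.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Cluster row vectors of X into k groups with Lloyd's algorithm."""
    # Farthest-point initialization: deterministic and spreads the initial
    # centroids apart (a simplification of k-means++; an assumption here).
    centers = [X[0]]
    for _ in range(k - 1):
        dists = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dists)])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each encoding to its nearest centroid.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # Recompute centroids; keep the old centroid if a cluster empties.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy "encoded" dataset standing in for model outputs: two well-separated
# Gaussian blobs in R^8, 20 points each (hypothetical, not thesis data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 8)),
               rng.normal(3.0, 0.1, (20, 8))])
labels = kmeans(X, k=2)
```

In the thesis's setting, `X` would instead be the encodings produced by one of the trained networks, and the quality of the resulting partition would be compared against the dataset's class labels.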
Description
December 2017
School of Science
Publisher
Rensselaer Polytechnic Institute, Troy, NY