Semi-supervised sequence learning

AM Dai, QV Le - Advances in neural information processing …, 2015 - proceedings.neurips.cc
Abstract
We present two approaches to use unlabeled data to improve Sequence Learning with recurrent networks. The first approach is to predict what comes next in a sequence, which is a language model in NLP. The second approach is to use a sequence autoencoder, which reads the input sequence into a vector and predicts the input sequence again. These two algorithms can be used as a "pretraining" algorithm for a later supervised sequence learning algorithm. In other words, the parameters obtained from the pretraining step can then be used as a starting point for other supervised training models. In our experiments, we find that long short-term memory recurrent networks, after being pretrained with the two approaches, become more stable to train and generalize better. With pretraining, we were able to achieve strong performance in many classification tasks, such as text classification with IMDB, DBpedia, or image recognition in CIFAR-10.
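To make the sequence-autoencoder idea concrete, the following is a minimal sketch (an illustrative assumption, not the authors' code) in PyTorch: an encoder LSTM reads the token sequence into its final hidden state, a decoder LSTM conditioned on that state is trained to reproduce the input sequence with teacher forcing, and the pretrained encoder weights can then initialize a supervised LSTM classifier.

```python
# Hypothetical sketch of sequence-autoencoder pretraining (not the paper's implementation).
import torch
import torch.nn as nn

class SeqAutoencoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # Encoder reads the whole input sequence into its final (h, c) state.
        _, state = self.encoder(self.embed(tokens))
        # Decoder starts from that state and, with teacher forcing,
        # predicts each input token from the preceding ones.
        dec_out, _ = self.decoder(self.embed(tokens[:, :-1]), state)
        return self.out(dec_out)

# Unsupervised pretraining step: reconstruct the (unlabeled) input sequences.
vocab_size = 10000
model = SeqAutoencoder(vocab_size)
tokens = torch.randint(0, vocab_size, (32, 50))   # a batch of unlabeled token sequences
logits = model(tokens)
targets = tokens[:, 1:]                           # shifted targets for teacher forcing
loss = nn.CrossEntropyLoss()(logits.reshape(-1, vocab_size), targets.reshape(-1))
loss.backward()

# After pretraining, the embedding and encoder LSTM weights would be copied into
# a supervised LSTM classifier as its starting point, as the paper proposes.
```

The language-model variant of pretraining is the same recipe without the encoder: a single LSTM is trained to predict the next token of unlabeled text, and its weights then initialize the supervised model.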