The fixed-size ordinally-forgetting encoding method for neural network language models

S Zhang, H Jiang, M Xu, J Hou… - Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics, 2015 - aclanthology.org
Abstract
In this paper, we propose the new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE can model the word order in a sequence using a simple ordinally-forgetting mechanism according to the positions of words. In this work, we have applied FOFE to feedforward neural network language models (FNN-LMs). Experimental results have shown that without using any recurrent feedback, FOFE-based FNN-LMs can significantly outperform not only the standard fixed-input FNN-LMs but also the popular recurrent neural network (RNN) LMs.
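The abstract does not spell out the encoding itself; the FOFE recursion from the paper is z_t = α · z_{t-1} + e_t, where e_t is the one-hot vector of the word at position t and α ∈ (0, 1) is a forgetting factor. Below is a minimal sketch of that recursion; the function name fofe_encode, the toy vocabulary, and the choice α = 0.7 are illustrative assumptions, not from the paper.

```python
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """Encode a variable-length sequence of word ids into one fixed-size
    vector via the FOFE recursion z_t = alpha * z_{t-1} + e_t, where e_t
    is the one-hot vector of the word at position t."""
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z   # discount all earlier positions (ordinal forgetting)
        z[w] += 1.0     # add the one-hot encoding of the current word
    return z

# Example over a toy vocabulary {A: 0, B: 1, C: 2}: the same bag of words
# in a different order yields a different encoding.
print(fofe_encode([0, 1, 2], vocab_size=3))  # "A B C" -> [0.49, 0.7, 1.0]
print(fofe_encode([2, 1, 0], vocab_size=3))  # "C B A" -> [1.0, 0.7, 0.49]
```

Because earlier positions are discounted by higher powers of α, reversing the sequence permutes the weights, which is how the fixed-size vector preserves word order; the paper shows the encoding is unique for α ∈ (0, 0.5] and almost unique for α ∈ (0.5, 1).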