SI Yujing, LI Ta, PAN Jielin, YAN Yonghong. A Prefix Tree Based n-best List Re-scoring Strategy for Recurrent Neural Network Language Model[J]. Chinese Journal of Electronics, 2014, 23(1): 70-74.

A Prefix Tree Based n-best List Re-scoring Strategy for Recurrent Neural Network Language Model

  • In this paper, we explore ways to speed up the Recurrent neural network language model (RNNLM) in the testing phase, so that RNNLMs can re-rank large n-best lists in real-time systems and thereby obtain better performance. A new n-best list re-scoring framework, Prefix tree based n-best list re-scoring (PTNR), is proposed to completely eliminate the repeated computations that make n-best list re-scoring inefficient. In addition, the bunch mode technique, widely used to speed up the training of Feed-forward neural network language models (FF-NNLMs), is combined with PTNR to further improve the speed. Experimental results show that our approach is much faster than basic n-best list re-scoring; for a 1000-best list, it is almost 11 times faster than basic re-scoring.
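The core idea of PTNR — sharing computation across hypotheses that begin with the same words — can be illustrated with a minimal sketch. The trie structure and counting below are illustrative assumptions, not the paper's implementation: hypotheses in an n-best list are inserted into a prefix tree, so each distinct prefix word corresponds to one trie node and would need only one RNN forward step, instead of one step per word per hypothesis.

```python
# Minimal sketch (hypothetical, not the paper's code): a prefix tree over an
# n-best list. Each trie node stands for one RNN evaluation; hypotheses
# sharing a prefix reuse the same nodes, eliminating repeated computation.

class TrieNode:
    def __init__(self):
        self.children = {}  # word -> TrieNode

def build_prefix_tree(nbest):
    """Insert every hypothesis (a whitespace-split sentence) into a trie."""
    root = TrieNode()
    for hyp in nbest:
        node = root
        for word in hyp.split():
            node = node.children.setdefault(word, TrieNode())
    return root

def count_nodes(node):
    """Number of trie nodes below `node` = RNN steps needed with sharing."""
    return sum(1 + count_nodes(c) for c in node.children.values())

nbest = ["the cat sat", "the cat ran", "the dog sat"]
total_words = sum(len(h.split()) for h in nbest)        # naive re-scoring: 9 steps
shared_evals = count_nodes(build_prefix_tree(nbest))    # with prefix sharing: 6 steps
print(total_words, shared_evals)  # → 9 6
```

The gap between the two counts grows with the list size, since hypotheses in a large n-best list overlap heavily; this is why the speed-up reported above becomes substantial at 1000-best.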
