
Combination of Recurrent Neural Networks and Factored Language Models for Code-Switching Language Modeling

Heike Adel, Ngoc Thang Vu and Tanja Schultz

The 51st Annual Meeting of the Association for Computational Linguistics - Short Papers (ACL Short Papers 2013)
Sofia, Bulgaria, August 4-9, 2013


Abstract

In this paper, we investigate the application of recurrent neural networks (RNN) and factored language models (FLM) to the task of language modeling for Code-Switching speech. We present a way to integrate part-of-speech tags (POS) and language information (LID) into these models, which leads to significant improvements in terms of perplexity. Furthermore, we compare RNNLMs and FLMs and provide a detailed analysis of perplexities on the different backoff levels. Finally, we show that recurrent neural networks and factored language models can be combined using linear interpolation to achieve the best performance. The final combined language model provides a 37.8% relative improvement in terms of perplexity on the SEAME development set and a 32.7% relative improvement on the evaluation set compared to the traditional n-gram LM.
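A common way to integrate factors such as POS tags and LID into an RNNLM is to append a feature vector to the network's input layer alongside the one-hot word vector. The following is a minimal sketch of a single forward step under that assumption; all names, matrix shapes, and the specific wiring are illustrative and not taken from the paper.

    import numpy as np

    def rnn_step(w_onehot, feat, h_prev, U, F, W, V):
        # Hidden state combines the current word, the factor
        # features (e.g. POS + LID), and the recurrent state.
        h = np.tanh(U @ w_onehot + F @ feat + W @ h_prev)
        # Softmax over the vocabulary gives P(w_{t+1} | history).
        logits = V @ h
        exp = np.exp(logits - logits.max())
        return h, exp / exp.sum()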
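The model combination itself is plain linear interpolation of the two models' per-word probabilities, P(w|h) = lam * P_RNN(w|h) + (1 - lam) * P_FLM(w|h). Below is a minimal sketch of how the perplexity of such an interpolated model could be computed; the function name and the interpolation weight lam are illustrative assumptions, as the paper's tuned weight is not given here.

    import math

    def interpolated_perplexity(probs_rnn, probs_flm, lam=0.5):
        # probs_rnn / probs_flm: probabilities each model assigns
        # to the same test-word sequence, token by token.
        log_sum = 0.0
        for p_rnn, p_flm in zip(probs_rnn, probs_flm):
            log_sum += math.log(lam * p_rnn + (1.0 - lam) * p_flm)
        # Perplexity = exp of the negative average log-probability.
        return math.exp(-log_sum / len(probs_rnn))

    # e.g. interpolated_perplexity([0.10, 0.20], [0.05, 0.30], lam=0.6)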

