- Title
A Neural Syntactic Language Model.
- Authors
Ahmad Emami; Frederick Jelinek
- Abstract
This paper presents a study of using neural probabilistic models in a syntax-based language model. The neural probabilistic model makes use of a distributed representation of the items in the conditioning history, and is powerful in capturing long dependencies. Employing neural network based models in the syntax-based language model enables it to efficiently use the large amount of information available in a syntactic parse when estimating the next word in a string. Several scenarios for integrating neural networks into the syntax-based language model are presented, accompanied by derivations of the training procedures involved. Experiments on the UPenn Treebank and the Wall Street Journal corpus show significant improvements in perplexity and word error rate over the baseline SLM. Furthermore, comparisons with standard and neural network based N-gram models with arbitrarily long contexts show that the syntactic information is in fact very helpful in estimating the word string probability. Overall, our neural syntax-based model achieves the best published results in perplexity and WER for the given data sets.
- Publication
Machine Learning, 2005, Vol. 60, Issue 1–3, p. 195
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-005-0916-y