Parallel randomized block coordinate descent for neural probabilistic language model with high-dimensional output targets

Xin Liu, Junchi Yan*, Xiangfeng Wang, Hongyuan Zha

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Training a large probabilistic neural network language model with a typically high-dimensional output is excessively time-consuming, which is one of the main reasons that simpler models such as n-grams are often preferred despite their inferior performance. In this paper, a Chinese neural probabilistic language model is trained on the Fudan Chinese Language Corpus. Since hundreds of thousands of distinct words are tokenized from the raw corpus, the model contains tens of millions of parameters. To address this challenge, the popular cluster-based parallel computing platform MPI (Message Passing Interface) is employed to implement the parallel neural network language model. Specifically, we propose a new method, termed Parallel Randomized Block Coordinate Descent (PRBCD), to train this model cost-effectively. Unlike traditional coordinate descent methods, our method can be applied to networks with multiple layers, scaling up the gradients with respect to hidden units proportionally based on the sampled parameters. We empirically show that PRBCD is stable and well suited for language models, which contain only a few layers yet often have a large number of parameters and extremely high-dimensional output targets.
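For intuition, the sketch below illustrates one flavor of randomized block coordinate descent on a softmax output layer: each step samples a random block of output coordinates, computes the gradient only for that block, and rescales it by the inverse sampling fraction. The names (V, H, block_size, lr), the block-restricted softmax, and the exact scaling rule are illustrative assumptions for this sketch, not the paper's implementation; in the paper's setting the blocks would additionally be distributed across MPI workers.

```python
import numpy as np

rng = np.random.default_rng(0)

V, H = 50_000, 128                 # vocabulary size (output dim), hidden size
W = rng.normal(0, 0.01, (V, H))    # output-layer weights, one row per word
b = np.zeros(V)                    # output-layer biases

def rbcd_step(h, target, block_size=512, lr=0.1):
    """One randomized block coordinate update for a single example.

    h      : hidden activation for the current context, shape (H,)
    target : index of the true next word
    """
    # Sample a random block of output coordinates; always include the
    # target so its gradient is exact. union1d returns a sorted array.
    block = rng.choice(V, size=block_size, replace=False)
    block = np.union1d(block, [target])

    # Softmax restricted to the sampled block (an approximation of the
    # full softmax over V outputs).
    logits = W[block] @ h + b[block]
    logits -= logits.max()
    p = np.exp(logits)
    p /= p.sum()

    # Cross-entropy gradient w.r.t. the block logits.
    grad = p.copy()
    grad[np.searchsorted(block, target)] -= 1.0

    # Scale the block gradient up by the inverse sampling fraction so the
    # partial update stands in for the full-coordinate update; this mirrors
    # the proportional scaling described in the abstract (assumed form).
    scale = V / block.size
    W[block] -= lr * scale * np.outer(grad, h)
    b[block] -= lr * scale * grad

# Example: one update for a random hidden vector and target word.
rbcd_step(rng.normal(0, 1.0, H), target=123)
```

Because each step touches only block_size of the V output rows, the per-step cost drops from O(V * H) to O(block_size * H), which is what makes the approach attractive for extremely high-dimensional output targets.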

Original language: English
Title of host publication: Pattern Recognition - 7th Chinese Conference, CCPR 2016, Proceedings
Editors: Tieniu Tan, Xilin Chen, Xuelong Li, Jian Yang, Hong Cheng, Jie Zhou
Publisher: Springer Verlag
Pages: 334-348
Number of pages: 15
ISBN (Print): 9789811030048
DOIs
State: Published - 2016

Publication series

Name: Communications in Computer and Information Science
Volume: 663
ISSN (Print): 1865-0929

Keywords

  • Language model
  • Parallel computing
  • Stochastic optimization
