- 2017
Uyghur Morphological Segmentation Based on Bidirectional Gated Recurrent Unit Neural Networks (基于双向门限递归单元神经网络的维吾尔语形态切分)
Abstract:
Information processing of low-resource, morphologically rich languages such as Uyghur is important for meeting the cross-language communication needs of China's One Belt, One Road (B&R) initiative. These languages express grammatical and semantic relations by concatenating morphemes to a root form, so individual words carry rich internal structure and cause severe data sparsity in language processing. This paper presents an approach to Uyghur morphological segmentation based on bidirectional gated recurrent unit (GRU) neural networks, which automatically splits Uyghur words into morpheme sequences and thereby alleviates the data sparsity problem. The bidirectional GRU exploits context on both sides of each position to resolve segmentation ambiguities, and its gating mechanism handles long-distance dependencies. Experimental results show that the approach significantly outperforms mainstream statistical methods such as conditional random fields, as well as unidirectional GRUs. The approach is language-independent and can be applied to other morphologically rich languages.
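The abstract describes the model only at a high level: a character-level bidirectional GRU that labels positions inside a word so the word can be split into morphemes. The sketch below is a minimal illustration of that idea in PyTorch, assuming a hypothetical B/I/E/S boundary-tag scheme and arbitrary hyperparameters; the class name `BiGRUSegmenter`, the `decode` helper, and all dimensions are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

# Hypothetical boundary-tag scheme: B = begins a morpheme, I = inside,
# E = ends a morpheme, S = single-character morpheme.
TAGS = ["B", "I", "E", "S"]

class BiGRUSegmenter(nn.Module):
    """Character-level bidirectional GRU tagger (an illustrative sketch)."""

    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, num_tags=len(TAGS)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # The bidirectional GRU reads the word left-to-right and right-to-left,
        # so each character's hidden state sees context on both sides.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, char_ids):
        # char_ids: (batch, word_length) integer character indices
        hidden, _ = self.gru(self.embed(char_ids))  # (batch, length, 2 * hidden_dim)
        return self.out(hidden)                     # per-character tag scores

def decode(word, tag_ids):
    """Reassemble a morpheme sequence from predicted boundary tags."""
    morphemes, buf = [], ""
    for ch, t in zip(word, tag_ids):
        buf += ch
        if TAGS[t] in ("E", "S"):
            morphemes.append(buf)
            buf = ""
    if buf:
        morphemes.append(buf)
    return morphemes

# Example: one 7-character word over an assumed 64-symbol character vocabulary.
model = BiGRUSegmenter(vocab_size=64)
scores = model(torch.randint(1, 64, (1, 7)))
tags = scores.argmax(dim=-1)[0].tolist()  # greedy per-character tag decisions
```

In such a setup, training would typically minimize per-character cross-entropy against gold boundary tags derived from morphologically segmented words; the paper's actual tag inventory, features, and training details may differ.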