Computer Science 2015
One model, two languages: training bilingual parsers with harmonized treebanks

Abstract: We introduce an approach to train parsers using bilingual corpora obtained by merging harmonized treebanks of different languages, producing parsers that effectively analyze sentences in any of the learned languages, or even sentences that mix both languages. We test the approach on the Universal Dependency Treebanks, training with MaltParser and MaltOptimizer. The results show that these bilingual parsers are more than competitive: some combinations not only preserve performance but achieve significant improvements over the corresponding monolingual parsers. Preliminary experiments also show the approach to be promising on texts with code-switching.
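The core step the abstract describes, merging two harmonized treebanks into one bilingual training corpus, can be sketched as simple concatenation of CoNLL-style sentence blocks. This is a minimal illustration, not the authors' code: the helper names and the toy sentences are hypothetical, and real Universal Dependency files carry ten tab-separated columns per token.

```python
def read_conll(text):
    """Split CoNLL-style text into sentence blocks (blank-line separated)."""
    sentences, current = [], []
    for line in text.splitlines():
        if line.strip():
            current.append(line)
        elif current:
            sentences.append(current)
            current = []
    if current:
        sentences.append(current)
    return sentences


def merge_treebanks(*treebanks):
    """Concatenate treebanks (lists of sentence blocks).

    Because the treebanks are harmonized (shared POS tags and dependency
    labels), plain concatenation yields a corpus a single parser can be
    trained on for both languages.
    """
    merged = []
    for tb in treebanks:
        merged.extend(tb)
    return merged


# Toy example: one English and one Spanish sentence (columns simplified).
english = "1\tHello\tINTJ\n"
spanish = "1\tHola\tINTJ\n"
bilingual = merge_treebanks(read_conll(english), read_conll(spanish))
```

The merged corpus would then be written back to a single CoNLL file and passed to MaltParser's training procedure in the usual way, exactly as with a monolingual treebank.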