
Multi-Level Cross-Lingual Attentive Neural Architecture for Low Resource Name Tagging (cited by 2)

Abstract: Neural networks have been widely used for English name tagging and have delivered state-of-the-art results. However, for low resource languages, taggers tend to perform worse than their English counterparts due to limited resources and a lack of training data. In this paper, we tackle this challenging issue by incorporating multi-level cross-lingual knowledge as attention into a neural architecture, which guides low resource name tagging to achieve better performance. Specifically, we regard entity type distribution as language independent and use bilingual lexicons to bridge the cross-lingual semantic mapping. Then, we jointly apply word-level cross-lingual mutual influence and entity-type level monolingual word distributions to enhance low resource name tagging. Experiments on three languages demonstrate the effectiveness of this neural architecture: for Chinese, Uzbek, and Turkish, we are able to yield significant improvements in name tagging over all previous baselines.
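
The abstract describes word-level cross-lingual attention in which bilingual-lexicon translations of low resource words provide extra evidence for tagging. Below is a minimal, illustrative sketch of that idea, assuming PyTorch; the class name CrossLingualAttentiveTagger, the toy dimensions, and the omission of the entity-type level attention and the CRF layer are simplifications for clarity and do not reproduce the authors' implementation.

    # Illustrative sketch only (not the paper's code): word-level cross-lingual
    # attention over lexicon-translated embeddings, feeding a per-token tag classifier.
    import torch
    import torch.nn as nn

    class CrossLingualAttentiveTagger(nn.Module):  # hypothetical name
        def __init__(self, vocab_size, emb_dim, hidden_dim, num_tags):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)            # low-resource word embeddings
            self.encoder = nn.LSTM(emb_dim, hidden_dim // 2,
                                   batch_first=True, bidirectional=True)
            self.attn_proj = nn.Linear(hidden_dim, emb_dim)           # map encoder states into the embedding space
            self.out = nn.Linear(hidden_dim + emb_dim, num_tags)      # tag scores (CRF layer omitted)

        def forward(self, word_ids, translated_emb):
            # word_ids: (batch, seq_len) low-resource token ids
            # translated_emb: (batch, seq_len, emb_dim) embeddings of bilingual-lexicon translations
            states, _ = self.encoder(self.embed(word_ids))            # (batch, seq_len, hidden_dim)
            queries = self.attn_proj(states)                          # (batch, seq_len, emb_dim)
            scores = torch.bmm(queries, translated_emb.transpose(1, 2))
            weights = torch.softmax(scores, dim=-1)                   # word-level cross-lingual attention
            context = torch.bmm(weights, translated_emb)              # attended cross-lingual evidence
            return self.out(torch.cat([states, context], dim=-1))     # per-token tag logits

    # Toy usage with random tensors.
    model = CrossLingualAttentiveTagger(vocab_size=100, emb_dim=16, hidden_dim=32, num_tags=9)
    logits = model(torch.randint(0, 100, (2, 7)), torch.randn(2, 7, 16))
    print(logits.shape)  # torch.Size([2, 7, 9])

In the paper, this word-level signal is combined with entity-type level monolingual word distributions; the sketch keeps only the attention step to show how lexicon translations can influence each token's tag decision.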
Source: Tsinghua Science and Technology (SCIE, EI, CAS, CSCD), 2017, Issue 6, pp. 633-645 (13 pages).
Funding: Supported by the National High-Tech Development (863) Program of China (No. 2015AA015407) and the National Natural Science Foundation of China (Nos. 61632011 and 61370164).
Keywords: name tagging; deep learning; recurrent neural network; cross-lingual; information extraction

