HFCAS OpenIR
An ERNIE-Based Joint Model for Chinese Named Entity Recognition
Wang, Yu1,2; Sun, Yining1,2; Ma, Zuchang1; Gao, Lisheng1; Xu, Yang1
2020-08-01
Journal: APPLIED SCIENCES-BASEL
Corresponding Author: Sun, Yining (ynsun@iim.ac.cn)
Abstract: Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP) and the initial step in building a Knowledge Graph (KG). Recently, BERT (Bidirectional Encoder Representations from Transformers), a pre-training model, has achieved state-of-the-art (SOTA) results on various NLP tasks, including NER. However, Chinese NER remains challenging for BERT because there are no explicit separators between Chinese words, so BERT can only obtain representations of individual Chinese characters. Character-level representations alone are insufficient for Chinese NER, because the meaning of a Chinese word can differ substantially from that of the characters that make it up. ERNIE (Enhanced Representation through kNowledge IntEgration), an improved pre-training model based on BERT, is better suited to Chinese NER because it is designed to learn language representations enhanced by a knowledge masking strategy. However, the potential of ERNIE has not been fully explored: when performing NER, it uses only token-level features and ignores sentence-level features. In this paper, we propose ERNIE-Joint, a joint model based on ERNIE. ERNIE-Joint exploits both sentence-level and token-level features by jointly training the NER and text classification tasks. To use raw NER datasets for joint training without additional annotation, we define the text classification labels according to the number of entities in each sentence. Experiments are conducted on two datasets, MSRA-NER and Weibo, which contain Chinese news data and Chinese social media data, respectively. The results demonstrate that ERNIE-Joint not only outperforms BERT and ERNIE but also achieves SOTA results on both datasets.
Keywords: joint training; named entity recognition; pre-training models; ERNIE; BERT
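The joint training scheme described in the abstract (a shared pre-trained encoder, a token-level NER head, and a sentence-level classification head whose labels are derived from the entity count, so no extra annotation is needed) can be illustrated with a minimal sketch. This is an illustrative approximation rather than the authors' released code: the encoder checkpoint name, the entity-count bucket boundaries, and the equally weighted loss sum are assumptions.

```python
# Minimal sketch of a joint NER + sentence-classification model, loosely
# following the ERNIE-Joint idea in the abstract. Checkpoint name, label
# bucketing, and 1:1 loss weighting are assumptions, not the paper's settings.
import torch
import torch.nn as nn
from transformers import AutoModel


class JointNERClassifier(nn.Module):
    def __init__(self, encoder_name: str, num_ner_tags: int, num_sent_classes: int):
        super().__init__()
        # Shared pre-trained encoder (e.g. an ERNIE or BERT checkpoint).
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.ner_head = nn.Linear(hidden, num_ner_tags)       # token-level head
        self.cls_head = nn.Linear(hidden, num_sent_classes)   # sentence-level head
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, attention_mask, ner_labels=None, sent_labels=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_repr = out.last_hidden_state                    # (batch, seq_len, hidden)
        ner_logits = self.ner_head(token_repr)                # per-token tag scores
        cls_logits = self.cls_head(token_repr[:, 0])          # [CLS] token as sentence feature
        loss = None
        if ner_labels is not None and sent_labels is not None:
            ner_loss = self.loss_fn(ner_logits.view(-1, ner_logits.size(-1)),
                                    ner_labels.view(-1))
            cls_loss = self.loss_fn(cls_logits, sent_labels)
            loss = ner_loss + cls_loss                        # joint training objective
        return loss, ner_logits, cls_logits


def entity_count_label(bio_tags, num_classes=3):
    """Derive a sentence-level class from the number of entities in the BIO tags
    (bucket scheme is a hypothetical choice: 0, 1, or 2+ entities)."""
    n = sum(1 for t in bio_tags if t.startswith("B-"))
    return min(n, num_classes - 1)
```

In this sketch the sentence-level label is computed directly from the existing NER annotations via entity_count_label, which is how the raw NER datasets can drive both tasks without any additional labeling effort.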
DOI: 10.3390/app10165711
Indexed by: SCI
Language: English
Funding Project: Major special project of Anhui Science and Technology Department [18030801133]; Science and Technology Service Network Initiative [KFJ-STS-ZDTP-079]
Funding Organization: Major special project of Anhui Science and Technology Department; Science and Technology Service Network Initiative
WOS Research Area: Chemistry; Engineering; Materials Science; Physics
WOS Subject: Chemistry, Multidisciplinary; Engineering, Multidisciplinary; Materials Science, Multidisciplinary; Physics, Applied
WOS Accession Number: WOS:000567143300001
Publisher: MDPI
Citation Statistics
Times Cited: 16 (WOS)
Document Type: Journal Article
Identifier: http://ir.hfcas.ac.cn:8080/handle/334002/104067
Collection: Hefei Institutes of Physical Science, Chinese Academy of Sciences
Affiliations:
1. Chinese Acad Sci, Inst Intelligent Machines, Hefei Inst Phys Sci, AnHui Prov Key Lab Med Phys & Technol, Hefei 230031, Peoples R China
2. Univ Sci & Technol China, Grad Sch, Sci Isl Branch, Hefei 230026, Peoples R China
Recommended Citation:
GB/T 7714: Wang, Yu, Sun, Yining, Ma, Zuchang, et al. An ERNIE-Based Joint Model for Chinese Named Entity Recognition[J]. APPLIED SCIENCES-BASEL, 2020, 10.
APA: Wang, Yu, Sun, Yining, Ma, Zuchang, Gao, Lisheng, & Xu, Yang. (2020). An ERNIE-Based Joint Model for Chinese Named Entity Recognition. APPLIED SCIENCES-BASEL, 10.
MLA: Wang, Yu, et al. "An ERNIE-Based Joint Model for Chinese Named Entity Recognition". APPLIED SCIENCES-BASEL 10 (2020).
Files in This Item:
No files associated with this item.