End-to-End Chinese Entity Recognition Based on BERT-BiLSTM-ATT-CRF

LI Daiyi1, TU Yaofeng2, ZHOU Xiangsheng2, ZHANG Yangming2, MA Zongmin1
(1. Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China;
2. ZTE Corporation, Nanjing 210012, China)

Abstract: Traditional named entity recognition methods require professional domain knowledge and extensive manual feature engineering, while existing neural-network-based Chinese named entity recognition methods produce character vector representations that are too singular. To solve these problems, we propose a Chinese named entity recognition method based on the BERT-BiLSTM-ATT-CRF model. First, the bidirectional encoder representations from transformers (BERT) pre-trained language model is used to obtain a semantic vector for each character according to its context. Second, the character vectors trained by BERT are fed into a bidirectional long short-term memory network with an embedded attention mechanism (BiLSTM-ATT) to capture the most important semantic information in the sentence. Finally, a conditional random field (CRF) learns the dependencies between adjacent tags to obtain the globally optimal sentence-level tag sequence. Experimental results show that the proposed model achieves state-of-the-art performance on both the MSRA corpus and the People's Daily corpus, with F1 scores of 94.77% and 95.97% respectively.

Keywords: named entity recognition (NER); feature extraction; BERT model; BiLSTM; attention mechanism; CRF
