Article type: Research Article
Authors: Qin, Zhi* | Liu, Enyang | Zhang, Shibin | Chang, Yan | Yan, Lili
Affiliations: School of Cybersecurity, Chengdu University of Information Technology, Chengdu, Sichuan, China
Correspondence: [*] Corresponding author. Dr. Zhi Qin, School of Cybersecurity, Chengdu University of Information Technology, Chengdu, Sichuan, China. E-mail: mercyqz@cuit.edu.cn.
Abstract: Word segmentation errors and polysemy are common problems in Chinese relation extraction. Character-based input avoids segmentation errors, but recovering the word-level information of a sentence then usually requires introducing a dictionary or an external knowledge base, which costs considerable manpower and time. To address these problems, this article takes characters as input and combines multiple embedding models to form a character-vector sequence. Features carrying character information are obtained through BiLSTM and attention layers; because convolutional neural networks are good at extracting local features, features carrying word information are obtained through multi-kernel convolutional layers and multi-head self-attention layers. A gating mechanism then fuses the two kinds of features. The model was tested on the public SanWen data set and on our own cultural-travel data set, obtaining F1 scores of 61.22% and 60.26%, respectively. The experimental results show that our method achieves strong relation extraction performance without word segmentation tools and without building a dictionary or external knowledge base, outperforming most commonly used models.
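The abstract names the building blocks (multi-embedding character input, a BiLSTM-plus-attention character branch, a multi-kernel CNN plus multi-head self-attention word branch, and gated fusion) but not their exact wiring. The PyTorch sketch below is one plausible reading, not the paper's implementation: the class name GatedCharWordFusion, all layer sizes, the kernel widths, and the pooling choices are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GatedCharWordFusion(nn.Module):
    # Character branch: BiLSTM over character embeddings, pooled with a
    # simple additive attention. Word branch: parallel convolutions with
    # several (odd, illustrative) kernel widths to capture local n-gram
    # patterns without a segmenter, refined by multi-head self-attention.
    # A sigmoid gate mixes the two sentence-level representations.
    def __init__(self, embed_dim=128, hidden_dim=128,
                 kernel_sizes=(3, 5, 7), num_heads=4):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim // 2,
                              batch_first=True, bidirectional=True)
        self.char_attn = nn.Linear(hidden_dim, 1)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, hidden_dim, k, padding=k // 2)
            for k in kernel_sizes)
        self.proj = nn.Linear(hidden_dim * len(kernel_sizes), hidden_dim)
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)
        self.gate = nn.Linear(hidden_dim * 2, hidden_dim)

    def forward(self, x):                       # x: (batch, seq, embed_dim)
        # Character-level features via BiLSTM + attention pooling.
        h, _ = self.bilstm(x)                   # (batch, seq, hidden_dim)
        w = torch.softmax(self.char_attn(h), dim=1)
        char_feat = (w * h).sum(dim=1)          # (batch, hidden_dim)

        # Word-level (local) features via multi-kernel convolution
        # followed by multi-head self-attention and mean pooling.
        c = torch.cat([torch.relu(conv(x.transpose(1, 2)))
                       for conv in self.convs], dim=1)
        c = self.proj(c.transpose(1, 2))        # (batch, seq, hidden_dim)
        a, _ = self.self_attn(c, c, c)
        word_feat = a.mean(dim=1)               # (batch, hidden_dim)

        # Gated fusion of the two views.
        g = torch.sigmoid(self.gate(torch.cat([char_feat, word_feat], -1)))
        return g * char_feat + (1 - g) * word_feat


# Toy usage: a batch of 2 "sentences", 30 characters each, 128-dim embeddings.
model = GatedCharWordFusion()
print(model(torch.randn(2, 30, 128)).shape)     # torch.Size([2, 128])
```

The character-vector input here would come from concatenating or summing the paper's multiple embedding models; a relation classifier head over the fused feature is likewise omitted for brevity.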
Keywords: Chinese relation extraction, multiple embedded representations, multi-head self-attention, gating mechanism
DOI: 10.3233/JIFS-237391
Journal: Journal of Intelligent & Fuzzy Systems, vol. 46, no. 3, pp. 7093-7107, 2024