Article type: Research Article
Authors: Qiu, Zhiqian [a] | Chen, Fei [b]; * | Ji, Junyu [c]
Affiliations: [a] Tianjin Key Laboratory of Imaging and Sensing Microelectronic Technology, School of Microelectronics, Tianjin University, Tianjin, China | [b] Microelectronics Research Institute, Shenzhen Tsinghua University Research Institute, Shenzhen, Guangdong, China | [c] Shenzhen Zhiting Technology Co., Ltd., Shenzhen, Guangdong, China
Correspondence: [*] Corresponding author: Fei Chen, Microelectronics Research Institute, Shenzhen Tsinghua University Research Institute, Shenzhen, Guangdong 518057, China. E-mail: chenfei@tsinghua-sz.org.
Abstract: Speech enhancement is essential for hearing aids. In recent years, many deep-learning-based speech enhancement methods have been shown to be effective. However, these methods rarely account for limited hardware resources and have difficulty meeting the real-time requirements that are critical for hearing aids. To address these problems, we propose a method that combines beamforming with deep-learning-based speech enhancement. Beamforming is used to filter background noise and reduce the complexity of the noise, and a new filter bank designed for hearing aids is adopted to reduce system complexity. The system was deployed and tested on resource-constrained hearing aids, and its effectiveness was verified in objective experiments using standard evaluation metrics. The results show a current consumption of 8.43 mA, a signal-to-noise ratio improvement of 9.4394 dB, and a PESQ improvement of 0.7350. The objective and subjective results show that the proposed method achieves better noise suppression than previous methods.
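The abstract names the components of the pipeline (a beamforming front end, a filter bank, and a deep-learning enhancement stage) but not their concrete forms, so the Python sketch below is purely illustrative: a delay-and-sum beamformer feeding a small GRU-based spectral mask estimator on STFT magnitudes. The function names, STFT parameters, and network size are assumptions, not the paper's actual filter bank or network.

```python
# Two-stage idea from the abstract (illustrative sketch, not the authors' implementation):
# (1) a multi-microphone beamformer suppresses directional background noise,
# (2) a lightweight recurrent network estimates a per-frequency gain on the beamformed signal.
import numpy as np
import torch
import torch.nn as nn


def delay_and_sum(mics: np.ndarray, delays_s: np.ndarray, fs: int) -> np.ndarray:
    """Align each microphone signal by its steering delay (seconds) and average."""
    n_mics, n_samples = mics.shape
    out = np.zeros(n_samples)
    for m in range(n_mics):
        shift = int(round(delays_s[m] * fs))
        out += np.roll(mics[m], -shift)
    return out / n_mics


class MaskEstimator(nn.Module):
    """Small GRU that predicts a gain in [0, 1] for each time-frequency bin."""

    def __init__(self, n_freq: int = 129, hidden: int = 64):
        super().__init__()
        self.gru = nn.GRU(n_freq, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_freq)

    def forward(self, log_mag: torch.Tensor) -> torch.Tensor:
        h, _ = self.gru(log_mag)            # (batch, frames, hidden)
        return torch.sigmoid(self.out(h))   # (batch, frames, n_freq)


if __name__ == "__main__":
    fs, n_fft, hop = 16000, 256, 128
    window = torch.hann_window(n_fft)
    mics = np.random.randn(2, fs)                     # stand-in for a 2-mic capture
    beamformed = delay_and_sum(mics, np.zeros(2), fs)  # zero delays: broadside steering

    # STFT -> mask -> inverse STFT on the beamformed channel.
    x = torch.from_numpy(beamformed).float()
    spec = torch.stft(x, n_fft, hop, window=window, return_complex=True)  # (freq, frames)
    log_mag = torch.log1p(spec.abs()).T.unsqueeze(0)                      # (1, frames, freq)
    mask = MaskEstimator(n_freq=spec.shape[0])(log_mag)
    enhanced_spec = spec * mask.squeeze(0).T
    enhanced = torch.istft(enhanced_spec, n_fft, hop, window=window, length=x.numel())
    print(enhanced.shape)
```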
Keywords: Hearing aids, feature extraction, recurrent neural network, hardware algorithm implementation, speech enhancement
DOI: 10.3233/JCM-226897
Journal: Journal of Computational Methods in Sciences and Engineering, vol. 23, no. 6, pp. 3239-3254, 2023