Personal Homepage

Personal Information


Associate Professor

Supervisor of Master's Candidates

E-Mail:

Date of Employment: 2025-05-21

School/Department: School of Software

Education Level: Doctoral (Ph.D.) graduate

Business Address: New Main Building C808, G517

Gender: Male

Contact Information: 18810578537

Degree: Ph.D.

Status: Employed

Alma Mater: Beihang University

Discipline: Software Engineering; Computer Science and Technology

Junfan Chen


Paper

A Neural Expectation-Maximization Framework for Noisy Multi-Label Text Classification

Journal: IEEE Transactions on Knowledge and Data Engineering (TKDE), CCF-A
Abstract: Multi-label text classification (MLTC) has a wide range of real-world applications. Neural networks have recently improved the performance of MLTC models. Training these neural-network models relies on sufficient accurately labelled data. However, manually annotating large-scale multi-label text classification datasets is expensive and impractical for many applications. Weak supervision techniques have thus been developed to reduce the cost of annotating text corpora. However, these techniques introduce noisy labels into the training data and may degrade model performance. This paper addresses such noisy-label problems in MLTC in both single-instance and multi-instance settings. We build a novel Neural Expectation-Maximization Framework (nEM) that combines neural networks with probabilistic modelling. The nEM framework produces text representations using neural-network text encoders and is optimized with the Expectation-Maximization algorithm. It naturally accounts for noisy labels during learning by iteratively updating the model parameters and estimating the distribution of the ground-truth labels. We evaluate our nEM framework on multi-instance noisy MLTC using a benchmark relation extraction dataset constructed by distant supervision, and on single-instance noisy MLTC using synthetic noisy datasets constructed by keyword supervision and label flipping. The experimental results demonstrate that nEM significantly improves upon baseline models in both single-instance and multi-instance noisy MLTC tasks. The analysis suggests that our nEM framework efficiently reduces the noisy labels in MLTC datasets and significantly improves model performance.
Co-authors: Junfan Chen, Richong Zhang, Jie Xu, Chunming Hu, Yongyi Mao
Indexed by: International journal
Page Number: 10992-11003

Translation or Not: No

Date of Publication: 2023-01-01
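
The abstract above describes an EM-style procedure that alternates between estimating the distribution of the ground-truth labels and updating a neural text encoder. The sketch below is a minimal illustration of that general idea, not the authors' nEM implementation: the synthetic data, the two-layer encoder, the symmetric label-flipping noise model, and the assumed flip_rate are all simplifying assumptions introduced for this example, and PyTorch is assumed as the framework.

```python
# A minimal, illustrative EM-style training loop for noisy multi-label
# classification. NOT the authors' nEM implementation: the synthetic data,
# the two-layer encoder, the symmetric label-flipping noise model, and the
# assumed flip_rate are simplifying assumptions for this sketch.
import torch
import torch.nn as nn

torch.manual_seed(0)

N, D, L = 512, 64, 5                      # examples, feature dim, labels
X = torch.randn(N, D)                     # stand-in for encoded text features
clean = (X @ torch.randn(D, L) > 0.5).float()       # hypothetical true labels
flip = (torch.rand(N, L) < 0.2).float()             # 20% label flips (assumed)
noisy = clean * (1 - flip) + (1 - clean) * flip     # observed noisy labels

encoder = nn.Sequential(nn.Linear(D, 128), nn.ReLU(), nn.Linear(128, L))
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
flip_rate = 0.2                           # assumed known noise rate (E-step)

for em_round in range(5):
    # E-step: estimate the posterior of the clean label given the observed
    # noisy label and the model's current prediction, under a symmetric
    # label-flipping noise model:  q = P(clean=1 | noisy, x).
    with torch.no_grad():
        p = torch.sigmoid(encoder(X))                     # P(clean=1 | x)
        lik1 = (noisy * (1 - flip_rate) + (1 - noisy) * flip_rate) * p
        lik0 = (noisy * flip_rate + (1 - noisy) * (1 - flip_rate)) * (1 - p)
        q = lik1 / (lik1 + lik0 + 1e-8)                   # soft targets

    # M-step: update the encoder to fit the soft posterior targets.
    for _ in range(100):
        opt.zero_grad()
        loss = nn.functional.binary_cross_entropy_with_logits(encoder(X), q)
        loss.backward()
        opt.step()
    print(f"EM round {em_round}: final M-step loss {loss.item():.4f}")
```

In this toy setting the soft targets q move toward the clean labels over successive EM rounds, which mirrors, at a very high level, how an EM-based framework can denoise weakly supervised multi-label data while training the encoder.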