Journal:Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL), CCF-A
Abstract:Multi-Modal Entity Alignment aims to discover identical entities across heterogeneous knowledge graphs. While recent studies have explored fusion paradigms to represent entities holistically, they overlook the elimination of alignment-irrelevant features and of modal inconsistencies, both of which arise from inherent differences among multi-modal features. To address these challenges, we propose a novel progressive modality freezing strategy, called PMF, that focuses on alignment-relevant features and enhances multi-modal feature fusion. Notably, our approach introduces a pioneering cross-modal association loss to foster modal consistency. Empirical evaluations across nine datasets confirm PMF's superiority, demonstrating state-of-the-art performance and the rationale for freezing modalities.
Co-author:Yani Huang, Xuefeng Zhang, Richong Zhang, Junfan Chen, Jaein Kim
Indexed by:International Academic Conference
Page Number:3477-3489
Translation or Not:no
Date of Publication:2024-01-01
