Personal Homepage

Personal Information


Associate Professor

Supervisor of Master's Candidates

E-Mail:

Date of Employment: 2025-05-21

School/Department: School of Software

Education Level: Doctoral (Ph.D.)

Business Address: New Main Building C808, G517

Gender: Male

Contact Information: 18810578537

Degree: Ph.D.

Status: Employed

Alma Mater: Beihang University

Discipline: Software Engineering; Computer Science and Technology

Junfan Chen


Paper

A Hierarchical N-Gram Framework for Zero-Shot Link Prediction

Journal: Findings of the Association for Computational Linguistics: EMNLP 2022 (EMNLP)
Abstract: Knowledge graphs typically contain a large number of entities but often cover only a fraction of all relations between them (i.e., incompleteness). Zero-shot link prediction (ZSLP) is a popular way to tackle the problem by automatically identifying unobserved relations between entities. Most recent approaches use textual features of relations (e.g., surface names or textual descriptions) as auxiliary information to improve the encoded representation. These methods lack robustness as they are bound to support only tokens from a fixed vocabulary and are unable to model out-of-vocabulary (OOV) words. Subword units such as character n-grams have the capability of generating more expressive representations for OOV words. Hence, in this paper, we propose a Hierarchical N-gram framework for Zero-Shot Link Prediction (HNZSLP) that leverages character n-gram information for ZSLP. Our approach works by first constructing a hierarchical n-gram graph from the surface name of relations. Subsequently, a new Transformer-based network models the hierarchical n-gram graph to learn a relation embedding for ZSLP. Experimental results show that our proposed HNZSLP method achieves state-of-the-art performance on two standard ZSLP datasets.
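The core idea in the abstract, decomposing a relation's surface name into character n-grams arranged hierarchically, can be illustrated with a minimal sketch. This is a hypothetical illustration of the general technique only, not the authors' HNZSLP implementation: here each n-gram at level n is linked to the two (n-1)-grams it contains, so OOV relation names still decompose into shared subword units.

```python
def char_ngrams(text, n):
    """All character n-grams of length n in text, in order."""
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def hierarchical_ngram_graph(surface_name, max_n=3):
    """Return (nodes, edges): nodes grouped by n-gram length, and
    edges from each n-gram down to its two constituent (n-1)-grams.
    Illustrative only; the paper's actual graph construction may differ."""
    nodes = {n: char_ngrams(surface_name, n) for n in range(1, max_n + 1)}
    edges = []
    for n in range(2, max_n + 1):
        for gram in nodes[n]:
            edges.append((gram, gram[:-1]))  # left (n-1)-gram
            edges.append((gram, gram[1:]))   # right (n-1)-gram
    return nodes, edges

nodes, edges = hierarchical_ngram_graph("works_for", max_n=3)
print(nodes[2][:3])  # → ['wo', 'or', 'rk']
```

A graph like this can then be fed to a graph-aware Transformer encoder to produce a relation embedding, which is the role the abstract assigns to the proposed network.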
Co-authors: Mingchen Li, Junfan Chen, Samuel Mensah, Nikolaos Aletras, Xiulong Yang, Yang Ye
Indexed by: International academic conference
Page Number: 2498-2509
Translation or Not: no
Date of Publication: 2022-01-01