We are very pleased to announce that our group has had a paper accepted for presentation at the 2019 edition of the NAACL conference, which will be held on June 2–7, 2019, in Minneapolis, USA.
NAACL aims to bring together researchers interested in the design and study of natural language processing technology as well as its applications to new problem areas. With this goal in mind, the 2019 edition invites the submission of long and short papers on creative, substantial, and unpublished research in all aspects of computational linguistics. The conference features a diverse technical program: in addition to traditional research results, papers may present negative findings, survey an area, announce the creation of a new resource, argue a position, report novel linguistic insights derived using existing techniques, or reproduce, or fail to reproduce, previous results.
Here is a pre-print of the accepted paper, along with its abstract:
- Old is Gold: Linguistic Driven Approach for Entity and Relation Linking of Short Text by Ahmad Sakor, Isaiah Onando Mulang’, Kuldeep Singh, Saeedeh Shekarpour, Maria Esther Vidal, Jens Lehmann, and Sören Auer.
Abstract: Short texts challenge NLP tasks such as named entity recognition, disambiguation, linking, and relation inference because they do not provide sufficient context or are partially malformed (e.g. with respect to capitalization, long-tail entities, implicit relations). In this work, we present the Falcon approach, which effectively maps entities and relations within a short text to their mentions in a background knowledge graph. Falcon overcomes the challenges of short text using a lightweight linguistic approach relying on a background knowledge graph. Falcon performs joint entity and relation linking of a short text by leveraging several fundamental principles of English morphology (e.g. compounding, headword identification) and utilizes an extended knowledge graph created by merging entities and relations from various knowledge sources. It uses the context of entities for finding relations and does not require training data. Our empirical study using several standard benchmarks and datasets shows that Falcon significantly outperforms state-of-the-art entity and relation linking approaches on short text query inventories.
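For readers curious how a training-free, compounding-based linking step of this kind can look in practice, below is a minimal, purely illustrative Python sketch. It is not the authors' Falcon implementation: the tiny entity and relation indexes, the identifiers, and the greedy longest-compound-first matching are all hypothetical stand-ins for the extended knowledge graph and the morphology-based rules described in the abstract.

```python
# Toy sketch (hypothetical, not Falcon itself): joint entity and relation linking of a
# short text against a tiny mock knowledge graph, using compound (n-gram) candidates
# matched longest-first. All surface forms and identifiers below are made up.

# Hypothetical background "knowledge graph": surface forms mapped to entity/relation IDs.
ENTITY_INDEX = {
    "barack obama": "wd:Q76",
    "united states": "wd:Q30",
}
RELATION_INDEX = {
    "president of": "wdt:P39",
    "born in": "wdt:P19",
}

def candidate_spans(tokens, max_len=3):
    """Generate contiguous token spans (compounds) up to max_len tokens, longest first."""
    spans = []
    for length in range(min(max_len, len(tokens)), 0, -1):
        for start in range(len(tokens) - length + 1):
            spans.append(" ".join(tokens[start:start + length]))
    return spans

def link(text):
    """Greedily link the longest matching entity and relation surface forms in the text."""
    tokens = text.lower().replace("?", "").split()
    entities, relations = [], []
    for span in candidate_spans(tokens):
        if span in ENTITY_INDEX:
            entities.append((span, ENTITY_INDEX[span]))
        elif span in RELATION_INDEX:
            relations.append((span, RELATION_INDEX[span]))
    return entities, relations

if __name__ == "__main__":
    ents, rels = link("Who is the president of the United States?")
    print("entities:", ents)   # [('united states', 'wd:Q30')]
    print("relations:", rels)  # [('president of', 'wdt:P39')]
```

The sketch only illustrates the general shape of dictionary-based joint linking over compounds; the actual approach in the paper additionally relies on headword identification, the context of linked entities, and a large merged knowledge graph rather than hand-written indexes.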
Acknowledgment
This work was partially funded by Fraunhofer IAIS and the EU H2020 project IASIS.
Looking forward to seeing you at the NAACL 2019 conference.