Learning Lexical Subspaces in a Distributional Vector Space
Kushal Arora, Aishik Chakraborty, Jackie Chi Kit Cheung
Track: Semantics: Lexical (TACL Paper)
Session 4B: Jul 6, 18:00-19:00 GMT
Session 5B: Jul 6, 21:00-22:00 GMT
Abstract:
In this paper, we propose LEXSUB, a novel approach to unifying lexical and distributional semantics. We inject knowledge about lexical-semantic relations into distributional word embeddings by defining subspaces of the distributional vector space in which a lexical relation should hold. Our framework can handle symmetric attract and repel relations (e.g., synonymy and antonymy, respectively), as well as asymmetric relations (e.g., hypernymy and meronymy). In a suite of intrinsic benchmarks, we show that our model outperforms previous post-hoc approaches on relatedness tasks and on hypernymy classification and detection, while remaining competitive on word similarity tasks. It also outperforms previous systems on extrinsic classification tasks that benefit from exploiting lexical relational cues. We perform a series of analyses to understand the behavior of our model.
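To make the subspace idea concrete, here is a minimal sketch of how a relation-specific subspace with attract and repel constraints could look. It assumes the subspace is modeled as a learned linear projection and uses margin-based hinge losses; all names, dimensions, and the random vectors are illustrative assumptions, not the paper's actual implementation or trained embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, sub_dim = 50, 10

# Hypothetical word embeddings (random stand-ins, not trained vectors).
v_happy = rng.normal(size=dim)
v_glad = rng.normal(size=dim)   # a synonym of "happy" (attract)
v_sad = rng.normal(size=dim)    # an antonym of "happy" (repel)

# A "synonymy subspace" modeled as a learned linear projection (assumed form).
W_syn = rng.normal(size=(sub_dim, dim))

def subspace_dist(u, v, W):
    """Distance between two word vectors after projecting into the subspace W."""
    return np.linalg.norm(W @ u - W @ v)

margin = 1.0
# Attract loss: synonyms should be close to each other in the synonymy subspace.
attract_loss = max(0.0, subspace_dist(v_happy, v_glad, W_syn) - margin)
# Repel loss: antonyms should be at least `margin` apart in the same subspace.
repel_loss = max(0.0, margin - subspace_dist(v_happy, v_sad, W_syn))

print(attract_loss, repel_loss)
```

In a full model, losses of this shape (one per lexical relation, each with its own projection) would be summed with the usual distributional objective and minimized jointly, so the constraints reshape the embeddings only within each relation's subspace.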
You can open the pre-recorded video in a separate window.
Similar Papers
BiRRE: Learning Bidirectional Residual Relation Embeddings for Supervised Hypernymy Detection
Chengyu Wang, Xiaofeng He

Hypernymy Detection for Low-Resource Languages via Meta Learning
Changlong Yu, Jialong Han, Haisong Zhang, Wilfred Ng

A Novel Cascade Binary Tagging Framework for Relational Triple Extraction
Zhepei Wei, Jianlin Su, Yue Wang, Yuan Tian, Yi Chang
