Search engine for discovering works of Art, research articles, and books related to Art and Culture

SCL: Selective Contrastive Learning for Data-driven Zero-shot Relation Extraction

View through CrossRef
Abstract: Relation extraction has evolved from the supervised setting to the zero-shot setting due to the continual emergence of new relations. Some pioneering works handle zero-shot relation extraction by reformulating it into proxy tasks, such as reading comprehension and textual entailment. However, the divergence of these proxy-task formulations from relation extraction itself hinders the acquisition of informative semantic representations, leading to subpar performance. In this paper, we therefore take a data-driven view and handle zero-shot relation extraction under a three-step paradigm: encoder training, relation clustering, and summarization. Specifically, to train a discriminative relational encoder, we propose a novel selective contrastive learning framework, SCL, in which selective importance scores are assigned to distinguish the importance of different negative contrastive instances. During testing, the prompt-based encoder maps test samples into representation vectors, which are then clustered into groups. Typical samples closest to each cluster centroid are selected for summarization to generate the predicted relation for all samples in the cluster. Moreover, we design a simple non-parametric threshold plugin to reduce false-positive errors when inferring on unseen relation representations. Our experiments demonstrate that SCL outperforms the current state-of-the-art method by over 3% across all metrics.
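The core training idea — a contrastive loss in which each negative instance is scaled by a selective importance score — can be sketched as below. This is a minimal illustration only: the hardness-based weighting (softmax over negative similarities) is an assumption for the example, not the paper's actual scoring function.

```python
import torch
import torch.nn.functional as F

def selective_contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss where each negative is scaled by an importance
    score. Here the score is a softmax over negative-anchor similarities,
    so harder negatives weigh more (an illustrative choice)."""
    anchor = F.normalize(anchor, dim=-1)        # (d,)
    positive = F.normalize(positive, dim=-1)    # (d,)
    negatives = F.normalize(negatives, dim=-1)  # (n, d)

    pos_sim = (anchor @ positive) / temperature   # scalar
    neg_sim = (negatives @ anchor) / temperature  # (n,)

    # Selective importance scores; detached so the weights themselves
    # carry no gradient.
    weights = torch.softmax(neg_sim, dim=0).detach()

    denom = pos_sim.exp() + (weights * neg_sim.exp()).sum()
    return -(pos_sim - denom.log())
```

With uniform weights this reduces to standard InfoNCE; the selective scores reweight the denominator so uninformative negatives contribute less.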
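The inference side of the paradigm — cluster the encoded test samples, pick the samples nearest each centroid as "typical" candidates for relation summarization, and reject samples far from every centroid via a distance threshold — can be sketched as follows. The tiny k-means loop is a stand-in for any off-the-shelf clustering step, and the threshold rule is an illustrative reading of the non-parametric plugin, not the paper's exact formulation.

```python
import numpy as np

def kmeans(x, k, iters=20, seed=0):
    """Minimal k-means over representation vectors (illustrative)."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), size=k, replace=False)].copy()
    for _ in range(iters):
        d = np.linalg.norm(x[:, None] - centers[None], axis=-1)  # (n, k)
        labels = d.argmin(axis=1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = x[labels == c].mean(axis=0)
    return labels, centers

def typical_and_reject(x, centers, n_typical=2, threshold=None):
    """Select the n_typical samples nearest each centroid for
    summarization; optionally flag samples whose distance to every
    centroid exceeds a threshold as likely false positives."""
    d = np.linalg.norm(x[:, None] - centers[None], axis=-1)  # (n, k)
    typical = {c: np.argsort(d[:, c])[:n_typical].tolist()
               for c in range(len(centers))}
    rejected = (np.where(d.min(axis=1) > threshold)[0].tolist()
                if threshold is not None else [])
    return typical, rejected
```

A relation name would then be summarized from the typical samples of each cluster and propagated to every sample the cluster contains, while rejected samples are withheld from prediction.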

Related Results

Selection of Injectable Drug Product Composition using Machine Learning Models (Preprint)
BACKGROUND As of July 2020, a Web of Science search of “machine learning (ML)” nested within the search of “pharmacokinetics or pharmacodynamics” yielded over 100...
Anti-Scl-70 Antibodies in Autoimmune Hypothyroidism
The relationship between autoimmune thyroiditis and systemic sclerosis is controversial. Data exist on the presence of thyroid autoantibodies in patients with systemic sclerosis bu...
Utilizing Large Language Models for Geoscience Literature Information Extraction
Extracting information from unstructured and semi-structured geoscience literature is a crucial step in conducting geological research. The traditional machine learning extraction ...
Difference in Threshold between Sono- and Sonochemical Luminescence
The difference in threshold between sonoluminescence (SL) and sonochemical luminescence (SCL) has been investigated. The intensity of both SL from distilled water and SCL f...
Contrastive Distillation Learning with Sparse Spatial Aggregation
Abstract Contrastive learning has advanced significantly and demonstrates excellent transfer learning capabilities. Knowledge distillation is one of the most effective meth...
Analyzing Data Augmentation Techniques for Contrastive Learning in Recommender Models
This paper investigates the application of contrastive learning-based user and item representation learning in recommendation systems. A recommendation model combining contrastive ...
Improving Neural Retrieval with Contrastive Learning
In recent years, neural retrieval models have shown remarkable progress in improving the efficiency and accuracy of information retrieval systems. However, challenges remain in eff...
