
DIBERT: Dependency Injected Bidirectional Encoder Representations from Transformers

In this paper, we propose a new model named DIBERT, which stands for Dependency Injected Bidirectional Encoder Representations from Transformers. DIBERT is a variation of BERT with an additional third objective called Parent Prediction (PP), alongside Masked Language Modeling (MLM) and Next Sentence Prediction (NSP). PP injects the syntactic structure of a dependency tree while pre-training DIBERT, which yields syntax-aware generic representations. We use the WikiText-103 benchmark dataset to pre-train both BERT-Base and DIBERT. After fine-tuning, we observe that DIBERT outperforms BERT-Base on various downstream tasks, including Semantic Similarity, Natural Language Inference, and Sentiment Analysis.
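The abstract does not include code, but the Parent Prediction objective it describes can be read as a per-token classification over candidate parent positions in the dependency tree. The sketch below is an illustration only, not the paper's implementation: the dot-product arc scorer, the function name, and the convention that the root token points at itself are all assumptions.

```python
import numpy as np

def parent_prediction_loss(hidden, parents):
    """Toy Parent Prediction (PP) head: each token scores every
    position in the sentence with a dot product, and the loss is
    the cross-entropy of picking its gold dependency parent.
    `hidden` is (seq_len, dim); `parents[i]` is the index of token
    i's parent (the root points at itself here, an assumption)."""
    scores = hidden @ hidden.T                      # (seq_len, seq_len) arc scores
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)       # softmax over candidate parents
    gold = probs[np.arange(len(parents)), parents]  # probability of each gold parent
    return float(-np.log(gold).mean())

# Toy sentence of 4 tokens with parent indices from a dependency parse.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((4, 8))
parents = np.array([1, 1, 1, 2])  # token 1 is the root (self-arc)
loss = parent_prediction_loss(hidden, parents)
```

In the paper's setting this loss would be added to the MLM and NSP losses during pre-training; here it simply demonstrates that supervising parent choice injects tree structure into the token representations.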
Institute of Electrical and Electronics Engineers (IEEE)

