A method to utilize prior knowledge for extractive summarization based on pre-trained language models
This paper presents a novel model for extractive summarization that integrates context representation from a pre-trained language model (PLM), such as BERT, with prior knowledge derived from unsupervised learning methods. Assessing sentence importance is crucial in extractive summarization, and prior knowledge provides indicators of how important each sentence is within a document. Our model introduces a method for estimating sentence importance from prior knowledge, complementing the contextual representation offered by PLMs like BERT. Unlike previous approaches that relied primarily on PLMs alone, our model leverages both the contextual representation and the prior knowledge extracted from each input document. Conditioning the model on prior knowledge allows it to emphasize key sentences when generating the final summary. We evaluate our model on three benchmark datasets across two languages, demonstrating improved performance over strong baseline methods for extractive summarization. Additionally, our ablation study reveals that injecting knowledge into the first attention layers yields greater benefits than injecting it into later ones. The model code is publicly available for further exploration.
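The abstract does not specify how the prior knowledge is computed or injected, so the following is only a rough illustration of the general idea: an unsupervised, TF-IDF-style sentence importance score (an assumed stand-in for the paper's actual unsupervised method) is added as a bias to sentence-level attention logits before the softmax. All function names and the blending parameter `beta` are hypothetical.

```python
import math
import re
from collections import Counter

def sentence_importance(sentences):
    """TF-IDF-style unsupervised importance score per sentence, scaled to [0, 1].

    A stand-in for the unspecified unsupervised prior-knowledge method.
    """
    token_lists = [re.findall(r"\w+", s.lower()) for s in sentences]
    df = Counter()  # document frequency: in how many sentences each word appears
    for tokens in token_lists:
        for word in set(tokens):
            df[word] += 1
    n = len(sentences)
    raw = []
    for tokens in token_lists:
        if not tokens:
            raw.append(0.0)
            continue
        tf = Counter(tokens)
        raw.append(sum((c / len(tokens)) * math.log(1 + n / df[w])
                       for w, c in tf.items()))
    top = max(raw) or 1.0
    return [s / top for s in raw]  # normalize so the top sentence scores 1.0

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

def inject_prior(attn_logits, prior, beta=1.0):
    """Bias one row of sentence-level attention logits by prior importance.

    `beta` (hypothetical) controls how strongly the prior shifts attention
    toward sentences the unsupervised method considers important.
    """
    return softmax([l + beta * p for l, p in zip(attn_logits, prior)])
```

In this sketch, a larger `beta` pushes attention mass toward high-prior sentences; applying the bias only in the first layers (as the ablation suggests) would mean calling `inject_prior` for early layers and plain `softmax` elsewhere.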
Publishing House for Science and Technology, Vietnam Academy of Science and Technology (Publications)