Abstractive Text Summarisation using T5 Transformer Architecture with analysis
Abstract
Nowadays, text summarization has become important as the amount of text data available online grows at an exponential rate. Most text classification systems require processing huge amounts of data, and producing exact and meaningful summaries of long texts is, in general, a time-consuming endeavour. Hence, generating abstractive summaries that retain the key information of the data, and using them to train machine learning models, makes those models space- and time-efficient. Abstractive text summarization has successfully moved from sparse linear models to nonlinear neural network models [1]. This success comes from the application of deep learning models to natural language processing tasks, where these models are capable of capturing the interrelated patterns in data without hand-crafted features. The Text-to-Text Transfer Transformer (T5) approach was used to investigate the text summarization problem, and the results showed that the transfer-learning-based model performed significantly better for abstractive text summarization than a sequence-to-sequence recurrent model.

