Improving Neural Abstractive Text Summarization
with Prior Knowledge
Gaetano Rossiello, Pierpaolo Basile, Giovanni Semeraro, Marco Di Ciano and
Gaetano Grasso
gaetano.rossiello@uniba.it
Department of Computer Science
University of Bari - Aldo Moro, Italy
URANIA 16 - 1st Italian Workshop on Deep Understanding and Reasoning:
A challenge for Next-generation Intelligent Agents
28 November 2016
AI*IA 16 - Genoa, Italy
Text Summarization
The goal of summarization is to produce a shorter version of a source
text while preserving the meaning and the key content of the original.
A well-written summary can significantly reduce the amount of
cognitive work needed to digest large amounts of text.
Information Overload
Information overload is a problem of modern digital society, caused
by the explosion in the amount of information produced both on the
World Wide Web and in enterprise environments.
Text Summarization - Approaches
Input
Single-document
Multi-document
Output
Extractive
Abstract
Headline
Extractive Summarization
The generated summary is a selection of relevant sentences from the
source text in a copy-paste fashion.
Abstractive Summarization
The generated summary is a new cohesive text not necessarily present
in the original source.
Extractive Summarization - Methods
Statistical methods
Feature-based
Machine learning
Fuzzy logic
Graph-based
Distributional semantics
LSA (Latent Semantic Analysis)
NMF (Non-Negative Matrix Factorization)
Word2Vec
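As an illustration of the distributional-semantics family, here is a minimal LSA-based extractive scorer (a sketch using scikit-learn; the toy corpus and the number of latent topics are hypothetical, not taken from any cited system):

```python
# Rank sentences by their weight in the top latent topics of the document,
# then copy-paste the best one: a sketch, not the method of a cited system.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

sentences = [
    "Russian defense minister Ivanov called for a joint front.",
    "The meeting took place on Sunday.",
    "Global terrorism was the main topic of the talks.",
]

tfidf = TfidfVectorizer().fit_transform(sentences)          # sentence-term matrix
topics = TruncatedSVD(n_components=2).fit_transform(tfidf)  # latent topic space
scores = np.linalg.norm(topics, axis=1)                     # salience per sentence

print(sentences[int(np.argmax(scores))])   # extractive one-sentence summary
```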
Abstractive Summarization: a Challenging Task
Abstractive summarization requires deep understanding of and
reasoning over the text: determining the explicit or implicit meaning
of each element, such as words, phrases, sentences and paragraphs,
and making inferences about their properties (Norvig, P.: Inference in
text understanding. AAAI, 1987) in order to generate the new
sentences which compose the summary.
– Abstractive Example –
Original: Russian defense minister Ivanov called Sunday for the
creation of a joint front for combating global terrorism.
Summary: Russia calls for joint front against terrorism.
Deep Learning for Abstractive Text Summarization
Idea
Casting the summarization task as a neural machine translation
problem, where the models, trained on a large amount of data,
learn the alignments between the input text and the target summary
through an attention-based encoder-decoder paradigm.
Rush, A., et al. A neural attention model for abstractive
sentence summarization. EMNLP 2015
Nallapati, R., et al. Abstractive Text Summarization Using
Sequence-to-Sequence RNNs and Beyond. CoNLL 2016
Chopra, S., et al. Abstractive Sentence Summarization
with Attentive Recurrent Neural Networks. NAACL 2016
Deep Learning for Abstractive Text Summarization
[Figure: the attention-based encoder-decoder model of Rush, A., et al.:
A neural attention model for abstractive sentence summarization. EMNLP 2015]
Abstractive Summarization - Problem Formulation
Let us consider:
Original text x = {x_1, x_2, . . . , x_n}
Summary y = {y_1, y_2, . . . , y_m}
where n ≫ m and x_i, y_j ∈ V (V is the vocabulary)
A probabilistic perspective
The summarization problem consists in finding an output sequence
y that maximizes the conditional probability of y given an input
sequence x
arg max_y P(y | x)   (y ranging over sequences of words in V)

P(y | x; θ) = ∏_{t=1}^{|y|} P(y_t | {y_1, . . . , y_{t−1}}, x; θ)
where θ denotes a set of parameters learnt from a training set of
source text and target summary pairs.
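A minimal sketch of this factorization (the per-token probabilities below are hypothetical numbers standing in for a trained model's outputs):

```python
import math

def sequence_log_prob(step_probs):
    """log P(y | x; theta) as the sum of the per-token log-probabilities
    P(y_t | y_1..y_{t-1}, x; theta) produced by some trained model."""
    return sum(math.log(p) for p in step_probs)

# Hypothetical per-token probabilities for a 4-token summary.
print(sequence_log_prob([0.4, 0.7, 0.9, 0.5]))   # log of the product
```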
Recurrent Neural Networks
A recurrent neural network (RNN) is a neural network model
proposed in the 1980s for modelling time series.
The structure of the network is similar to that of a feedforward
neural network, with the distinction that it allows a recurrent hidden
state whose activation at each time step depends on that of the
previous time step (a cycle).
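A minimal sketch of that recurrence (plain NumPy; the input and hidden sizes are hypothetical):

```python
import numpy as np

def elman_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrent step: the hidden state at time t depends on the
    current input and on the hidden state of the previous time step."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
I, H = 8, 16   # hypothetical input and hidden sizes
W_xh, W_hh, b_h = rng.normal(size=(I, H)), rng.normal(size=(H, H)), np.zeros(H)

h = np.zeros(H)
for x_t in rng.normal(size=(5, I)):   # a time series of 5 steps
    h = elman_step(x_t, h, W_xh, W_hh, b_h)
```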
Sequence to Sequence Learning
The sequence-to-sequence learning problem can be modeled by
RNNs using an encoder-decoder paradigm.
The encoder is an RNN that reads one token at a time from the
input source and returns a fixed-size vector representing the
input text.
The decoder is another RNN that generates the words of the
summary and is conditioned on the vector representation
returned by the first network.
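A minimal sketch of the paradigm (plain NumPy; the sizes, source length and conditioning scheme are illustrative assumptions, not the exact architecture of the cited models):

```python
import numpy as np

rng = np.random.default_rng(0)
E, H = 8, 16   # hypothetical embedding and hidden sizes

def rnn_step(x, h, W_x, W_h):
    return np.tanh(x @ W_x + h @ W_h)

# Encoder: read the (embedded) source one token at a time; the final
# hidden state is the fixed-size vector c representing the whole input.
We_x, We_h = rng.normal(size=(E, H)), rng.normal(size=(H, H))
h = np.zeros(H)
for x in rng.normal(size=(7, E)):          # 7 embedded source tokens
    h = rnn_step(x, h, We_x, We_h)
c = h

# Decoder: another RNN, conditioned at every step on c.  y_prev would
# be the embedding of the previously generated word; zeros here.
Wd_y, Wd_h, Wd_c = (rng.normal(size=(E, H)),
                    rng.normal(size=(H, H)),
                    rng.normal(size=(H, H)))
y_prev, s = np.zeros(E), np.zeros(H)
for _ in range(3):                         # generate 3 summary steps
    s = np.tanh(y_prev @ Wd_y + s @ Wd_h + c @ Wd_c)
    # a real model would map s to a word distribution here
```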
Abstractive Summarization and Sequence to Sequence
P(y_t | {y_1, . . . , y_{t−1}}, x; θ) = g_θ(h_t, c)
where the decoder hidden state is updated as:
h_t = f_θ(y_{t−1}, h_{t−1}, c)
The context vector c is the output of the encoder and
encodes the representation of the whole input source.
The recurrence f_θ is an RNN and can be modeled using:
Elman RNN
LSTM (Long Short-Term Memory)
GRU (Gated Recurrent Unit)
At time t, the decoder RNN computes the probability of the word
y_t given the last hidden state h_t and the input context c.
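A sketch of that step, with g_θ realized as a softmax over a hypothetical vocabulary, computed from h_t and c (the sizes and the simple concatenation are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 10000, 16   # hypothetical vocabulary and hidden sizes
W_out = rng.normal(size=(2 * H, V)) * 0.01

def decode_step(h_t, c):
    """P(y_t | y_1..y_{t-1}, x): a distribution over the vocabulary,
    computed from the current hidden state h_t and the context c."""
    logits = np.concatenate([h_t, c]) @ W_out
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()

p = decode_step(rng.normal(size=H), rng.normal(size=H))
y_t = int(np.argmax(p))                    # greedy choice of the next word
```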
Limits of the State-of-the-Art Neural Models
The proposed neural attention-based models for abstractive
summarization are still at an early stage, thus they show some
limitations:
Problems in distinguishing rare and unknown words
Grammar errors in the generated summaries
– Example –
Suppose that neither of the two tokens 10 and Genoa belongs to the
vocabulary; then the model cannot distinguish the probabilities of the
two sentences:
The airport is about 10 kilometers.
The airport is about Genoa kilometers.
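The effect is easy to reproduce: with a fixed vocabulary, both sentences map to the same token-ID sequence, so any model scoring those IDs assigns them the same probability (a sketch with a toy vocabulary):

```python
# Toy vocabulary: neither "10" nor "genoa" is in it, so both map to <unk>.
vocab = {"<unk>": 0, "the": 1, "airport": 2, "is": 3, "about": 4, "kilometers": 5}

def encode(sentence):
    return [vocab.get(w, vocab["<unk>"]) for w in sentence.lower().split()]

a = encode("The airport is about 10 kilometers")
b = encode("The airport is about Genoa kilometers")
assert a == b   # identical token IDs, hence identical model probability
```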
Infuse Prior Knowledge into Neural Networks
Our Idea
Infuse prior knowledge, such as linguistic features, into RNNs
in order to overcome these limits.
Motivation:
The/DT airport/NN is/VBZ about/IN ?/CD kilometers/NNS
where CD is the Part-of-Speech (POS) tag that identifies a cardinal
number.
Thus, 10 is the unknown token with the highest probability, because
it is tagged as CD.
By introducing information about the syntactic role of each word, the
neural network can learn the correct collocation of words belonging
to a certain part-of-speech class.
Infuse Prior Knowledge into Neural Networks
Preliminary approach:
Combine hand-crafted linguistic features and embeddings as
input vectors to the RNNs.
Substitute the softmax layer of the neural network with a
log-linear model (a sketch of both steps follows below).
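A minimal sketch of this preliminary approach, under loudly stated assumptions: the POS tag set, the feature layout, all sizes and the identity feature function are hypothetical illustrations, not the actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
E, P, H, V = 8, 4, 16, 1000   # hypothetical sizes: embedding, POS tags, hidden, vocab

# Step 1: combine a learned word embedding with a hand-crafted linguistic
# feature (here a one-hot POS indicator) into a single input vector.
def input_vector(word_emb, pos_id):
    return np.concatenate([word_emb, np.eye(P)[pos_id]])   # shape (E + P,)

# Step 2: a log-linear output model, P(y | h_t) ∝ exp(w_y · f(h_t)),
# in place of the plain softmax layer; f is the identity here, but the
# point of the log-linear form is that f can mix arbitrary features.
W = rng.normal(size=(V, H)) * 0.01   # one weight vector w_y per word

def log_linear(h_t):
    scores = W @ h_t
    e = np.exp(scores - scores.max())
    return e / e.sum()

x_t = input_vector(rng.normal(size=E), pos_id=2)   # e.g. a CD-tagged token
p = log_linear(rng.normal(size=H))                 # distribution over V words
```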
Evaluation Plan - Dataset
We plan to evaluate our models on gold-standard datasets for the
summarization task:
DUC (Document Understanding Conference) 2002-2007 (http://duc.nist.gov/)
TAC (Text Analysis Conference) 2008-2011 (http://tac.nist.gov/data/index.html)
Gigaword (https://catalog.ldc.upenn.edu/LDC2012T21)
CNN/DailyMail (https://github.com/deepmind/rc-data)
Cornell University Library (https://arxiv.org/)
Local government documents (made available by InnovaPuglia S.p.A.)
Evaluation Plan - Metric
ROUGE (Recall-Oriented Understudy for Gisting Evaluation)
ROUGE metrics (Lin, Chin-Yew. ROUGE: A Package for Automatic
Evaluation of Summaries. WAS 2004) compare an automatically produced
summary against a reference summary or a set of human-produced
reference summaries.
ROUGE-N: N-gram based co-occurrence statistics.
ROUGE-L: Longest Common Subsequence (LCS) based
statistics.
ROUGE-N(X) = [ Σ_{S ∈ {Ref Summaries}} Σ_{gram_n ∈ S} count_match(gram_n, X) ] / [ Σ_{S ∈ {Ref Summaries}} Σ_{gram_n ∈ S} count(gram_n) ]
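A minimal sketch of ROUGE-N recall as defined above (whitespace tokenization and the example strings are simplifying assumptions):

```python
from collections import Counter

def rouge_n(candidate, references, n=2):
    """ROUGE-N recall: clipped matching n-grams over total reference n-grams."""
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    cand = ngrams(candidate.split())
    match = total = 0
    for ref in references:
        ref_counts = ngrams(ref.split())
        match += sum(min(c, cand[g]) for g, c in ref_counts.items())
        total += sum(ref_counts.values())
    return match / total if total else 0.0

print(rouge_n("russia calls for joint front against terrorism",
              ["russia calls for joint front to fight terrorism"]))
```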
Future Works
Evaluate the proposed approach by comparing it with the
state-of-the-art models.
Integrate relational semantic knowledge into RNNs in order
to jointly learn word and knowledge embeddings by exploiting
knowledge bases and lexical thesauri.
Generate abstractive summaries from whole documents or multiple
documents.