Natural Language Generation
Lecture 1 – General Introduction
Introduction




What is Natural Language Generation?
• Natural Language Generation: automatic generation of text in any natural language
• This can take place in different settings:
   o Text-to-text (e.g. automatic summarisation, machine translation: something in language A as input, something in language B as output)
   o Data-to-text (e.g. summarising tables of sports or weather data, summarising patient data)
   o Media-to-text (e.g. captioning images, describing videos)
   o Open-ended (“creative”?) generation (e.g. generating stories based on prompts: tell me a story about xyz)
• Current state of the art: deep neural networks (Transformers) offer a unified framework in which to deal with all of these (see the sketch below).
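To make the text-to-text setting concrete, here is a minimal sketch (not from the lecture) using the Hugging Face transformers library, which wraps the Transformer models mentioned above; the specific model names are just commonly used examples, not ones prescribed by the course.

```python
from transformers import pipeline

# Text-to-text: automatic summarisation (text in, shorter text out).
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
article = ("The home team dominated possession for most of the match and finally "
           "scored in the 87th minute, winning 1-0 thanks to a late corner.")
print(summarizer(article, max_length=25, min_length=5)[0]["summary_text"])

# Text-to-text: machine translation (something in language A in, language B out).
translator = pipeline("translation_en_to_de", model="t5-small")
print(translator("A street organ on a city street.")[0]["translation_text"])
```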





• There is a classic distinction, which is sometimes left implicit:
• Strategic choices: what to say (e.g. street, organ, people)
   o Based on the input
   o Based on additional knowledge (what you already know)
   o Based on the target language
• Tactical choices: how to say it → highly dependent on the language (“A street organ on a city street” / “Een traditioneel draaiorgel in Utrecht”); see the sketch after this list
• Originally proposed by Thompson, and featured in several architectures for (human) production and (automatic) generation.
• The same football match can be described entirely differently depending on whose side you’re on / the perspective taken.
• Hallucination: the model generates something not supported by the input, e.g. predicting hail because the data contains parts about showers and comparable weather conditions.
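A toy sketch of how the strategic/tactical split could look if coded explicitly; all function names and the message representation are invented for illustration, and the templates simply reproduce the lecture’s street-organ example.

```python
# All names and representations here are invented, purely to illustrate the split.

def strategic_choices(scene, audience_knowledge):
    """What to say: select content based on the input and what the audience already knows."""
    messages = []
    for obj in scene["objects"]:
        if obj not in audience_knowledge:          # don't state what is already known
            messages.append({"entity": obj, "location": scene["location"]})
    return messages

def tactical_choices(message, language):
    """How to say it: the wording is highly language-dependent (cf. the lecture example)."""
    if message["entity"] == "street_organ":
        if language == "en":
            return "A street organ on a city street."
        if language == "nl":
            return f"Een traditioneel draaiorgel in {message['location']}."
    raise ValueError("No realisation rule for this message/language")

scene = {"objects": ["street_organ"], "location": "Utrecht"}
for msg in strategic_choices(scene, audience_knowledge=set()):
    print(tactical_choices(msg, "en"))   # -> A street organ on a city street.
    print(tactical_choices(msg, "nl"))   # -> Een traditioneel draaiorgel in Utrecht.
```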

3 dimensions to consider when generating text





Lecture 2 – What are the subtasks involved in generating text?
The classic pipeline architecture for NLG and its sub-tasks
• What is involved in NLG? It’s all about choices.
• Modular versus end-to-end
   o A modular architecture breaks down the main task into sub-tasks, modelling each one separately. This was the dominant approach in “classical” (pre-neural) NLG systems.
      ▪ The system moves from the input through a sequence of steps, breaking big tasks into subtasks.
   o In end-to-end models, there may be no (or fewer) explicit subtasks. This does not mean that the choices are not made; they are simply made implicitly inside the model.
   o A classic approach to NLG breaks the generation process into stages, such as content selection, rhetorical structuring, ordering, lexicalization, aggregation, referring expressions, and syntactic planning. These stages can be implemented using either modular architectures, where each sub-task is modeled separately, or end-to-end models, which integrate multiple tasks into a single framework. Both approaches have their advantages and trade-offs.
• The early “consensus”
   o Reiter, and later Reiter and Dale, argued that the various tasks can be grouped into a three-stage pipeline. Their architecture represented a “consensus” view.




   o Pipeline: you start with an input → then you have some communicative goal (many systems are designed to inform people about something, but the goal could also be to entertain) → the document planner chooses what to say and structures those messages, which are not yet linguistic, into a document plan, including the relationships between them → in the microplanning stage the document plan is fleshed out in a more linguistic way → the surface realiser produces the actual text (a minimal sketch follows below).
   o Domain knowledge is important: how you structure a document to report about e.g. a football match is governed by knowledge of conventions.
   o Also, who you are generating for matters (doctor vs nurse vs family member) → what lexical/grammatical knowledge do you assume?
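A minimal, hypothetical sketch of the three-stage pipeline described above (document planner → microplanner → surface realiser); the data format, function names, and rules are invented for illustration, not taken from any real system.

```python
# Invented data format, function names, and rules; a sketch, not a real system.

def document_planner(match_data, communicative_goal):
    """Strategic stage: choose what to say and structure it (the plan is not yet linguistic)."""
    events = [e for e in match_data if e["type"] in ("goal", "card")]   # content selection
    events.sort(key=lambda e: e["minute"])                              # ordering
    return {"goal": communicative_goal, "messages": events}

def microplanner(document_plan):
    """Flesh the plan out linguistically: lexical choice and simple sentence plans."""
    sentence_plans = []
    for e in document_plan["messages"]:
        verb = "scored" if e["type"] == "goal" else "was booked"        # lexical choice
        sentence_plans.append({"subject": e["player"], "verb": verb, "minute": e["minute"]})
    return sentence_plans

def surface_realiser(sentence_plans):
    """Tactical stage: produce the actual text (word order, punctuation)."""
    return " ".join(f"{s['subject']} {s['verb']} in minute {s['minute']}."
                    for s in sentence_plans)

match_data = [
    {"type": "goal", "player": "Jansen", "minute": 87},
    {"type": "card", "player": "Smit", "minute": 12},
]
plan = document_planner(match_data, communicative_goal="inform")
print(surface_realiser(microplanner(plan)))
# -> Smit was booked in minute 12. Jansen scored in minute 87.
```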





   o Strategic tasks (what to say):
      ▪ What information to include (e.g. what players are wearing in a football match might not be important), depending on how much you assume your user knows
      ▪ Rhetorical structuring
      ▪ Ordering
      ▪ Segmentation: some things you can merge (this person scored a goal, but if there was a tackle just before, you also include that part)
   o Tactical tasks (how to say it):
      ▪ What words to use
      ▪ How to refer to things
      ▪ Aggregation: some sentences are merged to help the narrative flow
      ▪ Syntactic structure
      ▪ Morphological rules: rules at the level of the word (e.g. changing the form of a verb); see the sketch after this list
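A toy illustration of two of the tactical sub-tasks listed above, aggregation and a simple morphological rule; the helper names, message format, and rules are invented for illustration.

```python
# Hypothetical sketch of two tactical sub-tasks: aggregation (merging related
# messages into one sentence) and a morphological rule (verb agreement).

def inflect(verb, person, number):
    """Toy morphological rule: English present-tense 3rd person singular -s."""
    return verb + "s" if (person == 3 and number == "sg") else verb

def aggregate(sentence_plans):
    """Merge sentences that share a subject, to help the narrative flow."""
    merged = {}
    for plan in sentence_plans:
        merged.setdefault(plan["subject"], []).append(plan["predicate"])
    return [{"subject": subj, "predicates": preds} for subj, preds in merged.items()]

def realise(aggregated):
    texts = []
    for item in aggregated:
        verb_phrases = [f"{inflect(v, 3, 'sg')} {obj}" for v, obj in item["predicates"]]
        texts.append(f"{item['subject']} " + " and ".join(verb_phrases) + ".")
    return " ".join(texts)

plans = [
    {"subject": "Jansen", "predicate": ("win", "the tackle")},
    {"subject": "Jansen", "predicate": ("score", "a goal")},
]
print(realise(aggregate(plans)))
# -> Jansen wins the tackle and scores a goal.
```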
• The case of raw input data
   o Some NLG systems have to deal with raw, unstructured data. This means that prior to generating text, the data has to be analysed in order to:
      1. Identify the important things and filter out noise
      2. Map the data to appropriate input representations
      3. Perform some reasoning on these representations
   o e.g. image captioning: the raw input is pixels
   o Pre-processing is needed to figure out what the objects in the image are
• Extending the original architecture to handle data pre-processing
   o Reiter (2007) proposed to extend the “consensus” architecture to deal with the preliminary stages of:
      1. Signal analysis: to extract patterns and trends from unstructured input data;
      2. Data interpretation: to perform reasoning on the results (sketched below)
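A hypothetical sketch of the two preliminary stages Reiter (2007) places in front of the pipeline, applied to a toy temperature series; the thresholds and message format are made up for illustration.

```python
# Invented thresholds and message format; a sketch of the two preliminary stages only.

def signal_analysis(readings):
    """Extract a simple pattern (overall trend) from raw, noisy sensor data."""
    slope = (readings[-1] - readings[0]) / (len(readings) - 1)
    if abs(slope) < 0.1:
        return {"trend": "stable", "slope": slope}
    return {"trend": "rising" if slope > 0 else "falling", "slope": slope}

def data_interpretation(pattern):
    """Reason about the extracted pattern to decide what it means for the reader."""
    if pattern["trend"] == "rising" and pattern["slope"] > 0.5:
        return {"event": "rapid_increase", "severity": "high"}
    if pattern["trend"] == "stable":
        return {"event": "no_change", "severity": "low"}
    return {"event": "gradual_change", "severity": "medium"}

# Example: hourly temperature readings feeding a weather report generator.
temperatures = [12.0, 12.4, 13.1, 13.9, 15.2, 16.8]
pattern = signal_analysis(temperatures)
message = data_interpretation(pattern)
print(pattern, message)
# The resulting messages would then be passed on to the document planner.
```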




