Natural Language Processing (ISBNs 9789357462396, 9789357462389)

Striking a balance between foundational insights and applications, the book introduces three generations of NLP: rule-based, statistical, and neural approaches.

English, 902 pages, 2023

Table of Contents:
1. Cover Page
2. Half Title
3. Title Page
4. Copyright Page
5. Dedication
6. Foreword
7. Preface
8. Author Bios

9. Chapter 1 Introduction
a. 1.1 Language and Linguistics
b. 1.2 Ambiguity and Layers of NLP
c. 1.3 Grammar, Probability, and Data
d. 1.4 Generations of NLP
e. 1.5 Scope of the Book

10. Chapter 2 Representation and NLP
a. 2.1 Ambiguity and Representations
b. 2.2 Generation 1: Belongingness via Grammars
i. 2.2.1 Representing Method Definitions in Python Using a Set of Rules
ii. 2.2.2 Representing Simple English Sentences as a Set of Rules
iii. 2.2.3 Chomsky Hierarchy
iv. 2.2.4 Applications
c. 2.3 Generation 2: Discrete Representational Semantics
i. 2.3.1 n-Gram Vectors
ii. 2.3.2 Caveats
iii. 2.3.3 Limitations
iv. 2.3.4 Statistical Language Models
v. 2.3.5 Use of Statistical Language Modelling
d. 2.4 Generation 3: Dense Representations
i. 2.4.1 Dense Representation of Words
ii. 2.4.2 Neural Language Models
iii. 2.4.3 Bidirectional Encoder Representations from Transformers (BERT)
iv. 2.4.4 XLNet

11. Chapter 3 Shallow Parsing
a. 3.1 Part-of-Speech Tagging
i. 3.1.1 Illustration of Ambiguity in POS Tagging and the -al Rule
ii. 3.1.2 Table Look-Up-Based and Rule-Based POS Tagging
b. 3.2 Statistical POS Tagging
i. 3.2.1 Hidden Markov Model Based Formulation of POS Tagging
ii. 3.2.2 Viterbi Decoding for POS Tagging
iii. 3.2.3 Computational Complexity of Viterbi Decoding
iv. 3.2.4 Parameter Estimation
v. 3.2.5 Discriminative POS Tagging
c. 3.3 Neural POS Tagging
i. 3.3.1 Foundational Considerations
ii. 3.3.2 A Simple POS Tagger Implementation Using Transformer
d. 3.4 Chunking

12. Chapter 4 Deep Parsing
a. 4.1 Linguistics of Parsing
i. 4.1.1 Heads and Modifiers
ii. 4.1.2 Relationship between Constituency and Dependency
iii. 4.1.3 Phrase Structure Grammar Rules
iv. 4.1.4 X-Bar Theory
b. 4.2 Algorithmics of Parsing
i. 4.2.1 Machine Learning and Parsing
c. 4.3 Constituency Parsing: Rule Based
i. 4.3.1 Top-Down Parsing
ii. 4.3.2 Bottom-Up Parsing
iii. 4.3.3 Top-Down–Bottom-Up Chart Parsing
iv. 4.3.4 CYK Parsing
d. 4.4 Statistical Parsing
i. 4.4.1 Computing the Probability of a Parse Tree
ii. 4.4.2 Theory Behind Computing the Probability of a Parse Tree
iii. 4.4.3 CYK Parsing and Probabilities of Constituents
iv. 4.4.4 Need for Efficiency in Computing the Highest Probability Parse Tree
v. 4.4.5 Important Probabilities
e. 4.5 Dependency Parsing
i. 4.5.1 Arguments and Adjuncts
ii. 4.5.2 Algorithmics of Unlabelled Dependency Graph Construction
iii. 4.5.3 Dependency Relations
iv. 4.5.4 Dependency Parsing and Semantic Role Labelling
v. 4.5.5 Projectivity
vi. 4.5.6 Sequence Labelling-Based Dependency Parsing
vii. 4.5.7 Graph-Based Dependency Parsing
f. 4.6 Neural Parsing
i. 4.6.1 Constituency Parsing Using RcNN
ii. 4.6.2 Learning ρ, σ, and λ

13. Chapter 5 Named Entity Recognition
a. 5.1 Problem Formulation
b. 5.2 Ambiguity in Named Entity Recognition
c. 5.3 Datasets
d. 5.4 First Generation: Rule-Based Approaches
e. 5.5 Second Generation: Probabilistic Models
f. 5.6 Third Generation: Sentence Representations and Position-Wise Labelling
g. 5.7 Implications to Other NLP Problems

14. Chapter 6 Natural Language Inference
a. 6.1 Ambiguity in NLI
b. 6.2 Problem Formulation
c. 6.3 Datasets
d. 6.4 First Generation: Logical Reasoning
e. 6.5 Second Generation: Alignment
f. 6.6 Third Generation: Neural Approaches
i. 6.6.1 Attention over Trees

15. Chapter 7 Machine Translation
a. 7.1 Introduction
i. 7.1.1 Ambiguity Resolution in Machine Translation
ii. 7.1.2 RBMT-EBMT-SMT-NMT
iii. 7.1.3 Today’s Ruling Paradigm: Neural Machine Translation
iv. 7.1.4 Ambiguity in Machine Translation: Language Divergence
v. 7.1.5 Vauquois Triangle
b. 7.2 Rule-Based Machine Translation
c. 7.3 Indian Language Statistical Machine Translation
i. 7.3.1 Mitigating the Resource Problem
d. 7.4 Phrase-Based Statistical Machine Translation
i. 7.4.1 Need for Phrase Alignment
ii. 7.4.2 Case of Promotional/Demotional Divergence
iii. 7.4.3 Case of Multiword (Includes Idioms)
iv. 7.4.4 Phrases Are Not Necessarily Linguistic Phrases
v. 7.4.5 Use of the Phrase Table
vi. 7.4.6 Mathematics of Phrase-Based Statistical Machine Translation
vii. 7.4.7 Understanding Phrase-Based Translation Through an Example
e. 7.5 Factor-Based Statistical Machine Translation
f. 7.6 Cooperative NLP: Pivot-Based Machine Translation
g. 7.7 Neural Machine Translation
i. 7.7.1 Encoder–Decoder
ii. 7.7.2 Problem of Long-Distance Dependency
iii. 7.7.3 Attention
iv. 7.7.4 NMT Using Transformers

16. Chapter 8 Sentiment Analysis
a. 8.1 Problem Statement
b. 8.2 Ambiguity for Sentiment Analysis
c. 8.3 Lexicons for Sentiment Analysis
i. 8.3.1 Valence, Arousal, and Dominance
ii. 8.3.2 Wheel of Emotions
iii. 8.3.3 Manual Creation of Lexicons
iv. 8.3.4 Automatic Creation of Lexicons
d. 8.4 Rule-Based Sentiment Analysis
e. 8.5 Statistical Sentiment Analysis
i. 8.5.1 Classification Algorithms
ii. 8.5.2 Naïve Bayes
f. 8.6 Neural Approaches to Sentiment Analysis
g. 8.7 Sentiment Analysis in Different Languages

17. Chapter 9 Question Answering
a. 9.1 Problem Formulation
b. 9.2 Ambiguity in Question Answering
c. 9.3 Dataset Creation
d. 9.4 Rule-Based Q&A
e. 9.5 Second Generation
f. 9.6 Third Generation
i. 9.6.1 RNN-Based Model
ii. 9.6.2 BERT-Based Models
iii. 9.6.3 Code Examples

18. Chapter 10 Conversational AI
a. 10.1 Problem Definition
b. 10.2 Ambiguity Resolution in Conversational AI
c. 10.3 Rule-Based Approaches to Conversational AI
i. 10.3.1 Artificial Linguistic Internet Computer Entity (ALICE)
ii. 10.3.2 Genial Understander System (GUS)
d. 10.4 Statistical Approaches
e. 10.5 Neural Approaches
i. 10.5.1 Retrieval-Based Agents
ii. 10.5.2 Generation-Based Agents

19. Chapter 11 Summarization
a. 11.1 Ambiguity in Text Summarization
b. 11.2 Problem Definitions
c. 11.3 Early Work
d. 11.4 Summarization Using Machine Learning
i. 11.4.1 Sentence-Based Summarization
ii. 11.4.2 Graph-Based Summarization
e. 11.5 Summarization Using Deep Learning
i. 11.5.1 Similarity Between Language Representations for Summarization
ii. 11.5.2 RNNs for Summarization
iii. 11.5.3 Pointer-Generator Networks
f. 11.6 Evaluation
i. 11.6.1 Recall-Oriented Understudy for Gisting Evaluation
ii. 11.6.2 Pyramid

20. Chapter 12 NLP of Incongruous Text
a. 12.1 Incongruity and Ambiguity
b. 12.2 Sarcasm Detection
i. 12.2.1 Creation of Datasets
ii. 12.2.2 Rule-Based Approaches
iii. 12.2.3 Statistical Approaches
iv. 12.2.4 Deep Learning-Based Approaches
c. 12.3 Metaphor Detection
i. 12.3.1 Rule-Based Approaches
ii. 12.3.2 Statistical Approaches
iii. 12.3.3 Deep Learning-Based Approaches
d. 12.4 Humour Detection
i. 12.4.1 Dataset Creation
ii. 12.4.2 Rule-Based Approaches
iii. 12.4.3 Statistical Approaches
iv. 12.4.4 Deep Learning-Based Approaches

21. Chapter 13 Large Language Models
a. 13.1 Background
b. 13.2 Ambiguity Resolution
c. 13.3 Generative LLMs
i. 13.3.1 Pre-Training LLMs
ii. 13.3.2 Fine-Tuning LLMs
iii. 13.3.3 Refining LLMs for Conversations
iv. 13.3.4 Enhancement of LLMs Using External Tools
d. 13.4 Usage of LLMs
i. 13.4.1 Risks of Using LLMs
ii. 13.4.2 Prompting
iii. 13.4.3 Applications in Education and Work Productivity

22. Chapter 14 Shared Tasks and Benchmarks
a. 14.1 Background
b. 14.2 Shared Tasks
i. 14.2.1 Motivation
ii. 14.2.2 Overview
iii. 14.2.3 Datasets
iv. 14.2.4 Process
v. 14.2.5 SemEval
vi. 14.2.6 WMT
c. 14.3 NLP Benchmarks
i. 14.3.1 Process
ii. 14.3.2 General Language Understanding Evaluation
iii. 14.3.3 iNLP Suite
iv. 14.3.4 BIG-Bench

23. Chapter 15 NLP Dissemination
a. 15.1 How Is NLP Work Disseminated?
i. 15.1.1 Papers
ii. 15.1.2 Key Bodies
b. 15.2 How Can One Learn about NLP Research?
i. 15.2.1 Forums from Publishing Entities
ii. 15.2.2 Supplementary Online Content
c. 15.3 How Is NLP Work Published?
i. 15.3.1 Publishing at an Event-Based Forum
ii. 15.3.2 Publishing in Journals

24. Index

25. EULA
