Updated: 2021-06-25 21:28:44
Cover
Copyright Information
PacktPub.com
Contributors
About the reviewers
Packt is searching for authors like you
Preface
What this book covers
To get the most out of this book
Get in touch
Chapter 1. Introduction to Natural Language Processing
What is Natural Language Processing?
Tasks of Natural Language Processing
The traditional approach to Natural Language Processing
The deep learning approach to Natural Language Processing
The roadmap – beyond this chapter
Introduction to the technical tools
Summary
Chapter 2. Understanding TensorFlow
What is TensorFlow?
Inputs, variables, outputs, and operations
Reusing variables with scoping
Implementing our first neural network
Chapter 3. Word2vec – Learning Word Embeddings
What is a word representation or meaning?
Classical approaches to learning word representation
Word2vec – a neural network-based approach to learning word representation
The skip-gram algorithm
The Continuous Bag-of-Words algorithm
Chapter 4. Advanced Word2vec
The original skip-gram algorithm
Comparing skip-gram with CBOW
Extensions to the word embeddings algorithms
More recent algorithms extending skip-gram and CBOW
GloVe – Global Vectors representation
Document classification with Word2vec
Chapter 5. Sentence Classification with Convolutional Neural Networks
Introducing Convolutional Neural Networks
Understanding Convolutional Neural Networks
Exercise – image classification on MNIST with CNN
Using CNNs for sentence classification
Chapter 6. Recurrent Neural Networks
Understanding Recurrent Neural Networks
Backpropagation Through Time
Applications of RNNs
Generating text with RNNs
Evaluating text results output from the RNN
Perplexity – measuring the quality of the text result
Recurrent Neural Networks with Context Features – RNNs with longer memory
Chapter 7. Long Short-Term Memory Networks
Understanding Long Short-Term Memory Networks
How LSTMs solve the vanishing gradient problem
Other variants of LSTMs
Chapter 8. Applications of LSTM – Generating Text
Our data
Implementing an LSTM
Comparing LSTMs to LSTMs with peephole connections and GRUs
Improving LSTMs – beam search
Improving LSTMs – generating text with words instead of n-grams
Using the TensorFlow RNN API
Chapter 9. Applications of LSTM – Image Caption Generation
Getting to know the data
The machine learning pipeline for image caption generation
Extracting image features with CNNs
Implementation – loading weights and inference with VGG-
Learning word embeddings
Preparing captions for feeding into LSTMs
Generating data for LSTMs
Defining the LSTM
Evaluating the results quantitatively
Captions generated for test images
Using TensorFlow RNN API with pretrained GloVe word vectors
Chapter 10. Sequence-to-Sequence Learning – Neural Machine Translation
Machine translation
A brief historical tour of machine translation
Understanding Neural Machine Translation
Preparing data for the NMT system
Training the NMT
Inference with NMT
The BLEU score – evaluating the machine translation systems
Implementing an NMT from scratch – a German to English translator
Training an NMT jointly with word embeddings
Improving NMTs
Attention
Other applications of Seq2Seq models – chatbots
Chapter 11. Current Trends and the Future of Natural Language Processing
Current trends in NLP
Penetration into other research fields
Towards Artificial General Intelligence
NLP for social media
New tasks emerging