Cover
Title Page
Copyright and Credits
R Deep Learning Essentials, Second Edition
Packt Upsell
Why subscribe?
PacktPub.com
Contributors
About the authors
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Getting Started with Deep Learning
What is deep learning?
A conceptual overview of neural networks
Neural networks as an extension of linear regression
Neural networks as a network of memory cells
Deep neural networks
Some common myths about deep learning
Setting up your R environment
Deep learning frameworks for R
MXNet
Keras
Do I need a GPU (and what is it anyway)?
Setting up reproducible results
Summary
Training a Prediction Model
Neural networks in R
Building neural network models
Generating predictions from a neural network
The problem of overfitting data – the consequences explained
Use case – building and applying a neural network
Deep Learning Fundamentals
Building neural networks from scratch in R
Neural network web application
Neural network code
Back to deep learning
The symbol, X, y, and ctx parameters
The num.round and begin.round parameters
The optimizer parameter
The initializer parameter
The eval.metric and eval.data parameters
The epoch.end.callback parameter
The array.batch.size parameter
Using regularization to overcome overfitting
L1 penalty
L1 penalty in action
L2 penalty
L2 penalty in action
Weight decay (L2 penalty in neural networks)
Ensembles and model-averaging
Use case – improving out-of-sample model performance using dropout
Training Deep Prediction Models
Getting started with deep feedforward neural networks
Activation functions
Introduction to the MXNet deep learning library
Deep learning layers
Building a deep learning model
Use case – using MXNet for classification and regression
Data download and exploration
Preparing the data for our models
The binary classification model
The regression model
Improving the binary classification model
The unreasonable effectiveness of data
Image Classification Using Convolutional Neural Networks
CNNs
Convolutional layers
Pooling layers
Dropout
Flatten layers, dense layers, and softmax
Image classification using the MXNet library
Base model (no convolutional layers)
LeNet
Classification using the Fashion-MNIST dataset
References/further reading
Tuning and Optimizing Models
Evaluation metrics and evaluating performance
Types of evaluation metric
Evaluating performance
Data preparation
Different data distributions
Data partition between training, test, and validation sets
Standardization
Data leakage
Data augmentation
Using data augmentation to increase the training data
Test time augmentation