Coursera Deep Learning Course 1 Week 4 notes: Deep neural networks
2017-10-16

Deep Neural Network
Deep L-layer neural network
The word ‘deep’ refers to the number of layers in the neural network (not counting the input layer).
Some notation:
- $L$ = number of layers.
- $n^{[l]}$ = number of units in layer $l$ (with $n^{[0]} = n_x$, the input size).
- $a^{[l]}$ = activations in layer $l$ ($a^{[l]} = g^{[l]}(z^{[l]})$; $a^{[0]} = x$ and $a^{[L]} = \hat{y}$).
Forward Propagation in a Deep Network
Vectorized over the whole training set (stack the per-example column vectors as columns to obtain the capital matrices):
- $Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}$, with $A^{[0]} = X$.
- $A^{[l]} = g^{[l]}(Z^{[l]})$.

We can’t avoid a for loop iterating over the layers $l = 1, \dots, L$.
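A minimal sketch of this loop in NumPy (helper names are my own, not the course's starter code); `parameters` is assumed to hold `W1..WL` and `b1..bL`, and `X` has shape $(n_x, m)$:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def forward_propagation(X, parameters):
    """Vectorized forward pass: ReLU for hidden layers, sigmoid for the output layer."""
    L = len(parameters) // 2          # number of layers (one (W, b) pair per layer)
    A = X                             # A[0] = X
    caches = []
    for l in range(1, L + 1):         # the unavoidable loop over layers
        A_prev = A
        W, b = parameters["W" + str(l)], parameters["b" + str(l)]
        Z = W @ A_prev + b            # Z[l] = W[l] A[l-1] + b[l]
        A = sigmoid(Z) if l == L else relu(Z)    # A[l] = g[l](Z[l])
        caches.append((A_prev, W, b, Z))         # cached for backprop
    return A, caches
```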
Getting your matrix dimensions right
No notes.
Why deep representations?
No notes.
Building blocks of deep neural networks
Forward and Backward Propagation
Forward propagation for layer l
Input $a^{[l-1]}$
Output $a^{[l]}$, cache ($z^{[l]}$)
- $z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$.
- $a^{[l]} = g^{[l]}(z^{[l]})$.
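A hedged sketch of this forward block for a single layer (function and variable names are mine, not the course assignment's API); inputs are NumPy arrays and `g` is the layer's activation:

```python
import numpy as np

def layer_forward(a_prev, W, b, g):
    """One forward step: z = W a_prev + b, a = g(z); cache what backprop will need."""
    z = W @ a_prev + b
    a = g(z)
    return a, (a_prev, W, b, z)

# Example: a 3-unit ReLU layer fed a 2-dimensional input.
a0 = np.random.randn(2, 1)
W1, b1 = np.random.randn(3, 2) * 0.01, np.zeros((3, 1))
a1, cache1 = layer_forward(a0, W1, b1, lambda z: np.maximum(0, z))
```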
Backward propagation for layer l
Input $da^{[l]}$
Output $da^{[l-1]}$, $dW^{[l]}$, $db^{[l]}$
- $dz^{[l]} = da^{[l]} * g^{[l]\prime}(z^{[l]})$.
- $dW^{[l]} = dz^{[l]} \, a^{[l-1]T}$.
- $db^{[l]} = dz^{[l]}$.
- $da^{[l-1]} = W^{[l]T} dz^{[l]}$.
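The same block in code, written in its vectorized form over $m$ examples (hence the $1/m$ averaging and the sum over the example axis); the cache layout follows the forward sketches above, `g_prime` is the derivative of the layer's activation, and this is my own sketch rather than the course's assignment code:

```python
import numpy as np

def layer_backward(dA, cache, g_prime):
    """Given dA[l], return (dA[l-1], dW[l], db[l]) for one layer."""
    A_prev, W, b, Z = cache
    m = A_prev.shape[1]
    dZ = dA * g_prime(Z)                          # dZ[l] = dA[l] * g[l]'(Z[l])
    dW = (dZ @ A_prev.T) / m                      # dW[l] = (1/m) dZ[l] A[l-1]^T
    db = np.sum(dZ, axis=1, keepdims=True) / m    # db[l] = (1/m) sum of dZ[l] columns
    dA_prev = W.T @ dZ                            # dA[l-1] = W[l]^T dZ[l]
    return dA_prev, dW, db
```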
Parameters vs Hyperparameters
Parameters: $W^{[l]}$, $b^{[l]}$.
Hyperparameters: settings that control the ultimate parameters $W^{[l]}$ and $b^{[l]}$:
- Learning rate $\alpha$.
- Number of iterations.
- Number of hidden layers $L$.
- Number of hidden units $n^{[l]}$.
- Choice of activation functions.
- Many more…
Therefore, applying deep learning is a very empirical process.
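As a rough illustration of the distinction, the hyperparameters listed above might be collected in a plain config dict like the one below (names and values are illustrative, not from the course), while $W^{[l]}$ and $b^{[l]}$ are what gradient descent actually learns:

```python
# Illustrative hyperparameter settings, chosen and tuned empirically by the practitioner.
hyperparameters = {
    "learning_rate": 0.0075,             # alpha
    "num_iterations": 2500,
    "layer_dims": [12288, 20, 7, 5, 1],  # n[0] = n_x, hidden units per layer, n[L] = 1
    "hidden_activation": "relu",
    "output_activation": "sigmoid",
}
```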