Important Deep Learning Concepts Explained – Part 2
- Naveen
Converge
An algorithm that converges will eventually reach an optimal answer, even if very slowly. An algorithm that doesn’t converge may never reach an optimal answer.
Learning Rate
Rate at which optimizers change weights and biases. A high learning rate generally trains faster but risks not converging, whereas a lower rate trains more slowly but more reliably.
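A minimal sketch of this trade-off, minimizing f(x) = x² with gradient descent (the function, starting point, and learning-rate values are illustrative assumptions):

```python
def descend(lr, steps=50, x=5.0):
    """Run gradient descent on f(x) = x**2, whose gradient is 2*x."""
    for _ in range(steps):
        x = x - lr * 2 * x  # step opposite the gradient
    return x

small = descend(lr=0.01)  # converges, but slowly
good = descend(lr=0.1)    # converges quickly
big = descend(lr=1.1)     # too high: |x| grows every step, never converges
print(small, good, big)
```

With lr=1.1 each update overshoots the minimum by more than it corrects, so the iterate diverges; this is the "risks not converging" failure mode.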
Numerical instability
Issues that arise when values become too large or too small for floating-point representation, causing overflow, underflow, or loss of precision.
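A classic example is softmax: exponentiating large logits overflows, but subtracting the maximum first gives a mathematically identical, numerically stable result. A small sketch in plain Python:

```python
import math

def softmax_naive(xs):
    exps = [math.exp(x) for x in xs]  # math.exp(1000.0) raises OverflowError
    total = sum(exps)
    return [e / total for e in exps]

def softmax_stable(xs):
    m = max(xs)  # shifting by the max leaves the result unchanged
    exps = [math.exp(x - m) for x in xs]  # largest exponent is now 0
    total = sum(exps)
    return [e / total for e in exps]

# The naive version overflows on these logits; the stable one does not.
print(softmax_stable([1000.0, 999.0, 998.0]))
```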
Embeddings
Mapping from discrete objects, such as words, to vectors of real numbers. Useful because classifiers and neural networks work well on vectors of real numbers.
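At its simplest, an embedding is just a lookup table from each discrete object to a learned vector. A toy sketch (the vocabulary, dimension, and random initialization are illustrative; in practice the vectors are trained):

```python
import random

random.seed(0)
vocab = ["cat", "dog", "car"]
dim = 4  # embedding dimension (hypothetical)

# Each word maps to a vector of real numbers; training would adjust these.
embedding = {w: [random.uniform(-1, 1) for _ in range(dim)] for w in vocab}

print(embedding["cat"])  # a 4-dimensional real-valued vector
```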
Convolutional layer
Series of convolutional operations, each acting on a different slice of the input matrix.
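The core operation can be sketched in one dimension: the same kernel slides along the input, and each output element is the dot product of the kernel with one slice (the example signal and kernel are made up for illustration):

```python
def conv1d(signal, kernel):
    """Slide the kernel over the signal; each output element is the
    dot product of the kernel with one slice of the input."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

print(conv1d([1, 2, 3, 4, 5], [1, 0, -1]))  # → [-2, -2, -2]
```

The same weights are reused at every position, which is why convolutional layers need far fewer parameters than fully connected ones.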
Dropout
Method for regularization that randomly removes (drops out) a fraction of a layer’s units during each training step. (Stopping training early is a different regularization method, called early stopping.)
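A minimal sketch of inverted dropout, the common variant: each activation is zeroed with probability p, and the survivors are scaled by 1/(1−p) so the expected sum of the layer’s output is unchanged (values here are illustrative):

```python
import random

def dropout(activations, p=0.5):
    """Inverted dropout: zero each unit with probability p during training
    and scale survivors by 1/(1 - p) to preserve the expected activation."""
    return [0.0 if random.random() < p else a / (1 - p) for a in activations]

random.seed(1)
print(dropout([1.0, 2.0, 3.0, 4.0], p=0.5))  # some units zeroed, rest doubled
```

At inference time dropout is turned off and all units are used.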
Gradient descent
Technique to minimize loss by computing the gradients of the loss with respect to the model’s parameters, conditioned on the training data, and repeatedly stepping the parameters in the opposite direction.
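A complete worked sketch: fitting a one-parameter model y = w·x to toy data by descending the gradient of the mean squared error (the data, learning rate, and step count are illustrative assumptions):

```python
# Toy training data generated from y = 2*x, so the optimal w is 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # initial parameter
lr = 0.05  # learning rate

for _ in range(200):
    # For loss L = mean((w*x - y)**2), the gradient dL/dw is
    # mean(2 * (w*x - y) * x), computed over the training data.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step opposite the gradient

print(round(w, 3))  # converges close to 2.0
```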