Tag: ReLU activation function
Difference between Leaky ReLU and ReLU activation functions?
- Naveen
- 0
Leaky ReLU is a type of activation function that helps prevent the function from becoming saturated at 0. It has a small slope for negative inputs, whereas standard ReLU has a zero slope there. Leaky ReLU is a modification of the ReLU activation function. It has the same form as ReLU, but it will…
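A minimal NumPy sketch of the two functions described in the excerpt; the negative-side slope alpha=0.01 is a common default, not a value taken from the question:

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs instead of zero
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # [0.  0.  0.  0.5 2. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.     0.5    2.   ]
```

Because Leaky ReLU's output is nonzero for negative inputs, its gradient there is alpha rather than 0, which is what keeps the unit from "dying" during training.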
Read More

What are the ReLU and Sigmoid activation functions?
- Naveen
- 0
An activation function is a nonlinear function that takes in the weighted sum of a neuron's inputs and produces its output. Activation functions provide a simplified model of neuron firing behavior and supply the nonlinearity that lets deep neural networks learn complex patterns. There are many different activation functions that can be used, including the sigmoid (also called the logistic function), the hyperbolic tangent,…
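A short sketch of a single neuron applying each activation to a weighted sum; the weights, inputs, and bias below are arbitrary illustration values:

```python
import numpy as np

def sigmoid(x):
    # Sigmoid (logistic) function: squashes any input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU: zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

weights = np.array([0.4, -0.6])
inputs = np.array([1.0, 2.0])
bias = 0.1

z = np.dot(weights, inputs) + bias  # weighted sum: -0.7
print(sigmoid(z))  # ~0.3318
print(relu(z))     # 0.0
```

The same weighted sum z is passed through each nonlinearity, showing how the choice of activation changes the neuron's output.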
Read More