ELU Function

Creator: Seonglae Cho
Created: 2023 Jun 7 8:17
Editor: Seonglae Cho
Edited: 2024 Mar 30 17:53
Exponential Linear Unit

Papers with Code - ELU Explained
The Exponential Linear Unit (ELU) is an activation function for neural networks. In contrast to ReLUs, ELUs have negative values, which allows them to push mean unit activations closer to zero like batch normalization but with lower computational complexity. Mean shifts toward zero speed up learning by bringing the normal gradient closer to the unit natural gradient because of a reduced bias shift effect. While LReLUs and PReLUs have negative values too, they do not ensure a noise-robust deactivation state. ELUs saturate to a negative value for smaller inputs and thereby decrease the forward propagated variation and information. The exponential linear unit (ELU) with $\alpha > 0$ is:

$$f(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha\left(\exp(x) - 1\right) & \text{if } x \leq 0 \end{cases}$$

https://paperswithcode.com/method/elu
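Below is a minimal NumPy sketch of this piecewise definition; the function names and the default α = 1.0 are illustrative assumptions, not taken from the source.

```python
import numpy as np

def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0."""
    # expm1 on the clamped input keeps the negative branch numerically stable
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0.0)))

def elu_grad(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """Derivative: 1 for x > 0, alpha * exp(x) (= elu(x) + alpha) for x <= 0."""
    return np.where(x > 0, 1.0, alpha * np.exp(np.minimum(x, 0.0)))

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(elu(x))       # large negative inputs saturate toward -alpha
print(elu_grad(x))
```

In practice, deep learning frameworks provide this activation out of the box (e.g. torch.nn.ELU or tf.nn.elu).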
 
 

Copyright Seonglae Cho