PyTorch Learning (7): Non-linear Activations (Non-linear Layers) in PyTorch
The Non-linear Activations (non-linear layers) in PyTorch include the following activation functions (a short usage sketch follows the list):
- ReLU
- ReLU6
- ELU
- SELU
- PReLU
- LeakyReLU
- Threshold
- Hardtanh
- Sigmoid
- Tanh
- LogSigmoid
- Softplus
- Softshrink
- Softsign
- Tanhshrink
- Softmin
- Softmax
- Softmax2d
- LogSoftmax
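Each of these is exposed as an `nn.Module` that can be called on a tensor or dropped into an `nn.Sequential` model. As a minimal sketch (the input shapes and parameter values below are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn

x = torch.randn(2, 4)            # a small random batch: 2 samples, 4 features

relu = nn.ReLU()                 # max(0, x), element-wise
leaky = nn.LeakyReLU(0.01)       # keeps a small slope (0.01) for negative inputs
sigmoid = nn.Sigmoid()           # squashes values into (0, 1)
softmax = nn.Softmax(dim=1)      # normalizes each row into a probability distribution

print(relu(x))
print(leaky(x))
print(sigmoid(x))
print(softmax(x))
```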
The Python front-end code for these activation functions lives in activation.py (torch/nn/modules/activation.py). Let's take a look at the Python source of ReLU:
    class ReLU(Threshold):
        r"""Applies the rectified linear unit function element-wise
        :math:`\text{ReLU}(x) = \max(0, x)`
        .. image:: scripts/activation_images/ReLU.png
        """
        def __init__(self, inplace=False):
            # ReLU is just Threshold with threshold=0 and replacement value 0
            super(ReLU, self).__init__(0, 0, inplace)
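As the source shows, `ReLU` subclasses `Threshold` and simply passes `threshold=0` and `value=0`, so any input below 0 is replaced with 0. In practice you can use either the module form or the functional form; a minimal sketch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.5, 0.0, 2.0])

# Module form: usable inside nn.Sequential; inplace=True would overwrite x in place
m = nn.ReLU()
print(m(x))        # tensor([0., 0., 2.])

# Functional form: the same computation without constructing a module object
print(F.relu(x))   # tensor([0., 0., 2.])
```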