Activation Function | Description
--- | ---
nn.ELU | Applies the Exponential Linear Unit (ELU) function, element-wise. |
nn.Hardshrink | Applies the Hard Shrinkage (Hardshrink) function element-wise. |
nn.Hardsigmoid | Applies the Hardsigmoid function element-wise. |
nn.Hardtanh | Applies the HardTanh function element-wise. |
nn.Hardswish | Applies the Hardswish function, element-wise. |
nn.LeakyReLU | Applies the LeakyReLU function element-wise. |
nn.LogSigmoid | Applies the LogSigmoid function element-wise.
nn.MultiheadAttention | Allows the model to jointly attend to information from different representation subspaces. |
nn.PReLU | Applies the PReLU function element-wise.
nn.ReLU | Applies the rectified linear unit function element-wise. |
nn.ReLU6 | Applies the ReLU6 function element-wise. |
nn.RReLU | Applies the randomized leaky rectified linear unit function, element-wise. |
nn.SELU | Applies the SELU function element-wise. |
nn.CELU | Applies the CELU function element-wise. |
nn.GELU | Applies the Gaussian Error Linear Units (GELU) function element-wise.
nn.Sigmoid | Applies the Sigmoid function element-wise. |
nn.SiLU | Applies the Sigmoid Linear Unit (SiLU) function, element-wise. |
nn.Mish | Applies the Mish function, element-wise. |
nn.Softplus | Applies the Softplus function element-wise. |
nn.Softshrink | Applies the soft shrinkage function element-wise. |
nn.Softsign | Applies the Softsign function element-wise.
nn.Tanh | Applies the Hyperbolic Tangent (Tanh) function element-wise. |
nn.Tanhshrink | Applies the Tanhshrink function element-wise.
nn.Threshold | Thresholds each element of the input Tensor. |
nn.GLU | Applies the gated linear unit function. |
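
All of the element-wise modules above share the same call pattern: instantiate once, then call on a tensor of any shape. Below is a minimal usage sketch; the module names and parameters are standard `torch.nn` API, while the sample values and the toy `nn.Sequential` model are illustrative only:

```python
import torch
import torch.nn as nn

# Sample input spanning negative and positive values.
x = torch.tensor([-2.0, -0.5, 0.5, 2.0])

# Each module is instantiated once, then applied element-wise.
activations = {
    "ReLU": nn.ReLU(),
    "LeakyReLU": nn.LeakyReLU(negative_slope=0.01),
    "ELU": nn.ELU(alpha=1.0),
    "GELU": nn.GELU(),
    "Sigmoid": nn.Sigmoid(),
    "Tanh": nn.Tanh(),
}

for name, act in activations.items():
    print(f"{name:9s} -> {act(x)}")

# In practice an activation sits between linear layers, e.g.:
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
y = model(torch.randn(4, 8))  # output shape: (4, 1)
```

Note that `nn.GLU` and `nn.MultiheadAttention` do not follow this element-wise pattern: GLU halves the size of its input along one dimension, and MultiheadAttention's forward pass takes separate query, key, and value tensors.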