torch.nn.functional.mish

torch.nn.functional.mish(input, inplace=False)

Apply the Mish function, element-wise. Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Mish(x) = x * Tanh(Softplus(x))

Note: See Mish: A Self Regularized Non-Monotonic Neural Activation Function. See Mish for more details.

Return type: Tensor
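A minimal usage sketch (not from the original page; it assumes F.mish and F.softplus as available in recent PyTorch releases), checking the output against the definition above:

    >>> import torch
    >>> import torch.nn.functional as F
    >>> x = torch.randn(4)
    >>> out = F.mish(x)
    >>> # mish(x) should equal x * tanh(softplus(x)) per the formula above
    >>> torch.allclose(out, x * torch.tanh(F.softplus(x)))
    True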