
PReLU

class torch.nn.PReLU(num_parameters=1, init=0.25) [source]

Applies the element-wise function:

\text{PReLU}(x) = \max(0, x) + a \cdot \min(0, x)

or

\text{PReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ ax, & \text{otherwise} \end{cases}

Here a is a learnable parameter. When called without arguments, nn.PReLU() uses a single parameter a across all input channels. If called with nn.PReLU(nChannels), a separate a is used for each input channel.
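The difference between the shared and per-channel forms shows up in the shape of the learnable parameter. A minimal sketch, assuming a hypothetical input with 3 channels:

```python
import torch
import torch.nn as nn

# A single learnable a shared across all channels
m_shared = nn.PReLU()
# One learnable a per channel (hypothetical channel count of 3)
m_per_channel = nn.PReLU(num_parameters=3)

x = torch.randn(4, 3, 8, 8)  # (N, C, H, W) with C = 3
y1 = m_shared(x)
y2 = m_per_channel(x)

print(m_shared.weight.shape)       # torch.Size([1])
print(m_per_channel.weight.shape)  # torch.Size([3])
```

In both cases the output has the same shape as the input; only the number of learned slopes for the negative part differs.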

Note

For good performance, weight decay should not be used when learning a.
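One common way to follow this advice is to place the PReLU parameters in a separate optimizer parameter group with weight decay disabled. A minimal sketch (the model and hyperparameters here are illustrative, not part of the original page):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 10), nn.PReLU())

# Collect the learnable a parameters of every PReLU module
prelu_params = [p for mod in model.modules()
                if isinstance(mod, nn.PReLU)
                for p in mod.parameters()]
prelu_ids = {id(p) for p in prelu_params}
other_params = [p for p in model.parameters() if id(p) not in prelu_ids]

# Weight decay applies only to the non-PReLU parameters
optimizer = torch.optim.SGD([
    {"params": other_params, "weight_decay": 1e-4},
    {"params": prelu_params, "weight_decay": 0.0},
], lr=0.1)
```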

Note

The channel dim is the second dim of the input. When the input has fewer than 2 dims, there is no channel dim and the number of channels is 1.
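Concretely, a per-channel PReLU accepts any input whose second dim matches num_parameters, while inputs with fewer than 2 dims only work with the single-parameter form. A short sketch:

```python
import torch
import torch.nn as nn

m = nn.PReLU(num_parameters=3)

# The channel dim is dim 1, so any input whose second dim is 3 works,
# regardless of how many trailing dimensions follow.
print(m(torch.randn(2, 3)).shape)        # (N, C)
print(m(torch.randn(2, 3, 8)).shape)     # (N, C, L)
print(m(torch.randn(2, 3, 8, 8)).shape)  # (N, C, H, W)

# With dims < 2 there is no channel dim, so a single shared a applies
m1 = nn.PReLU()
out = m1(torch.tensor([-1.0, 2.0]))  # a = 0.25 by default: [-0.25, 2.0]
```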

Parameters
  • num_parameters (int) – number of a to learn. Although it takes an int as input, only two values are legitimate: 1, or the number of channels of the input. Default: 1
  • init (float) – the initial value of a. Default: 0.25
Shape:
  • Input: (N, *) where * means any number of additional dimensions
  • Output: (N, *), same shape as the input
Variables

~PReLU.weight (Tensor) – the learnable weights of shape (num_parameters).
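The weight tensor holds the a from the formula above, so the module's output can be checked directly against max(0, x) + a * min(0, x). A small sketch with the default init of 0.25:

```python
import torch
import torch.nn as nn

m = nn.PReLU(init=0.25)
x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

# PReLU(x) = max(0, x) + a * min(0, x), where a is m.weight
a = m.weight  # shape (1,), initialized to 0.25
expected = torch.clamp(x, min=0) + a * torch.clamp(x, max=0)

out = m(x)  # -0.5, -0.125, 0.0, 1.5
```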

[Figure: plot of the PReLU activation function]

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.PReLU()
>>> input = torch.randn(2)
>>> output = m(input)

© 2019 Torch Contributors
Licensed under the 3-clause BSD License.
https://pytorch.org/docs/1.8.0/generated/torch.nn.PReLU.html