complextorch.nn.modules.softmax#

Classes#

CVSoftMax

Split Complex-Valued Softmax Layer

MagSoftMax

Magnitude Softmax Layer

PhaseSoftMax

Phase-Preserving Complex-Valued Softmax Layer

Module Contents#

class complextorch.nn.modules.softmax.CVSoftMax(dim: int | None = None)[source]#

Bases: torch.nn.Module

Split Complex-Valued Softmax Layer#

Simple real/imaginary split softmax. Applies SoftMax independently to the real and imaginary parts of the input tensor.

Note: this naive implementation can cause significant phase changes, and the relationship between the real and imaginary parts of the complex-valued signal is ignored in the two separate SoftMax computations.

Implements the following operation:

\[G(\mathbf{z}) = \texttt{SoftMax}(\mathbf{x}) + j \texttt{SoftMax}(\mathbf{y}),\]

where \(\mathbf{z} = \mathbf{x} + j\mathbf{y}\).

Initialize internal Module state, shared by both nn.Module and ScriptModule.

forward(input: torch.Tensor) torch.Tensor[source]#

Computes SoftMax over the real and imaginary parts of the input tensor separately.

Parameters:

input (torch.Tensor) – input tensor

Returns:

torch.Tensor – \(\texttt{SoftMax}(\mathbf{x}) + j \texttt{SoftMax}(\mathbf{y})\)

softmax#
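The split operation above can be sketched in a few lines of plain PyTorch. This is a minimal stand-in, assuming the layer simply applies torch.softmax to the real and imaginary parts along the given dim; complextorch's own CVSoftMax may differ in implementation details.

```python
import torch

def split_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Sketch of a split complex-valued softmax: SoftMax(x) + j SoftMax(y),
    # applied independently to the real and imaginary parts.
    return torch.complex(torch.softmax(z.real, dim=dim),
                         torch.softmax(z.imag, dim=dim))

z = torch.randn(2, 4, dtype=torch.cfloat)
out = split_softmax(z, dim=-1)
# Each part behaves like an ordinary real softmax: the real and imaginary
# components both sum to 1 along dim, but the phase of z is not preserved.
```

Note that the output phase generally differs from the input phase, which is the limitation the note above describes.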
class complextorch.nn.modules.softmax.MagSoftMax(dim: int | None = None)[source]#

Bases: torch.nn.Module

Magnitude Softmax Layer#

Ignores phase and applies SoftMax to the magnitude.

Implements the following operation:

\[G(\mathbf{z}) = \texttt{SoftMax}(|\mathbf{z}|)\]

Initialize internal Module state, shared by both nn.Module and ScriptModule.

forward(input: torch.Tensor) torch.Tensor[source]#

Ignores phase and applies the SoftMax function to the magnitude.

Parameters:

input (torch.Tensor) – input tensor

Returns:

torch.Tensor – \(\texttt{SoftMax}(|\mathbf{z}|)\)

softmax#
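The magnitude variant can be sketched similarly. A minimal stand-in, assuming the layer applies torch.softmax to the elementwise magnitude; note the result is a real-valued tensor, since the phase is discarded.

```python
import torch

def mag_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Sketch of a magnitude softmax: SoftMax(|z|). Phase is discarded,
    # so the output is real-valued even for complex input.
    return torch.softmax(z.abs(), dim=dim)

z = torch.randn(2, 4, dtype=torch.cfloat)
out = mag_softmax(z, dim=-1)
# out is real, strictly positive, and sums to 1 along dim.
```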
class complextorch.nn.modules.softmax.PhaseSoftMax(dim: int | None = None)[source]#

Bases: torch.nn.Module

Phase-Preserving Complex-Valued Softmax Layer#

Retains phase and applies SoftMax function to magnitude.

Implements the following operation:

\[G(\mathbf{z}) = \texttt{SoftMax}(|\mathbf{z}|) \odot \mathbf{z} / |\mathbf{z}|\]

Initialize internal Module state, shared by both nn.Module and ScriptModule.

forward(input: torch.Tensor) torch.Tensor[source]#

Retains phase and applies SoftMax function to magnitude.

Parameters:

input (torch.Tensor) – input tensor

Returns:

torch.Tensor – \(\texttt{SoftMax}(|\mathbf{z}|) \odot \mathbf{z} / |\mathbf{z}|\)

softmax#
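The phase-preserving variant combines the two ideas: SoftMax normalizes the magnitudes, and multiplying by the unit-phase term z/|z| restores the input phase. A minimal sketch, assuming nonzero-magnitude inputs (as written it would divide by zero for exactly-zero entries; complextorch's own PhaseSoftMax may guard against that differently):

```python
import torch

def phase_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Sketch of a phase-preserving softmax: SoftMax(|z|) * z / |z|.
    # The softmax term is a positive real scale, so angle(output) == angle(z),
    # and the output magnitudes sum to 1 along dim.
    mag = z.abs()
    return torch.softmax(mag, dim=dim) * z / mag

z = torch.randn(2, 4, dtype=torch.cfloat)
out = phase_softmax(z, dim=-1)
```

Because the scaling factor is real and positive, each output element keeps the phase of the corresponding input element exactly.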