complextorch.nn.modules.softmax#
Classes#
| Class | Description |
| --- | --- |
| CVSoftMax | Split Complex-Valued Softmax Layer |
| MagSoftMax | Magnitude Softmax Layer |
| PhaseSoftMax | Phase-Preserving Complex-Valued Softmax Layer |
Module Contents#
- class complextorch.nn.modules.softmax.CVSoftMax(dim: int | None = None)[source]#
Bases: torch.nn.Module

Split Complex-Valued Softmax Layer#
Simple real/imaginary split softmax function. Applies SoftMax to the real and imaginary parts of the input tensor separately.
Note: this naive implementation can cause significant phase changes, and the relationship between the real and imaginary parts of the complex-valued signal is ignored in the two SoftMax computations.
Implements the following operation:
\[G(\mathbf{z}) = \texttt{SoftMax}(\mathbf{x}) + j \texttt{SoftMax}(\mathbf{y}),\]

where \(\mathbf{z} = \mathbf{x} + j\mathbf{y}\).
- forward(input: torch.Tensor) → torch.Tensor[source]#
Computes softmax over the real and imaginary parts of the input tensor separately.
- Parameters:
input (torch.Tensor) – input tensor
- Returns:
torch.Tensor – \(\texttt{SoftMax}(\mathbf{x}) + j \texttt{SoftMax}(\mathbf{y})\)
- softmax#
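As a minimal sketch of the split operation described above (a hypothetical re-implementation for illustration, not the library's source; the name `split_softmax` is an assumption), the same result can be obtained by applying `torch.softmax` to each component and recombining:

```python
import torch

def split_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Apply softmax to the real and imaginary parts independently,
    # then recombine into a complex tensor.
    return torch.complex(
        torch.softmax(z.real, dim=dim),
        torch.softmax(z.imag, dim=dim),
    )

z = torch.randn(2, 4, dtype=torch.complex64)
out = split_softmax(z, dim=-1)
# The real and imaginary parts each sum to 1 along `dim`.
print(out.real.sum(dim=-1), out.imag.sum(dim=-1))
```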
- class complextorch.nn.modules.softmax.MagSoftMax(dim: int | None = None)[source]#
Bases: torch.nn.Module

Magnitude Softmax Layer#
Ignores phase and applies SoftMax to the magnitude.
Implements the following operation:
\[G(\mathbf{z}) = \texttt{SoftMax}(|\mathbf{z}|)\]
- forward(input: torch.Tensor) → torch.Tensor[source]#
Ignores phase and applies SoftMax function to magnitude.
- Parameters:
input (torch.Tensor) – input tensor
- Returns:
torch.Tensor – \(\texttt{SoftMax}(|\mathbf{z}|)\)
- softmax#
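A similarly hedged sketch of the magnitude behavior, assuming the formula above (the name `mag_softmax` is an illustration): since the phase is discarded, the output is a real-valued tensor.

```python
import torch

def mag_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Softmax over the magnitude only; the phase is discarded,
    # so the result is real-valued.
    return torch.softmax(z.abs(), dim=dim)

z = torch.randn(2, 4, dtype=torch.complex64)
out = mag_softmax(z)
print(out.sum(dim=-1))  # sums to 1 along `dim`, as for a real softmax
```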
- class complextorch.nn.modules.softmax.PhaseSoftMax(dim: int | None = None)[source]#
Bases: torch.nn.Module

Phase-Preserving Complex-Valued Softmax Layer#
Retains phase and applies SoftMax function to magnitude.
Implements the following operation:
\[G(\mathbf{z}) = \texttt{SoftMax}(|\mathbf{z}|) \odot \mathbf{z} / |\mathbf{z}|\]
- forward(input: torch.Tensor) → torch.Tensor[source]#
Retains phase and applies SoftMax function to magnitude.
- Parameters:
input (torch.Tensor) – input tensor
- Returns:
torch.Tensor – \(\texttt{SoftMax}(|\mathbf{z}|) \odot \mathbf{z} / |\mathbf{z}|\)
- softmax#
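A sketch of the phase-preserving variant under the same assumptions: softmax over the magnitude, rescaled by the unit phasor \(\mathbf{z}/|\mathbf{z}|\). The `eps` guard against division by zero is this sketch's own choice, not necessarily how the library handles inputs with zero magnitude.

```python
import torch

def phase_softmax(z: torch.Tensor, dim: int = -1, eps: float = 1e-12) -> torch.Tensor:
    mag = z.abs()
    # Real-valued softmax weights over the magnitude.
    weights = torch.softmax(mag, dim=dim)
    # z / |z| is the unit phasor carrying the original phase;
    # eps guards against z == 0 (an assumption of this sketch).
    phasor = z / mag.clamp_min(eps)
    return weights * phasor

z = torch.randn(2, 4, dtype=torch.complex64)
out = phase_softmax(z)
# The phase of each element is retained; only the magnitude changes.
print(torch.allclose(out.angle(), z.angle(), atol=1e-5))
```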