# Complex-valued activations
Complex-valued activation functions must account for the two degrees of freedom inherent in complex-valued data, typically represented as real/imaginary parts or magnitude/phase. Two generalised classes of activation operate on those respective representations, defined here as Type-A and Type-B functions.
## Type-A — split on real / imaginary
Type-A activations consist of two real-valued functions, \(G_\mathbb{R}(\cdot)\) and \(G_\mathbb{I}(\cdot)\), applied independently to the real and imaginary parts of the input tensor:

\[G(\mathbf{z}) = G_\mathbb{R}(\mathbf{x}) + j\,G_\mathbb{I}(\mathbf{y}),\]

where \(\mathbf{z} = \mathbf{x} + j\mathbf{y}\).
Under the hood, Type-A activations call complextorch.nn.functional.apply_complex_split(). Examples in the package include CVSplitReLU, CVSplitTanh, CVSplitSigmoid, CELU, CCELU, and CGELU; see the activation reference for the full list.
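To make the pattern concrete, here is a minimal sketch of a Type-A activation in plain PyTorch. It is illustrative only: `type_a_activation` is a hypothetical helper, not the package's apply_complex_split().

```python
import torch

def type_a_activation(z: torch.Tensor, real_fun, imag_fun) -> torch.Tensor:
    # Hypothetical stand-in for apply_complex_split(): act on the real and
    # imaginary parts independently, then recombine into a complex tensor.
    return torch.complex(real_fun(z.real), imag_fun(z.imag))

# Split-tanh on a random complex vector -- conceptually what CVSplitTanh does.
z = torch.randn(4, dtype=torch.cfloat)
out = type_a_activation(z, torch.tanh, torch.tanh)
```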
## Type-B — split on magnitude / phase
Type-B activations consist of two real-valued functions, \(G_{||}(\cdot)\) and \(G_\angle(\cdot)\), applied to the magnitude (modulus) and phase (argument) of the input tensor:

\[G(\mathbf{z}) = G_{||}(|\mathbf{z}|)\,\exp\big(j\,G_\angle(\angle\mathbf{z})\big).\]
Type-B activations call complextorch.nn.functional.apply_complex_polar(). Passing phase_fun=None is an optimisation that skips the polar round-trip when the activation modifies only the magnitude. Examples: modReLU, AdaptiveModReLU, CVPolarTanh.
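As a sketch of the idea (not the package's implementation), the hypothetical `type_b_activation` helper below shows the polar split and the magnitude-only shortcut described above; the modReLU form in the usage lines follows the commonly cited definition from Arjovsky et al. (2016).

```python
import torch

def type_b_activation(z: torch.Tensor, mag_fun, phase_fun=None) -> torch.Tensor:
    # Hypothetical stand-in for apply_complex_polar().
    mag = z.abs()
    if phase_fun is None:
        # Phase untouched: rescale z directly, mirroring the round-trip skip.
        return z * (mag_fun(mag) / mag.clamp_min(1e-12))
    return torch.polar(mag_fun(mag), phase_fun(z.angle()))

# modReLU-style use (assumed form, after Arjovsky et al., 2016):
# relu(|z| + b) applied to the magnitude, phase preserved.
b = torch.tensor(-0.1)
z = torch.randn(4, dtype=torch.cfloat)
out = type_b_activation(z, lambda m: torch.relu(m + b))
```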
## Fully complex
Fully-complex activations fit neither the Type-A nor the Type-B designation — they operate on the complex tensor directly. Use them when an activation has a natural complex form (e.g., a learnable phase rotation).
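For illustration, a minimal fully-complex activation could apply a single learnable phase rotation, \(\mathbf{z} \mapsto \mathbf{z}\,e^{j\theta}\). The module below is a hypothetical sketch, not a class from the package:

```python
import torch
import torch.nn as nn

class PhaseRotation(nn.Module):
    # Illustrative fully-complex activation: rotate every element by a
    # single learnable phase, z -> z * exp(j * theta).
    def __init__(self):
        super().__init__()
        self.theta = nn.Parameter(torch.zeros(1))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Build exp(j*theta) from real cos/sin so gradients flow through theta.
        rotation = torch.complex(torch.cos(self.theta), torch.sin(self.theta))
        return z * rotation
```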
## ReLU variants
A separate family — zReLU, CReLU, zAbsReLU, zLeakyReLU — generalises
the rectified linear unit to the complex plane. These are documented alongside
their classes in the API reference.
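Two of these are compact enough to sketch from their commonly cited definitions (Guberman, 2016 for zReLU; Trabelsi et al., 2018 for CReLU); the functions below are illustrative, not the package's source:

```python
import torch

def z_relu(z: torch.Tensor) -> torch.Tensor:
    # zReLU (after Guberman, 2016): pass z only where both the real and
    # imaginary parts are positive, i.e. the first quadrant; zero elsewhere.
    keep = (z.real > 0) & (z.imag > 0)
    return torch.where(keep, z, torch.zeros_like(z))

def c_relu(z: torch.Tensor) -> torch.Tensor:
    # CReLU (after Trabelsi et al., 2018): ReLU on the real and imaginary
    # parts independently -- structurally a Type-A activation.
    return torch.complex(torch.relu(z.real), torch.relu(z.imag))
```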
## When to use which
| Need | Reach for |
|---|---|
| Drop-in replacement for a real-valued activation | Type-A split function (e.g., CVSplitReLU, CVSplitTanh) |
| Preserve phase, modulate magnitude only | Type-B with phase_fun=None (e.g., modReLU) |
| Phase-aware operation | Type-B with both \(G_{||}\) and \(G_\angle\) |
| Learnable scalar phase shift | Fully-complex activation (e.g., a learnable phase rotation) |