complextorch#

A lightweight complex-valued neural network package built on PyTorch.

complextorch provides drop-in complextorch.nn.* modules whose names mirror torch.nn.* (torch.nn.Conv1d, torch.nn.Linear, …), so a real-valued PyTorch model can be converted to a complex-valued one by changing an import. The library emphasises 1-D signal-processing, radar, and comms workloads, but most layers are also provided in 2-D and 3-D variants.
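A minimal sketch of that import swap, with the complex-valued half commented out since it assumes complextorch is installed:

```python
import torch
import torch.nn as nn

# Real-valued baseline, plain PyTorch:
model = nn.Sequential(nn.Conv1d(1, 8, kernel_size=3), nn.ReLU())
out = model(torch.randn(2, 1, 16))  # (batch, channels, length) -> (2, 8, 14)

# Complex-valued port: swap the import (sketch only; the activation
# module also changes for complex inputs -- see the Concepts page):
# import complextorch.nn as cvnn
# model = nn.Sequential(cvnn.Conv1d(1, 8, kernel_size=3), ...)
```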

  • Getting started: a runnable notebook covering the README example, activation comparisons, and an end-to-end Conv1d demo.

  • Concepts (Complex-valued activations): Type-A / Type-B / fully-complex activations, and when to reach for the native cfloat vs. Gauss-trick (real/imag split) modules.

  • API reference: auto-generated reference for every public class and function in complextorch, with cross-links into PyTorch’s docs.

Install#

pip install complextorch

PyTorch is not installed automatically — install the wheel matching your CUDA/CPU target from https://pytorch.org/get-started/locally/ first. See Installation for source-install and development setup.

Why complextorch?#

  • Native cfloat wrappers. complextorch.nn.Conv1d, complextorch.nn.Linear, and friends are thin wrappers around torch.nn modules with dtype=torch.cfloat. PyTorch ≥ 2.1 has fast complex kernels — these are the recommended path.
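The native path leans directly on PyTorch's own complex support; a minimal sketch using plain torch.nn with dtype=torch.cfloat, the kind of module the wrappers described above build on:

```python
import torch
import torch.nn as nn

# torch.nn layers accept dtype=torch.cfloat as a factory kwarg;
# per the description above, the complextorch wrappers set this for you.
layer = nn.Linear(4, 3, dtype=torch.cfloat)
z = torch.randn(2, 4, dtype=torch.cfloat)
out = layer(z)  # complex weights, complex input, complex output
```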

  • Reference implementations on hand. The complextorch.nn.gauss subpackage keeps the original real/imag-split Gauss-trick implementations (complextorch.nn.gauss.Conv1d, etc.) around as reference math.

  • Three composition primitives. Activations, pooling, losses, dropout, and softmax are built on apply_complex, apply_complex_split, and apply_complex_polar in complextorch.nn.functional — see Activations for the math.
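The arithmetic behind the first primitive is simple; a hedged sketch of the apply_complex idea (names and signatures here are illustrative, not necessarily the package's exact API), checked against an equivalent single complex-weight matmul:

```python
import torch
import torch.nn as nn

def apply_complex(fr, fi, z):
    # Gauss-style split of a complex linear map:
    # (Wr + iWi)(xr + ixi) = (Wr xr - Wi xi) + i(Wr xi + Wi xr)
    return (fr(z.real) - fi(z.imag)) + 1j * (fr(z.imag) + fi(z.real))

fr = nn.Linear(4, 3, bias=False)  # real part of the weight
fi = nn.Linear(4, 3, bias=False)  # imaginary part of the weight
z = torch.randn(2, 4, dtype=torch.cfloat)

out = apply_complex(fr, fi, z)
# Same result as one matmul with the complex weight Wr + iWi
# (bias omitted above so the two forms match exactly):
ref = torch.nn.functional.linear(z, fr.weight + 1j * fi.weight)
```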

  • Beyond layers. Includes complextorch.signal (a torch port of Welch’s PSD), complextorch.transforms (torchcvnn-style transforms), complextorch.nn.init (variance-correct complex initializers), complextorch.nn.relevance (Variational Dropout & ARD), and complextorch.nn.masked (fixed-mask sparsified layers).
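As a flavour of the signal side, Welch's method averages periodograms of overlapping, windowed segments; a minimal illustrative sketch in plain torch, not complextorch.signal's actual API:

```python
import torch

def welch_psd(x, nperseg=64, noverlap=32):
    """Minimal Welch PSD estimate (illustrative sketch only)."""
    window = torch.hann_window(nperseg)
    step = nperseg - noverlap
    # Split into overlapping segments, window each, average the periodograms.
    segments = x.unfold(-1, nperseg, step) * window
    spectra = torch.fft.fft(segments, dim=-1)
    return spectra.abs().pow(2).mean(dim=-2) / window.pow(2).sum()

# A complex tone at 8 cycles per segment should peak at FFT bin 8.
n = torch.arange(256)
tone = torch.exp(2j * torch.pi * 8 * n / 64)
psd = welch_psd(tone)
```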

Citation#

If complextorch helps your research, please cite the package and consider citing the author’s PhD thesis and related papers — see About for the full list.