Domain-specific for now, but could it be generalized?
The core of Transformers is dot-product attention, but in complex space the dot product becomes a Hermitian inner product. Its output is complex-valued, carrying a phase term, so softmax over the scores is undefined and the probabilistic interpretation of attention weights breaks down.
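A minimal NumPy sketch of the issue (my own illustration; the shapes and the real-part workaround are assumptions, not the paper's method): the Hermitian inner product Q Kᴴ yields complex scores, so each score must be mapped to a real number before softmax, which discards the phase.

```python
import numpy as np

# Illustration only, not the paper's implementation.
rng = np.random.default_rng(0)
seq, d = 4, 8
Q = rng.standard_normal((seq, d)) + 1j * rng.standard_normal((seq, d))
K = rng.standard_normal((seq, d)) + 1j * rng.standard_normal((seq, d))

# Hermitian inner product: scores = Q K^H / sqrt(d). Each entry is
# complex, with a magnitude and a phase term e^{i*theta}.
scores = Q @ K.conj().T / np.sqrt(d)
print(scores.dtype)          # complex128: softmax over these is undefined

# One common workaround (an assumption, not the paper's choice): map each
# complex score to a real number, e.g. its real part or its magnitude.
real_scores = scores.real    # alternative: np.abs(scores)
weights = np.exp(real_scores - real_scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
print(weights.sum(axis=-1))  # rows sum to 1: probabilistic reading restored
```

Whichever real-valued map is chosen, the phase information is lost at the softmax step, which is exactly why naive complex-valued attention loses its probabilistic reading.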
Unveiling the Power of Complex-Valued Transformers in Wireless...
Utilizing complex-valued neural networks (CVNNs) in wireless communication tasks has received growing attention for their ability to provide natural and effective representation of complex-valued...
https://arxiv.org/abs/2502.11151
Seonglae Cho