Superposition Complexity

Creator: Seonglae Cho
Created: 2025 Jun 25 10:1
Edited: 2025 Jul 22 18:17
Neural networks in superposition can simultaneously represent and compute many more 'features' than they have neurons; superposition complexity measures the resources this computation actually requires. The known bounds (summarized after the list below) are:
  • Lower bounds: derived from an information-theoretic (Kolmogorov complexity) compression argument; they hold even with error tolerance.
    • L1) Neurons
    • L2) Parameter bits
  • Upper bounds: given by an explicit constructive algorithm, analyzed with Chernoff bounds.
    • U1) Neurons (explicit construction)
    • U2) Computable features capacity
    • U3) Parameter bits
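
This setup appears to match Adler & Shavit, "On the Complexity of Neural Computation in Superposition" (2024). Assuming that source, the bounds above take roughly the following form, up to polylogarithmic factors; this is a hedged summary, and the exact log factors should be checked against the paper:

```latex
% Hedged asymptotic summary -- treat the exact polylog factors as unverified.
%   n  : number of neurons
%   p  : number of parameter bits
%   m' : number of features computed in superposition
\begin{align*}
  \text{L1, U1 (neurons):} \quad n &= \widetilde{\Theta}\big(\sqrt{m'}\big) \\
  \text{L2, U3 (parameter bits):} \quad p &= \widetilde{\Theta}\big(m'\big) \\
  \text{U2 (capacity):} \quad m' &= \widetilde{\Theta}\big(n^{2}\big)
\end{align*}
```

Read together: superposition buys a near-quadratic gap between computed features and neurons, while the parameter bits stay near-linear in the number of features.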
The upper-bound construction uses a compression·decompression matrix technique that compresses and decompresses the inputs, outputs, and hidden states through superposition (a minimal sketch follows below). It then defines feature influence (how many outputs a single input affects) and combines different channel-routing strategies (input channel vs. output channel) according to the influence magnitude.
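
Below is a minimal NumPy sketch of the compression·decompression idea: a random sign matrix C packs m sparse features into d ≪ m dimensions, and its transpose reads them back with bounded interference. All sizes and the random ±1 matrix are illustrative assumptions, not the paper's actual construction; feature influence and channel routing are not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's parameters).
m = 4096  # number of features
d = 512   # number of neurons / compressed dimensions, d << m
k = 4     # number of simultaneously active features (sparsity)

# Compression matrix with exactly unit-norm columns; the transpose serves
# as the decompression matrix because distinct random columns are nearly
# orthogonal: pairwise dot products concentrate around 0 with std
# 1/sqrt(d), the same Chernoff-style concentration used in the
# upper-bound analysis.
C = rng.choice([-1.0, 1.0], size=(d, m)) / np.sqrt(d)

# Sparse feature vector: k active features out of m.
x = np.zeros(m)
active = rng.choice(m, size=k, replace=False)
x[active] = 1.0

h = C @ x        # compress: d-dimensional superposed representation
x_hat = C.T @ h  # decompress: feature i reads out along column i

# Each decoded entry equals x_i plus interference from the other active
# features with std ~ sqrt(k / d), so thresholding at 0.5 recovers the
# active set with high probability.
recovered = np.flatnonzero(x_hat > 0.5)
print("recovered active set:", np.array_equal(np.sort(active), recovered))
print(f"max interference on inactive features: {x_hat[x == 0].max():.3f}")
```

With d = 512 and k = 4 the per-feature interference std is about 0.09, so 512 neurons faithfully carry 4096 potential features as long as few are active at once, which is the capacity phenomenon the bounds above formalize.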