Texonom / Engineering / Data Engineering / Artificial Intelligence / Machine Learning / Neural Network / Neural Network Structure / Seq2Seq / Attention Mechanism / Attention Mechanism Optimization
FNet

Creator: Seonglae Cho
Created: 2025 Apr 27 18:17
Editor: Seonglae Cho
Edited: 2025 Apr 27 18:20
Refs
FNet replaces Self-Attention in transformer blocks with an FFT-based token mixing layer that operates without attention. Its Fourier mixing first applies a 1D FFT along the sequence dimension and then another 1D FFT along the hidden dimension, keeping only the real part of the result. This reduces the token-mixing complexity from O(N^2) to O(N log N) in the sequence length N.
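The mixing step described above can be sketched in a few lines of NumPy (a minimal illustration, not the paper's implementation; the `fourier_mixing` name and the shapes used are assumptions):

```python
import numpy as np

def fourier_mixing(x):
    """FNet-style Fourier token mixing (minimal sketch).

    x: (seq_len, hidden_dim) array of token representations.
    Applies a 1D FFT along the sequence axis, then a 1D FFT along
    the hidden axis, and keeps only the real part of the result.
    """
    return np.fft.fft(np.fft.fft(x, axis=0), axis=1).real

# Hypothetical shapes, for illustration only.
x = np.random.randn(8, 4)
y = fourier_mixing(x)
assert y.shape == x.shape  # mixing preserves the input shape
```

Because the transform is fixed rather than learned, the layer has no parameters; in the full FNet block it is followed by the usual residual connection, layer norm, and feed-forward sublayer.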
FNet, NAACL 2022: https://aclanthology.org/2022.naacl-main.319.pdf
Unlocking Gen AI at the Edge: Speeding up Transformers by 80% by Removing Self Attention: https://artificialintelligencemadesimple.substack.com/p/speeding-up-transformers-by-80-by
A deep dive into FNet, FFT-based mixing, and why the future of AI might belong to fixed-structure models that don't even try to learn what they can encode.
Copyright Seonglae Cho