LSH Attention

Creator
Seonglae Cho
Created
2024 Mar 2 7:10
Editor
Seonglae Cho
Edited
2025 Feb 10 11:05
Refs
Reformer
Locality-sensitive hashing

Finds high-attention query-key pairs using locality-sensitive hashing (LSH) instead of computing full pairwise attention.

The hash is configured so that closer vectors receive similar hash values, so attention only needs to be computed among tokens that fall into the same hash bucket (see the sketch below).
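A minimal sketch of the bucketing step, assuming PyTorch and the random-rotation (angular) hashing scheme used in Reformer-style LSH attention; the function and variable names are illustrative, not the reference implementation.

```python
# Minimal sketch of angular LSH bucketing for LSH attention (hypothetical names).
import torch

def lsh_hash(x: torch.Tensor, n_buckets: int) -> torch.Tensor:
    """Assign each vector in x (seq_len, d_model) to one of n_buckets.

    Vectors pointing in similar directions get the same bucket id with
    high probability, so attention can be restricted to within-bucket pairs.
    """
    d_model = x.shape[-1]
    # Random rotation matrix; n_buckets must be even for the +/- trick.
    rotations = torch.randn(d_model, n_buckets // 2)
    rotated = x @ rotations                          # (seq_len, n_buckets // 2)
    # Concatenate projections and their negations and take the argmax:
    # this rounds each vector to the nearest of n_buckets random directions.
    return torch.argmax(torch.cat([rotated, -rotated], dim=-1), dim=-1)

# Usage: hash the (shared) query/key vectors, then sort by bucket id so that
# tokens in the same bucket become adjacent and attention runs per chunk.
qk = torch.randn(1024, 64)          # toy sequence of 1024 query/key vectors
buckets = lsh_hash(qk, n_buckets=32)
order = torch.argsort(buckets)
```

After sorting by bucket id, the softmax is computed only over short chunks of the sorted sequence rather than over all pairs, which is where the efficiency gain comes from.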
 
 
 
 
Review of Reformer: The Efficient Transformer (꼼꼼하고 이해하기 쉬운 Reformer 리뷰)
https://tech.scatterlab.co.kr/reformer-review/

Copyright Seonglae Cho