
LoRA-TE

Creator: Seonglae Cho
Created: 2024 Mar 7 4:52
Editor: Seonglae Cho
Edited: 2024 Mar 7 4:53

LoRA-the-Explorer (LTE) extends low-rank adaptation (LoRA) from fine-tuning to Transformer Pretraining: instead of adapting a pretrained model, several low-rank adapters are trained in parallel and their updates are periodically merged into the main weights, so a network can be trained from scratch.
 
Training Neural Networks from Scratch with Parallel Low-Rank Adapters
The scalability of deep learning models is fundamentally limited by computing resources, memory, and communication. Although methods like low-rank adaptation (LoRA) have reduced the cost of model...
https://arxiv.org/abs/2402.16828
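The parallel-adapter idea can be illustrated with a short sketch. This is a minimal illustration based on the paper's title and abstract, not its implementation: the class name ParallelLoRALinear, the head count, rank, scaling, and the simple averaging merge are all assumptions.

```python
import torch
import torch.nn as nn

class ParallelLoRALinear(nn.Module):
    """Hypothetical sketch: one linear layer with several parallel
    low-rank adapters. Each head h computes W x + s * B_h A_h x.
    Heads can be optimized independently (e.g. on different data
    shards); merge_and_reset() folds the averaged low-rank updates
    into the frozen base weight so training continues from there."""

    def __init__(self, d_in: int, d_out: int, rank: int = 8,
                 heads: int = 4, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out, bias=False)
        self.base.weight.requires_grad_(False)  # only adapters train
        self.scale = alpha / rank
        # B starts at zero so every head initially equals the base layer
        self.A = nn.Parameter(0.02 * torch.randn(heads, rank, d_in))
        self.B = nn.Parameter(torch.zeros(heads, d_out, rank))

    def forward(self, x: torch.Tensor, head: int) -> torch.Tensor:
        # Low-rank update for the selected head: (d_out, d_in)
        delta = self.scale * (self.B[head] @ self.A[head])
        return x @ (self.base.weight + delta).T

    @torch.no_grad()
    def merge_and_reset(self) -> None:
        # Average the per-head updates into the base weight, then
        # reset the adapters for the next round of parallel training.
        heads = self.B.shape[0]
        delta = self.scale * torch.einsum('hor,hri->oi', self.B, self.A)
        self.base.weight += delta / heads
        self.B.zero_()
```

In the full method each head would presumably live on a separate worker and merging would happen on a fixed schedule; here a single module stands in for that loop.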


Copyright Seonglae Cho