Texonom / Engineering / Data Engineering / Artificial Intelligence / Machine Learning / Neural Network / Neural Network Structure / RNN / GRU

GRU

Creator: Seonglae Cho
Created: 2021 Oct 6 10:17
Editor: Seonglae Cho
Edited: 2024 Oct 21 11:57
Refs
Long-term dependency
Machine Reading Comprehension
Gating Mechanism
Kyunghyun Cho

Gated recurrent unit

Selectively forget and retain historical information
Past state is selectively incorporated through the reset gate and selectively balanced against the new candidate state through the update gate
Hadamard product
GRUs have significantly more parameters than a vanilla RNN (three sets of input and recurrent weights instead of one)
Compared to LSTM, the gates are reduced to two, so there are fewer weights (there is no output gate)
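Concretely, the two gates combine the previous hidden state and a candidate state through Hadamard products ($\odot$); the standard formulation from Cho et al. is:

$$
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)} \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)} \\
\tilde{h}_t &= \tanh\!\left(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\right) && \text{(candidate state)} \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t && \text{(balanced output)}
\end{aligned}
$$

The reset gate $r_t$ controls how much history enters the candidate, and the update gate $z_t$ balances old state against the candidate.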
GRU Notion
GRU Update Gate
GRU Reset Gate
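The two gates can be sketched as a single NumPy step. This is a minimal illustration, not a library API: the function name `gru_cell`, the parameter ordering, and the random initialization are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step: update gate z, reset gate r, no output gate.

    params: (Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh), illustrative ordering.
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)                # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)                # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)    # candidate; r gates history
    h = (1 - z) * h_prev + z * h_tilde                    # Hadamard-product blend
    return h

# Usage sketch with random weights (shapes only; not trained values)
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = [rng.standard_normal(s)
          for s in [(d_h, d_in), (d_h, d_h), (d_h,)] * 3]
h = gru_cell(rng.standard_normal(d_in), np.zeros(d_h), params)
```

Since `r * h_prev` and the final blend are elementwise, each hidden unit decides independently how much history to forget and how much candidate to admit.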
 
 
 
 
Gated Recurrent Units (GRU)
GRU is a type of RNN framework with a gating mechanism, inspired by LSTM but with a simpler structure. Proudly, it was proposed by the Korean researcher Dr. Kyunghyun Cho...
https://yjjo.tistory.com/18
 
 


Copyright Seonglae Cho