Quantization Aware Training

Creator: Seonglae Cho
Created: 2023 Jul 2 3:02
Editor: Seonglae Cho
Edited: 2024 Dec 20 0:10
Refs: Post-training quantization

QAT

QAT simulates, during training, the effect that quantization will have at inference time by inserting fake quantization nodes into the forward pass, and backpropagation is then performed through that simulated graph.
Even small models show little accuracy degradation with this approach.
Red node is fake quantization node (act means activation, wt means weight)
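A minimal sketch of this idea, assuming PyTorch (the `fake_quant` helper and `QATLinear` module are illustrative names, not part of any library): the forward pass rounds weights and activations to simulated int8 values, while the backward pass uses a straight-through estimator so gradients flow as if no quantization had happened.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quant(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Quantize then dequantize x in the forward pass; identity gradient (STE) in backward."""
    qmax = 2 ** (num_bits - 1) - 1                         # e.g. 127 for int8
    scale = x.detach().abs().max().clamp(min=1e-8) / qmax  # per-tensor scale
    x_q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale
    # Straight-through estimator: output equals x_q, but d(output)/dx == 1
    return x + (x_q - x).detach()

class QATLinear(nn.Module):
    """Linear layer with fake quantization on weights ('wt') and activations ('act')."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w_q = fake_quant(self.linear.weight)   # simulate quantized weights
        x_q = fake_quant(x)                    # simulate quantized input activations
        return F.linear(x_q, w_q, self.linear.bias)

# Training proceeds as usual; the loss already reflects quantization error.
layer = QATLinear(16, 4)
out = layer(torch.randn(2, 16))
out.sum().backward()                           # gradients flow via the STE
```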
Quantization aware training | TensorFlow Model Optimization
https://www.tensorflow.org/model_optimization/guide/quantization/training

Inside Quantization Aware Training
To optimize our neural networks to run for low power and low storage devices, various model optimization techniques are used. One such very efficient technique is Quantization Aware Training.
https://towardsdatascience.com/inside-quantization-aware-training-4f91c8837ead

Deep Learning Quantization and Quantization Aware Training (딥러닝의 Quantization (양자화)와 Quantization Aware Training) | gaussian37's blog
https://gaussian37.github.io/dl-concept-quantization/
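Following the TensorFlow Model Optimization guide linked above, a minimal usage sketch (the toy model and the commented-out training data are placeholders; `quantize_model` is the toolkit's documented entry point):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Placeholder float model; any Keras model supported by the toolkit works here.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])

# Wrap the model so fake quantization nodes are inserted for weights and activations.
q_aware_model = tfmot.quantization.keras.quantize_model(model)
q_aware_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# q_aware_model.fit(x_train, y_train, epochs=1)  # fine-tune with quantization simulated

# After training, convert to an actually quantized TFLite model.
converter = tf.lite.TFLiteConverter.from_keras_model(q_aware_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
```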
 
 

Copyright Seonglae Cho