Huggingface Transformers Usages
Huggingface Text Generation
Huggingface
Huggingface AutoModelForCausalLM
Huggingface Tokenizer
Huggingface pipe
Huggingface push
Huggingface Trainer
Huggingface Transformers Logging
Huggingface Transformers Profile
Huggingface Task
HfArgumentParser
Huggingface Text Classification
Huggingface ONNX
Huggingface Transformers Serve
Huggingface Transformers Addons
Standardize, Don’t Abstract
This explains the design principles ("Tenets") and the recent engineering changes that keep 1M+ Python LOC and 400+ model architectures maintainable despite continuous additions.
transformers aims to be the Source of Truth for each model implementation, prioritizing performance and reproducibility. Since its users are mostly researchers and power users, the code itself is the product and must be easy to read, debug, and modify. Standardize, Don't Abstract: only infrastructure is commonized, while model semantics stay inside each model file. DRY* (DO Repeat Yourself): deduplication isn't always good, and strategic duplication is allowed for the sake of user readability and local semantics. The essence: keep model semantics transparent in one file (for readability and hackability), while reducing the maintenance burden through modular definitions and config-based composition.
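Config-based composition can be sketched as follows: the config object alone determines which architecture gets built, while the full modeling code for that architecture stays readable in a single model file. The tiny layer/head sizes below are arbitrary choices for illustration, not a real checkpoint.

```python
from transformers import AutoModelForCausalLM, GPT2Config

# A deliberately tiny GPT-2 config (sizes are arbitrary for illustration).
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=100)

# The Auto class dispatches on the config type and instantiates the
# concrete model class defined in that model's own file (random weights).
model = AutoModelForCausalLM.from_config(config)
print(type(model).__name__)  # GPT2LMHeadModel
```

Swapping in a different config class (e.g. `LlamaConfig`) yields a different architecture through the same entry point, without any shared abstraction layer over the model code itself.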
huggingface.co
https://huggingface.co/spaces/transformers-community/Transformers-tenets
🤗 Transformers
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
https://huggingface.co/docs/transformers/index
SimpleTransformers: Transformers Made Easy
In this section, we build a simple sentiment classifier using the Hugging Face API, fine-tuning a DistilBERT model on the IMDB dataset. With Hugging Face Transformers, a sentiment classifier takes only a few lines of code, but there are some tricky parts, especially for those just getting started with transformers. 1. Imports: different transformers require different imports.
https://wandb.ai/authors/One-Shot-3D-Photography/reports/SimpleTransformers---Vmlldzo0MDE5NjM
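The few-lines-of-code claim can be sketched with the `pipeline` API. The checkpoint below is the library's stock SST-2 DistilBERT, used here as a stand-in; it is not the IMDB fine-tune the report builds.

```python
from transformers import pipeline

# Stand-in checkpoint: a DistilBERT already fine-tuned for sentiment (SST-2),
# not the IMDB model trained in the linked report.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("This movie was surprisingly good.")[0]
# result is a dict with a "label" ("POSITIVE"/"NEGATIVE") and a "score" in [0, 1]
print(result["label"], round(result["score"], 3))
```

For actual fine-tuning, the report replaces this ready-made checkpoint with a base DistilBERT trained on IMDB via the Trainer API.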

Models
https://huggingface.co/docs/transformers/main_classes/model

Seonglae Cho