DeepSeek-V3 exemplifies the transformative potential of hardware-software co-design in advancing the scalability, efficiency, and robustness of large-scale AI systems.
- Mixed-precision FP8 training: low-precision GEMMs with fine-grained per-tile scaling speed up overall training (sketch below)
- Multi-head Latent Attention (MLA): low-rank compression of keys and values keeps attention memory-efficient (sketch below)
- Auxiliary-loss-free load balancing with node-limited routing: balanced, fast MoE training without an auxiliary loss term (sketch below)
- Multi-Token Prediction (MTP): each position additionally predicts tokens further ahead, densifying the training signal and capturing inter-token relationships (sketch below)
- DualPipe pipeline parallelism: overlaps computation with communication across pipeline stages
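A minimal sketch of the fine-grained FP8 quantization idea, assuming PyTorch's `torch.float8_e4m3fn` dtype: activations are scaled per 1x128 tile so each tile uses the full FP8 range. Helper names are mine, not DeepSeek's kernel code.

```python
# Tile-wise FP8 quantization sketch (1 x 128 activation tiles, per-tile scales).
import torch

FP8_E4M3_MAX = 448.0  # max representable magnitude of float8_e4m3fn

def quantize_tiles(x: torch.Tensor, tile: int = 128):
    """Quantize a 2-D tensor to FP8 with one scale per 1 x `tile` slice."""
    rows, cols = x.shape
    assert cols % tile == 0
    x_tiles = x.view(rows, cols // tile, tile)
    # Per-tile scale so the largest element maps to the FP8 max.
    scales = x_tiles.abs().amax(dim=-1, keepdim=True).clamp(min=1e-4) / FP8_E4M3_MAX
    x_fp8 = (x_tiles / scales).to(torch.float8_e4m3fn)
    return x_fp8.view(rows, cols), scales.squeeze(-1)

def dequantize_tiles(x_fp8, scales, tile: int = 128):
    rows, cols = x_fp8.shape
    x = x_fp8.view(rows, cols // tile, tile).to(torch.float32) * scales.unsqueeze(-1)
    return x.view(rows, cols)

x = torch.randn(4, 256)
q, s = quantize_tiles(x)
print((x - dequantize_tiles(q, s)).abs().max())  # small quantization error
```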
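A minimal sketch of MLA's low-rank KV compression: keys and values are reconstructed per head from a small cached latent `c_kv` instead of being cached in full. Dimensions are illustrative; the decoupled RoPE key path and the causal mask are omitted for brevity.

```python
import torch
import torch.nn as nn

class SimpleMLA(nn.Module):
    def __init__(self, d_model=512, n_heads=8, d_head=64, d_latent=64):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_head
        self.w_dkv = nn.Linear(d_model, d_latent, bias=False)          # compress to latent
        self.w_uk = nn.Linear(d_latent, n_heads * d_head, bias=False)  # expand keys
        self.w_uv = nn.Linear(d_latent, n_heads * d_head, bias=False)  # expand values
        self.w_q = nn.Linear(d_model, n_heads * d_head, bias=False)
        self.w_o = nn.Linear(n_heads * d_head, d_model, bias=False)

    def forward(self, h):                       # h: (batch, seq, d_model)
        b, t, _ = h.shape
        c_kv = self.w_dkv(h)                    # only this small latent is cached
        q = self.w_q(h).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k = self.w_uk(c_kv).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        v = self.w_uv(c_kv).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        return self.w_o(out)
```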
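A minimal sketch of the auxiliary-loss-free balancing rule: a per-expert bias is added to the routing scores only for top-k selection (gate values still use the raw scores), and after each batch the bias is nudged down for overloaded experts and up for underloaded ones. The update rule follows the V3 paper; variable names and the update granularity are illustrative.

```python
import torch

n_experts, top_k, gamma = 8, 2, 0.001
bias = torch.zeros(n_experts)  # per-expert routing bias, updated outside the loss

def route(scores):  # scores: (tokens, n_experts), e.g. sigmoid affinities
    # Select experts using biased scores, but gate with the raw scores.
    _, idx = (scores + bias).topk(top_k, dim=-1)
    gates = scores.gather(-1, idx)
    gates = gates / gates.sum(-1, keepdim=True)   # normalize among selected
    return idx, gates

def update_bias(idx, n_tokens):
    # Overloaded experts get their bias decreased; underloaded, increased.
    load = torch.bincount(idx.flatten(), minlength=n_experts).float()
    target = n_tokens * top_k / n_experts
    bias.add_(gamma * torch.sign(target - load))

scores = torch.sigmoid(torch.randn(16, n_experts))
idx, gates = route(scores)
update_bias(idx, n_tokens=16)
```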
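A minimal sketch of a depth-1 MTP objective: besides the standard next-token loss, an extra block combines the hidden state at position i with the embedding of token i+1 and predicts token i+2. V3 shares the embedding and output head across depths as shown here, but its RMSNorms and exact module layout are omitted, and the loss weight is illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model, vocab = 512, 1000
out_head = nn.Linear(d_model, vocab, bias=False)          # shared output head
embed = nn.Embedding(vocab, d_model)                      # shared embedding
mtp_proj = nn.Linear(2 * d_model, d_model, bias=False)    # merges state + embedding
mtp_block = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)

def mtp_loss(h, tokens):
    # h: (b, t, d) main-model hidden states; tokens: (b, t) input ids.
    # Main loss: position i predicts token i+1.
    main = F.cross_entropy(out_head(h[:, :-1]).transpose(1, 2), tokens[:, 1:])
    # MTP depth 1: combine h_i with the embedding of token i+1,
    # run one extra block, predict token i+2.
    merged = mtp_proj(torch.cat([h[:, :-2], embed(tokens[:, 1:-1])], dim=-1))
    mtp = F.cross_entropy(out_head(mtp_block(merged)).transpose(1, 2), tokens[:, 2:])
    return main + 0.3 * mtp   # weighting factor is illustrative
```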


tech report
DeepSeek-V3 Technical Report
We present DeepSeek-V3, a strong Mixture-of-Experts (MoE) language model with 671B total parameters with 37B activated for each token.
To achieve efficient inference and cost-effective training, DeepSeek-V3 adopts Multi-head Latent Attention (MLA) and DeepSeekMoE architectures, which were thoroughly validated in DeepSeek-V2.
Furthermore, DeepSeek-V3 pioneers an auxiliary-loss-free strategy for load balancing and sets a multi-token prediction training objective for stronger performance.
We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens, followed by Supervised Fine-Tuning and Reinforcement Learning stages to fully harness its capabilities.
Comprehensive evaluations reveal that DeepSeek-V3 outperforms other open-source models and achieves performance comparable to leading closed-source models.
Despite its excellent performance, DeepSeek-V3 requires only 2.788M H800 GPU hours for its full training.
In addition, its training process is remarkably stable.
Throughout the entire training process, we did not experience any irrecoverable loss spikes or perform any rollbacks.
The model checkpoints are available at https://github.com/deepseek-ai/DeepSeek-V3.
https://arxiv.org/html/2412.19437v1
Insights into DeepSeek-V3: Scaling Challenges and Reflections on Hardware for AI Architectures
https://www.arxiv.org/pdf/2505.09343
Notes on the new Deepseek v3
In this blog we go through the new Deepseek v3 and compare it with GPT-4o and 3.5 Sonnet across reasoning, math, coding, & writing tasks.
https://composio.dev/blog/notes-on-new-deepseek-v3/

model
deepseek-ai/DeepSeek-V3-Base · Hugging Face
https://huggingface.co/deepseek-ai/DeepSeek-V3-Base
3.2
DeepSeek-V3.2 Release | DeepSeek API Docs
🚀 Launching DeepSeek-V3.2 & DeepSeek-V3.2-Speciale — Reasoning-first models built for agents!
https://api-docs.deepseek.com/news/news251201


Seonglae Cho