Dataset Distillation

Creator
Seonglae Cho
Created
2025 Apr 26 21:29
Edited
2025 Nov 11 19:51
Refs
Dataset
Synthesize a small number of data points that need not come from the true data distribution but that, when given to the learning algorithm as training data, yield a model approximating the one trained on the original data.
For example, the original paper shows that it is possible to compress 60,000 MNIST training images into just 10 synthetic distilled images (one per class) and achieve close to original performance with only a few gradient descent steps, given a fixed network initialization.
Instead of matching the data distribution, this method creates samples that "quickly move model parameters in a good direction." This works because, under random initialization, each model's initial weights induce a different loss landscape. To produce updates that are robust across initializations, it is more effective to optimize a synthetic dataset directly than to match the data distribution.
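A minimal sketch of this meta-optimization, assuming a toy linear-regression task and a single inner gradient step from a fixed zero initialization (the dataset, hyperparameters, and sizes here are all illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d, m = 200, 5, 2                      # m = number of distilled points
X = rng.normal(size=(N, d))              # toy "real" dataset (stand-in)
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=N)

Xs = rng.normal(size=(m, d))             # synthetic inputs, optimized directly
ys = np.zeros(m)                         # synthetic labels, optimized directly
lr_in, lr_out = 0.5, 0.05
c = 2 * lr_in / m                        # w1 = c * Xs.T @ ys when w0 = 0

losses = []
for _ in range(2000):
    # Inner loop: one gradient step on the synthetic data from w0 = 0,
    # written in closed form: w1 = w0 - lr_in * grad = c * Xs.T @ ys
    w1 = c * (Xs.T @ ys)
    # Outer loss: how well the updated weights fit the *real* data
    r = X @ w1 - y
    losses.append(float(np.mean(r ** 2)))
    gL = (2 / N) * (X.T @ r)             # dL/dw1
    # Meta-gradients of the outer loss w.r.t. the synthetic dataset
    gXs = c * np.outer(ys, gL)
    gys = c * (Xs @ gL)
    Xs -= lr_out * gXs
    ys -= lr_out * gys

print(losses[0], losses[-1])             # outer loss on real data drops
```

Because the inner update is collapsed to its closed form, the meta-gradient is exact here; the paper instead backpropagates through the unrolled inner update of a neural network and averages over a distribution of initializations.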
Compressing the information of the dataset itself into a few synthetic data points.

Extracting alignment data (2025) -
Synthetic Data Generation
Template attack

Models can reproduce training data from the alignment phases (SFT, RL) either verbatim or in near-duplicate form. Because chat-template tokens (<|user|>, <|assistant|>) are introduced only during alignment, prompting with nothing but a BOS or template special-token prefix (unconditional batch generation with no context) regenerates alignment data. Collecting this model-generated data and reusing it for SFT/RL can restore performance close to that of models trained on the original data. Regurgitation of training samples also occurs during the RL (PPO/RLVR) phases.
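A sketch of the attack's prompt construction, assuming hypothetical token strings (the exact BOS and template tokens vary by model family, and the commented-out generation call is illustrative):

```python
# Hypothetical special tokens; the real strings depend on the model family.
BOS = "<s>"
USER_TAG = "<|user|>"

def extraction_prompts(n: int) -> list[str]:
    """Build unconditional prompts: no content at all, only the BOS token
    plus the chat-template prefix introduced during alignment. Sampling
    completions for these prompts lets the model fill in both the 'user'
    turn and the aligned reply, often regurgitating SFT/RL training pairs."""
    return [BOS + USER_TAG] * n

prompts = extraction_prompts(4)
# In practice these would be fed to batched sampling, e.g. something like
#   model.generate(**tokenizer(prompts, return_tensors="pt"),
#                  do_sample=True, temperature=1.0)
# and the sampled turns collected as a reconstructed alignment dataset.
```

The key point is that the prompt carries no user content: the template tokens alone condition the model back into its alignment-training distribution.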
Knowledge Distillation effectively operates as Dataset Distillation.
Semantic similarity (embedding cosine similarity ≥ 0.95) is defined as "semantic memorization". Traditional string-similarity detection (Levenshtein distance, etc.) underestimates the actual memorization rate by at least 10x.
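A toy sketch contrasting the two detectors, using stdlib difflib as the string-similarity stand-in and hand-made vectors in place of real sentence embeddings (the sentences, vectors, and exact similarity values are illustrative; only the 0.95 threshold comes from the text above):

```python
import difflib
import numpy as np

def string_sim(a: str, b: str) -> float:
    """String-level similarity (stand-in for Levenshtein-style detection)."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

train = "The capital of France is Paris."
gen   = "Paris is the capital of France."   # paraphrase of a training sample

s = string_sim(train, gen)                  # low: word order differs

# Stand-in embeddings (in practice from a sentence-embedding model);
# a paraphrase maps to a nearly identical vector.
e_train = np.array([0.80, 0.59, 0.10])      # hypothetical embedding
e_gen   = np.array([0.79, 0.60, 0.11])      # hypothetical embedding
c = cosine(e_train, e_gen)

flagged = bool(c >= 0.95)                   # "semantic memorization" rule
```

The string detector scores the paraphrase well below any memorization threshold, while the embedding detector flags it, which is why string-based methods undercount.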
