Distance Learner

Creator: Seonglae Cho
Created: 2026 Jan 12 15:53
Edited: 2026 Jan 12 17:42
The method proposes training a classifier to predict the distance to each class manifold instead of class labels (Distance Learner). The final prediction is the class with the smallest predicted distance.
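A minimal sketch of this inference rule, assuming a network that outputs one predicted distance per class. The architecture, layer sizes, and the softplus non-negativity constraint are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (PyTorch): a network predicting one distance per class,
# with classification by the smallest predicted distance.
import torch
import torch.nn as nn


class DistanceLearner(nn.Module):
    def __init__(self, in_dim: int, num_classes: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_classes),  # one predicted distance per class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Distances are non-negative; softplus keeps outputs >= 0 (assumption).
        return nn.functional.softplus(self.net(x))


def classify(model: DistanceLearner, x: torch.Tensor) -> torch.Tensor:
    # Predicted label = class whose manifold is estimated to be closest.
    with torch.no_grad():
        return model(x).argmin(dim=-1)
```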
Training data generation (augmentation): for each training sample, perturb it slightly along the estimated tangent directions (staying on the manifold), then step a known distance delta along the normal directions (off the manifold). This produces many off-manifold points whose manifold distances are known, and the model is trained with an MSE regression loss on these distance values. Local manifold estimation: since the true tangent and normal spaces are unknown, the k nearest neighbors of each point are gathered and PCA is applied; the top m principal components form the tangent basis and their orthogonal complement the normal space.
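A sketch of this augmentation step under the stated assumptions: estimate the local tangent space with PCA (via SVD) on each point's k nearest neighbors, then step off the manifold along a random normal direction by a known delta that becomes the regression target. The function name and parameters (k, manifold_dim, delta) are hypothetical, not from the paper.

```python
# Sketch: generate off-manifold points with known distance labels.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def off_manifold_samples(X: np.ndarray, manifold_dim: int, k: int = 20,
                         delta: float = 0.1, seed: int = 0):
    rng = np.random.default_rng(seed)
    _, idx = NearestNeighbors(n_neighbors=k).fit(X).kneighbors(X)
    samples, distances = [], []
    for i in range(len(X)):
        nbhd = X[idx[i]] - X[idx[i]].mean(axis=0)      # centered neighborhood
        # Top-m right singular vectors span the estimated tangent space.
        _, _, Vt = np.linalg.svd(nbhd, full_matrices=True)
        normal_basis = Vt[manifold_dim:]               # orthogonal complement
        # Random unit vector in the estimated normal space.
        direction = rng.standard_normal(normal_basis.shape[0]) @ normal_basis
        direction /= np.linalg.norm(direction)
        samples.append(X[i] + delta * direction)       # off-manifold point
        distances.append(delta)                        # known distance label
    return np.array(samples), np.array(distances)
```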
  • OOD detection: if the predicted distance is large for all classes, the input can be flagged as out-of-distribution (standard softmax classifiers tend to assign high confidence even to far-away points); a thresholding sketch follows this list.
  • Decision boundaries are more geometrically meaningful: decision regions form only near the class manifolds, and regions far from every manifold are treated as OOD.
  • Adversarial robustness: on synthetic data (notably concentric spheres and the swiss roll), it is reported to be far more robust than a standard classifier and comparable to Madry-style adversarial training.
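A hedged sketch of the OOD rule implied by the first bullet: if even the closest class manifold is predicted to be far away, reject the input. The threshold tau is a hypothetical hyperparameter, and `model` is assumed to map an input to per-class predicted distances as in the earlier sketch.

```python
# Sketch: classify-or-reject based on the minimum predicted distance.
import torch


def predict_or_reject(model, x: torch.Tensor, tau: float = 1.0):
    with torch.no_grad():
        dists = model(x)                 # per-class predicted distances
    min_dist, label = dists.min(dim=-1)
    is_ood = min_dist > tau              # far from every class manifold
    return label, is_ood
```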
The work currently focuses on synthetic data, so extension to real data is still needed. Improving the accuracy of local manifold estimation and the sampling efficiency (e.g., more uniform sampling within the off-manifold bands) are also noted as future challenges.
Distance Learners (arxiv.org)
