Pretraining

Creator: Seonglae Cho
Created: 2023 Mar 7 14:1
Edited: 2025 Apr 16 14:58

The process by which artificial neural networks extract features from data, with each neuron learning an abstract separation.

The idea that more compression leads to more intelligence has strong philosophical grounding.
Pretraining compresses data into generalized abstractions that connect different concepts through analogies, while reasoning is a specific Problem Solving skill that involves careful thinking to unlock various problem-solving capabilities.
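A minimal sketch of this compression view, under the assumption of a standard autoregressive language model (the shapes and numbers below are illustrative, not from the source): the pretraining objective is next-token cross-entropy, and the average loss converted to bits per token is exactly the code length the model would need to compress the text, so lower loss means better compression.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: logits from any autoregressive LM for one sequence.
vocab_size, seq_len = 50_000, 128
logits = torch.randn(seq_len, vocab_size)            # model's next-token predictions
targets = torch.randint(0, vocab_size, (seq_len,))   # the tokens that actually came next

# Average cross-entropy in nats per token; dividing by ln 2 gives bits per token,
# i.e. the code length needed to compress this text using the model as a prior.
nats_per_token = F.cross_entropy(logits, targets)
bits_per_token = nats_per_token / torch.log(torch.tensor(2.0))
print(f"{bits_per_token.item():.2f} bits/token")
```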
  • Data efficiency matters and there is optimism, since algorithmic changes stack well. Sample efficiency anywhere near human-level learning is still far away.
  • Semi-synchronous scaling might work with 10+ million GPUs in the future since not all parts of the brain necessarily need to communicate with each other.
  • For the scaling law, the problem is that extending into the lower-probability tail requires 10x more computation, since relevant concepts appear sparsely in the long tail (see the sketch after this list).
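As a rough illustration of why the tail gets expensive (the power-law form and constants here are assumptions for the sketch, not values from the source): if loss follows L(C) = a · C^(−α) in compute C, then each fixed absolute reduction in loss, which is roughly what covering progressively rarer concepts requires, costs a multiplicative factor in compute.

```python
# Hypothetical power-law scaling law L(C) = a * C**(-alpha); constants are illustrative.
a, alpha = 10.0, 0.05

def compute_for_loss(L: float) -> float:
    """Invert L = a * C**(-alpha) to get the compute C needed to reach loss L."""
    return (a / L) ** (1.0 / alpha)

# Shaving the same absolute amount off the loss gets multiplicatively more expensive.
for L in (2.0, 1.9, 1.8):
    print(f"loss {L:.1f} -> compute {compute_for_loss(L):.3e}")
```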

Datasets for AI come in three types

  • Problems with solutions - SFT
Pre Training Notion

How the training process and loss value relate to a neural network's abilities

Perhaps the most striking phenomenon Anthropic has noticed is that the learning dynamics of toy models with large numbers of features appear to be dominated by "energy level jumps", where features jump between different feature dimensionalities.
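A hedged sketch of the kind of toy model this observation refers to (loosely following the setup in Anthropic's Toy Models of Superposition; the dimensions, sparsity, and hyperparameters below are made-up): sparse features are squeezed through a low-dimensional bottleneck and reconstructed with ReLU(WᵀWx + b), and tracking per-feature embedding norms during training is one way to watch features jump between "energy levels".

```python
import torch

# Illustrative toy-model dimensions and sparsity (not the paper's exact values).
n_features, n_hidden, sparsity = 40, 5, 0.95

W = torch.nn.Parameter(0.1 * torch.randn(n_hidden, n_features))
b = torch.nn.Parameter(torch.zeros(n_features))
opt = torch.optim.Adam([W, b], lr=1e-3)

for step in range(10_000):
    # Sparse synthetic features in [0, 1]: each is zero with probability `sparsity`.
    x = torch.rand(1024, n_features) * (torch.rand(1024, n_features) > sparsity)

    recon = torch.relu(x @ W.T @ W + b)   # ReLU(W^T W x + b), applied row-wise
    loss = ((recon - x) ** 2).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

    if step % 1000 == 0:
        # Rough per-feature "dimensionality" proxy: norm of each feature's embedding column.
        print(step, round(loss.item(), 4), W.norm(dim=0).topk(5).values.tolist())
```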

Procedural Knowledge in Pretraining

We observe that code data is highly influential for reasoning. StackExchange as a source has more than ten times more influential data in the top and bottom portions of the rankings than expected if the influential data were randomly sampled from the pretraining distribution. Other code sources and ArXiv & Markdown are twice or more as influential as expected when drawing randomly from the pretraining distribution.
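The finding above comes from influence-style rankings of pretraining documents against reasoning queries. As a simplified, hedged sketch of the idea (the actual work uses EK-FAC influence-function approximations; this first-order gradient dot product and the HuggingFace-style model interface are assumptions of the sketch): a document's influence on a query can be approximated by how aligned their loss gradients are.

```python
import torch

def loss_grad(model, batch):
    """Flattened gradient of the LM loss on one example w.r.t. the model parameters."""
    loss = model(**batch).loss  # assumes a HuggingFace-style causal LM that returns .loss
    grads = torch.autograd.grad(loss, [p for p in model.parameters() if p.requires_grad])
    return torch.cat([g.flatten() for g in grads])

def influence_score(model, query_batch, doc_batch):
    """First-order influence of a pretraining document on a query: a large positive
    dot product means a gradient step on the document would also lower the query loss."""
    return torch.dot(loss_grad(model, query_batch), loss_grad(model, doc_batch)).item()
```

Ranking all pretraining documents by this score for a given query is what produces the "top and bottom portions of the rankings" referenced above.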

Recommendations