AI Scaling

Creator
Seonglae Cho
Created
2023 May 15 15:30
Edited
2025 Feb 16 22:38

More intelligence for free by scaling

Andrej Karpathy
said that it will be surprisingly small, since current models waste a ton of capacity remembering things that do not matter. There will be a cognitive core (math, physics, computing, predicting), similar to the brain's layered structure.
AI Scaling
is not everything. In the same sense, techniques like reasoning incentives and memory scaffolding might help, but there is no guarantee they will solve core deficits.
It is recommended that training tokens be scaled roughly linearly with model size. The constraints on scaling test-time compute differ substantially from those of LLM pretraining.
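This linear token-scaling rule can be sketched as follows; the ratio of roughly 20 training tokens per parameter is the commonly cited Chinchilla-style rule of thumb and is an assumption here, not a value from this note:

```python
def compute_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Linear token scaling: training tokens grow proportionally with
    parameter count (Chinchilla-style rule of thumb, ~20 tokens/param)."""
    return n_params * tokens_per_param

# Under this rule, a 70B-parameter model would want ~1.4T training tokens.
print(f"{compute_optimal_tokens(70e9):.2e}")
```

The key point is the contrast with earlier practice: tokens scale linearly with parameters, rather than parameters being scaled far faster than data.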
Metrics of scaling are
Computing
power,
Model Training
data, and
Statistical Model
parameters.
  • Pretraining Scaling is reaching its limits due to finite data.
AI Scaling Notion
https://www.dwarkeshpatel.com/p/will-scaling-work
AI Scaling Methods

Scaling Law (OpenAI 2020)

The primate neural architecture is highly scalable compared to the brains of other species, analogous to how transformers have better scaling curves than LSTMs and RNNs.
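A "better scaling curve" can be made concrete with the power-law form from the OpenAI 2020 scaling-law paper, where loss falls as a power of parameter count; the constants below follow the published fit for non-embedding parameters but should be treated as illustrative assumptions:

```python
def loss_from_params(n_params: float,
                     n_c: float = 8.8e13,
                     alpha_n: float = 0.076) -> float:
    """Power-law scaling of loss with model size: L(N) = (N_c / N) ** alpha_N.
    n_c and alpha_n are illustrative constants (assumptions), in the form
    reported by the 2020 scaling-law paper."""
    return (n_c / n_params) ** alpha_n

# Loss decreases smoothly and predictably as parameters grow.
for n in (1e8, 1e9, 1e10):
    print(f"N={n:.0e}  L={loss_from_params(n):.3f}")
```

An architecture with a "better" curve corresponds to a larger exponent (faster loss decrease per order of magnitude of parameters), which is the sense in which transformers outscale LSTMs and RNNs.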

Human brain Neuron scaling process

Computing is bottleneck

Scaling is important

How to scale


Recommendations