Posits, a New Kind of Number, Improves the Math of AI
Training the large neural networks behind many modern AI tools requires serious computational might: OpenAI's most advanced language model, GPT-3, required on the order of a million billion billion operations to train, and cost about US $5 million in compute time. Some researchers now think they have a better way: a new format, the posit, for representing the numbers those operations run on.
https://spectrum.ieee.org/floating-point-numbers-posits-processor