Texonom / Engineering / Data Engineering / Artificial Intelligence / AI Object / NLP / Language Model / LLM / GPT

GPT

Creator: Seonglae Cho
Created: 2020 Aug 17 14:5
Editor: Seonglae Cho
Edited: 2025 Mar 3 23:57
Refs
OpenAI
Causal language model

Generative Pre-trained Transformer

Each 0.5 in the version is roughly 10X pretraining compute. - Andrej Karpathy
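Karpathy's rule of thumb can be read as simple arithmetic: if each +0.5 step in the version number is roughly 10x pretraining compute, the multiplier between any two versions is 10 raised to (version difference / 0.5). A minimal sketch under that assumption (the function name `compute_ratio` is mine, not from the source):

```python
def compute_ratio(v_from: float, v_to: float) -> float:
    """Rough pretraining-compute multiplier between two GPT version numbers,
    assuming each +0.5 version step is ~10x compute (Karpathy's rule of thumb)."""
    return 10 ** ((v_to - v_from) / 0.5)

print(compute_ratio(3.0, 4.0))  # full version step: 100.0 (~100x compute)
print(compute_ratio(4.0, 4.5))  # half step: 10.0 (~10x compute)
```

By this heuristic, GPT-4 would sit around two orders of magnitude more pretraining compute than GPT-3.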
GPT Implementations
GPT 1
GPT 2
GPT 3
GPT 4
GPT 4.5
GPT 5
GPT F
nanoGPT
minGPT
gpt-neo
GPT J
GPT 4 Vision
History

Five years of GPT progress
https://finbarr.ca/five-years-of-gpt-progress

GPT in simple explanation

GPT in 60 Lines of NumPy | Jay Mody
January 30, 2023. In this post, we'll implement a GPT from scratch in just 60 lines of NumPy, then load the trained GPT-2 model weights released by OpenAI into our implementation and generate some text. Note: this post assumes familiarity with Python, NumPy, and some basic experience training neural networks.
https://jaykmody.com/blog/gpt-from-scratch/
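The linked post's core idea, a decoder-only transformer forward pass in plain NumPy, can be sketched in a few functions. This is a toy single-head, single-block version with random weights and a ReLU MLP (GPT-2 actually uses GELU, multiple heads, and loaded checkpoint weights); the dimensions and parameter names here are illustrative, not from the post:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, eps=1e-5):
    # Normalize each token vector to zero mean, unit variance
    return (x - x.mean(axis=-1, keepdims=True)) / np.sqrt(x.var(axis=-1, keepdims=True) + eps)

def causal_self_attention(x, w_qkv, w_out):
    T, d = x.shape
    q, k, v = np.split(x @ w_qkv, 3, axis=-1)       # project to queries, keys, values
    scores = q @ k.T / np.sqrt(d)
    scores += np.triu(np.full((T, T), -1e9), k=1)   # causal mask: no attending to the future
    return softmax(scores) @ v @ w_out

def gpt_block(x, p):
    x = x + causal_self_attention(layer_norm(x), p["w_qkv"], p["w_out"])  # attention + residual
    h = np.maximum(layer_norm(x) @ p["w_fc1"], 0)                         # ReLU MLP (GPT-2: GELU)
    return x + h @ p["w_fc2"]                                             # MLP + residual

rng = np.random.default_rng(0)
d, T, vocab = 16, 5, 50                              # toy sizes, not GPT-2's
params = {
    "w_qkv": rng.normal(0, 0.02, (d, 3 * d)),
    "w_out": rng.normal(0, 0.02, (d, d)),
    "w_fc1": rng.normal(0, 0.02, (d, 4 * d)),
    "w_fc2": rng.normal(0, 0.02, (4 * d, d)),
}
emb = rng.normal(0, 0.02, (vocab, d))                # token embedding table
tokens = np.array([1, 2, 3, 4, 5])
x = emb[tokens] + rng.normal(0, 0.02, (T, d))        # token + (random stand-in) positional embeddings
logits = gpt_block(x, params) @ emb.T                # weight-tied output head
print(logits.shape)                                  # (5, 50): next-token logits per position
```

Sampling the argmax of the last row of `logits` would give the next token; the real post repeats this loop with trained GPT-2 weights to generate text.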
MiraKle Ahead (미라클 어헤드)
Maeil Business Newspaper's startup vertical media
https://mirakle.mk.co.kr/list.php?sc=51800017
Copyright Seonglae Cho