GPT-3 Alternative Large Language Models (LLMs)

The simplest, fastest repository for training/finetuning medium-sized GPTs.


About nanoGPT

nanoGPT is a rewrite of minGPT that prioritizes teeth over education. It is still under active development, but currently it reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in 38 hours of training. The code itself is plain and readable: a ~300-line boilerplate training loop and a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.

nanoGPT was created by Andrej Karpathy, a legendary AI researcher, engineer, and educator. He is the former director of AI at Tesla and a founding member of OpenAI.
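To give a feel for what "a plain boilerplate training loop" means, here is a minimal sketch of the same loop shape (fetch a batch, forward pass, compute loss, backward pass, parameter update). This is an illustration only, not nanoGPT's actual code: the "model" is a toy one-parameter linear fit with a hand-derived gradient, so it runs without PyTorch or a GPU.

```python
# A sketch of the forward/backward/update shape of a minimal training loop,
# in the spirit of nanoGPT's train.py. Illustrative only: the "model" here
# is a toy 1-parameter linear fit (y = w*x), not a GPT, and the gradient
# is computed analytically instead of via autograd.

def get_batch(step):
    # Toy data generated from y = 3x; a real loop would sample
    # token blocks from a tokenized dataset instead.
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [3.0 * x for x in xs]
    return xs, ys

w = 0.0      # the single model parameter
lr = 0.01    # learning rate
for step in range(200):
    xs, ys = get_batch(step)
    # forward pass: predictions and mean squared error loss
    preds = [w * x for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
    # backward pass: analytic gradient of the loss w.r.t. w
    grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    # update: plain SGD step
    w -= lr * grad

print(round(w, 3))  # converges toward the true slope 3.0
```

In nanoGPT the same skeleton is simply filled in with a GPT model, cross-entropy loss over next-token predictions, and an AdamW-style optimizer, which is why the whole loop fits in a few hundred readable lines.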