Code Llama was trained on a 16k-token context window. In addition, all three model variants (Code Llama, Code Llama - Python, and Code Llama - Instruct) received further long-context fine-tuning, allowing them to handle context windows of up to 100,000 tokens.
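As a quick sketch, the released checkpoints can be loaded through Hugging Face transformers; the model ID below is the real 7B base checkpoint, while the prompt and generation settings are illustrative assumptions. The released configs already carry the long-context fine-tuning change (a larger RoPE base period, rope_theta), so long prompts need no extra scaling flags up to memory limits.

```python
# Minimal sketch: loading a Code Llama checkpoint from the Hugging Face Hub
# and sampling a completion. Generation settings are illustrative
# assumptions, not a recommended configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # base variant; -Python and -Instruct also exist

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the 7B model in GPU memory
    device_map="auto",
)

prompt = "def fibonacci(n: int) -> int:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Long prompts work without extra RoPE-scaling arguments: the config
# already reflects the long-context fine-tuning (rope_theta = 1e6).
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```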
Code Llama Descendants

Introducing Code Llama, a state-of-the-art large language model for coding
Code Llama, which is built on top of Llama 2, is free for research and commercial use.
https://ai.meta.com/blog/code-llama-large-language-model-coding/

codellama (Code Llama)
Org profile for Code Llama on Hugging Face, the AI community building the future.
https://huggingface.co/codellama

Llama 2 learns to code
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
https://huggingface.co/blog/codellama

Seonglae Cho
