Llama 2 model sizes (7B, 13B, 70B)

Llama 2 is a family of pretrained and fine-tuned generative text models ranging from 7 billion to 70 billion parameters. All three currently available sizes (7B, 13B, 70B) are trained on 2 trillion tokens and have double the context length of Llama 1 (4,096 tokens vs. 2,048). Meta's specially fine-tuned models, Llama 2-Chat, are optimized for dialogue use cases.
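As a quick reference, the figures above can be collected into a short Python sketch. The `LLAMA2_MODELS` table and `describe` helper are illustrative names invented here; the 4,096-token context value assumes Llama 1's 2,048-token window doubled, as the text states.

```python
# Illustrative summary of the Llama 2 sizes described in the text:
# every size is trained on 2 trillion tokens, with a context window
# double that of Llama 1 (assumed here to be 4,096 vs. 2,048 tokens).
LLAMA2_MODELS = {
    "7B":  {"parameters": 7_000_000_000,  "training_tokens": 2_000_000_000_000, "context_length": 4096},
    "13B": {"parameters": 13_000_000_000, "training_tokens": 2_000_000_000_000, "context_length": 4096},
    "70B": {"parameters": 70_000_000_000, "training_tokens": 2_000_000_000_000, "context_length": 4096},
}

def describe(size: str) -> str:
    """Return a one-line spec summary for a given model size."""
    spec = LLAMA2_MODELS[size]
    return (f"Llama 2 {size}: {spec['parameters'] // 10**9}B params, "
            f"{spec['context_length']}-token context")

print(describe("70B"))  # e.g. "Llama 2 70B: 70B params, 4096-token context"
```

A lookup table like this is handy when scripting model selection, e.g. picking the largest size whose parameter count fits a given memory budget.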