meta-llama/Llama-3.2-1B

Llama 3.2-1B is a 1.23-billion-parameter multilingual large language model from Meta, built on an optimized transformer architecture. It is instruction-tuned for multilingual dialogue use cases, including agentic retrieval and summarization. The model supports a 32,768-token context length and uses Grouped-Query Attention (GQA) for improved inference scalability, making it well suited to efficient multilingual text generation and understanding in assistant-style chat applications.

Status: Warm
Visibility: Public
Parameters: 1B
Precision: BF16
Context length: 32,768 tokens
License: llama3.2
Source: Hugging Face
Access: Gated
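
The repository is gated on Hugging Face, so using it requires accepting the Llama 3.2 license and authenticating with an access token. Below is a minimal usage sketch, assuming the Hugging Face transformers library and PyTorch are installed; the model ID comes from this listing, while the prompt and decoding settings are illustrative only.

```python
# Minimal sketch: load meta-llama/Llama-3.2-1B with transformers and generate text.
# Assumes `pip install transformers torch accelerate` and a Hugging Face token
# (e.g. via `huggingface-cli login`) with access to the gated repository.
import torch
from transformers import pipeline

model_id = "meta-llama/Llama-3.2-1B"

# Load in BF16 to match the listed precision; device_map="auto" picks GPU/CPU.
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Simple generation call; prompt and max_new_tokens are illustrative values.
output = pipe(
    "Grouped-Query Attention improves inference efficiency by",
    max_new_tokens=64,
)
print(output[0]["generated_text"])
```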
