meta-llama/Llama-3.2-3B
Llama 3.2-3B is a 3.21-billion-parameter multilingual large language model from Meta, built on an optimized transformer architecture. This repository hosts the pretrained base model; the instruction-tuned variant (Llama-3.2-3B-Instruct) is optimized for multilingual dialogue use cases, including agentic retrieval and summarization tasks. The model supports a 128K token context length and is designed for deployment in constrained environments, including on-device and mobile applications.
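As a minimal sketch of how the checkpoint might be loaded with the Hugging Face `transformers` library (assuming `transformers` and `torch` are installed and your account has been granted access to the gated `meta-llama/Llama-3.2-3B` repository):

```python
# Sketch: text generation with Llama-3.2-3B via Hugging Face transformers.
# Assumes transformers + torch are installed and the gated repo is accessible.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "meta-llama/Llama-3.2-3B"


def generate_text(prompt: str, max_new_tokens: int = 50) -> str:
    """Load the base model and continue the given prompt.

    Note: this is a base (not instruction-tuned) model, so it performs
    free-form completion rather than following chat-style instructions.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native precision
        device_map="auto",    # place layers on available GPU(s)/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Usage: `generate_text("The three primary colors are")` returns the prompt followed by the model's completion. For dialogue-style prompting, the Instruct variant with `tokenizer.apply_chat_template` is the more appropriate choice.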