alpindale/Llama-3.2-1B
alpindale/Llama-3.2-1B is a 1.23-billion-parameter, instruction-tuned, multilingual large language model from Meta's Llama 3.2 collection, built on an optimized transformer architecture. It is optimized for multilingual dialogue use cases, including agentic retrieval and summarization tasks, and supports a 128K-token context length. The model is intended for commercial and research use and outperforms many available open-source and closed chat models on common industry benchmarks.
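
Below is a minimal sketch of loading the model for text generation with the Hugging Face transformers library; it assumes transformers and torch are installed, that the alpindale/Llama-3.2-1B repository is accessible, and that a GPU (or enough RAM) is available. The prompt and generation settings are illustrative only.

```python
# Minimal text-generation sketch (assumes transformers + torch are installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alpindale/Llama-3.2-1B"

# Load tokenizer and model weights; device_map="auto" places layers on the
# available device(s), and bfloat16 keeps the 1.23B model's memory footprint small.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Example prompt; replace with your own dialogue or summarization input.
prompt = "Summarize the advantages of small multilingual language models:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short continuation and decode it back to text.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```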