NousResearch/Llama-3.2-1B
NousResearch/Llama-3.2-1B is a 1-billion-parameter multilingual large language model developed by Meta as part of the Llama 3.2 collection. The instruction-tuned model, with a 32,768-token context length, is optimized for multilingual dialogue use cases such as agentic retrieval and summarization. It uses an optimized transformer architecture and is aligned with human preferences for helpfulness and safety through supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF).
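As a minimal sketch of how a model like this is typically loaded and queried, assuming the `transformers` library and that the Hugging Face model id `NousResearch/Llama-3.2-1B` is accessible (the prompt and generation settings below are illustrative, not recommended defaults):

```python
# Minimal sketch: load NousResearch/Llama-3.2-1B with Hugging Face transformers
# and generate a short completion. Assumes torch, transformers, and accelerate
# are installed; generation parameters are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NousResearch/Llama-3.2-1B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 1B model small in memory
    device_map="auto",           # place weights on GPU if one is available
)

prompt = "Summarize the key ideas of the transformer architecture in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

For dialogue-style use, the same model can be driven through the tokenizer's chat template (if the checkpoint ships one) or served behind an OpenAI-compatible endpoint; the snippet above only shows plain text completion.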