aisingapore/Qwen-SEA-LION-v4-32B-IT
aisingapore/Qwen-SEA-LION-v4-32B-IT is a 32-billion-parameter instruction-tuned large language model developed by AI Singapore. It is based on the Qwen3 architecture, supports a 32k-token context window, and is optimized for Southeast Asian languages: it underwent continued pre-training on 100 billion tokens from the SEA-Pile v2 corpus covering seven SEA languages, namely Burmese, Filipino, Indonesian, Malay, Tamil, Thai, and Vietnamese. The model is tuned for multilingual instruction following and chat in these languages.
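A minimal usage sketch with the Hugging Face `transformers` chat interface. The repo id is taken from the title above; the generation settings and helper names (`build_messages`, `generate_reply`) are illustrative assumptions, not an official recipe, and a 32B model needs substantial GPU memory to load.

```python
MODEL_ID = "aisingapore/Qwen-SEA-LION-v4-IT-32B".replace("-IT-32B", "-32B-IT")

def build_messages(user_text: str) -> list[dict]:
    # Standard chat-format messages consumed by apply_chat_template.
    return [{"role": "user", "content": user_text}]

def generate_reply(user_text: str, max_new_tokens: int = 256) -> str:
    # transformers/torch are imported lazily so the helpers above can be
    # used or tested without the heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    input_ids = tokenizer.apply_chat_template(
        build_messages(user_text),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

# Example prompt in Indonesian, one of the supported SEA languages:
# generate_reply("Jelaskan apa itu model bahasa besar.")
# ("Explain what a large language model is.")
```

The lazy import keeps the module importable on machines without `transformers`; slicing `output_ids` past the prompt length is the usual way to return only the model's reply.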