AI-Sweden-Models/gpt-sw3-126m

GPT-SW3 126M is a decoder-only transformer language model with roughly 0.2 billion parameters, developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. It was trained on 320 billion tokens spanning Swedish, Norwegian, Danish, Icelandic, English, and programming code, and generates coherent text in all of these languages. The model is intended primarily for research and for evaluating the capabilities of large language models in the Nordic languages; it supports multilingual text generation and can be prompted to perform tasks described in natural language.

Status: Cold
Visibility: Public
Parameters: 0.2B
Precision: BF16
Context length: 2048 tokens
License: other
Source: Hugging Face
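
Because this is a standard decoder-only checkpoint on the Hugging Face Hub, it can be loaded with the transformers library. The sketch below is an illustrative example, not part of the official card: the prompt and sampling parameters are assumptions, and loading in bfloat16 simply mirrors the BF16 precision listed above.

```python
# Minimal usage sketch for AI-Sweden-Models/gpt-sw3-126m.
# Prompt and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AI-Sweden-Models/gpt-sw3-126m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# BF16 matches the listed checkpoint precision; use torch.float32
# on hardware without bfloat16 support.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

prompt = "Träd är fina för att"  # Swedish example prompt (assumption)
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,   # stays well within the 2048-token context
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Sampling parameters such as temperature and top_p can be tuned per use case; a small base model like this is typically prompted directly rather than expected to follow complex instructions.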