AI-Sweden-Models/gpt-sw3-6.7b-v2

GPT-SW3 6.7B v2 is a 7.1 billion parameter decoder-only transformer language model developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. It was pretrained on a 320 billion token dataset spanning Swedish, Norwegian, Danish, Icelandic, English, and programming code, with a larger share of English and code than its predecessor. The model generates coherent text in five natural languages and four programming languages, and can be directed to perform other text tasks by casting them as causal language modeling (text generation) problems.
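Below is a minimal sketch of causal text generation with this model through the Hugging Face transformers library. The model identifier matches this listing; the prompt and the generation parameters (max_new_tokens, temperature, top_p) are illustrative assumptions, not values prescribed by the model card, and should be tuned to your task.

```python
# Sketch: load GPT-SW3 6.7B v2 and generate text via causal language modeling.
# Assumes transformers and torch are installed and the model weights are
# accessible from the Hugging Face Hub.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "AI-Sweden-Models/gpt-sw3-6.7b-v2"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype to fit a 7.1B model in memory
)
model.to(device)
model.eval()

# A Swedish completion prompt (hypothetical example; any of the five
# supported languages would work the same way).
prompt = "Träd är fina för att"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=100,  # assumed generation budget
        do_sample=True,
        temperature=0.6,     # assumed sampling settings
        top_p=1.0,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same interface covers instruction-style uses: because the model is a plain causal LM, tasks such as summarization or translation are expressed by writing the task description and input into the prompt and letting the model continue the text.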

Status: Cold
Visibility: Public
Parameters: 7.1B
Precision: FP8
Context length: 2048 tokens
License: other
Source: Hugging Face
