maritaca-ai/sabia-7b
Sabiá-7B is a 7-billion-parameter auto-regressive language model developed by Maritaca AI, using the LLaMA-1-7B architecture. It was initialized from LLaMA-1-7B weights and further pretrained on an additional 10 billion tokens drawn from the Portuguese subset of ClueWeb22. The model is optimized for Portuguese-language tasks and, because it was pretrained without instruction tuning, is best used with few-shot prompting.
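A minimal sketch of few-shot prompting with the Hugging Face `transformers` library, as suggested above for a model without instruction tuning. The `Pergunta:`/`Resposta:` prompt template, the example pairs, and the helper function names are illustrative assumptions, not an official example from the model card.

```python
# Hedged sketch: few-shot prompting for maritaca-ai/sabia-7b.
# Since the model is not instruction-tuned, the prompt shows a few
# input/output examples before the actual query.
from typing import List, Tuple

def build_few_shot_prompt(examples: List[Tuple[str, str]], query: str) -> str:
    """Format (question, answer) pairs followed by the final query.

    The 'Pergunta:'/'Resposta:' template is an assumption for illustration.
    """
    parts = [f"Pergunta: {q}\nResposta: {a}" for q, a in examples]
    parts.append(f"Pergunta: {query}\nResposta:")
    return "\n\n".join(parts)

def generate(prompt: str, max_new_tokens: int = 32) -> str:
    """Run generation; requires `pip install transformers torch` and
    downloads the full model weights, so it is not called by default."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained("maritaca-ai/sabia-7b")
    model = AutoModelForCausalLM.from_pretrained("maritaca-ai/sabia-7b")
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the tokens generated after the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

examples = [
    ("Qual é a capital do Brasil?", "Brasília."),
    ("Qual é a capital de Portugal?", "Lisboa."),
]
prompt = build_few_shot_prompt(examples, "Qual é a capital da França?")
# print(generate(prompt))  # uncomment to run actual generation
```

The prompt-building step is cheap and testable on its own; only `generate` touches the model weights.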