Sao10K/70B-L3.3-Cirrus-x1

Sao10K/70B-L3.3-Cirrus-x1 is a 70-billion-parameter language model by Sao10K that uses the Llama-3-Instruct prompt format. It was trained on a data composition similar to that of 'Freya', but with extended training and checkpoint merging for improved stability. The model aims for a distinct stylistic output; occasional quirks in its generations are easily corrected, making it suitable for general text-generation tasks.
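Since the model expects the Llama-3-Instruct prompt format, a minimal sketch of that template may help. The special tokens below follow the standard Llama 3 chat format; in practice, most serving stacks build this string for you (e.g. via a chat template), so hand-assembly is rarely needed.

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn Llama-3-Instruct prompt string.

    Uses the standard Llama 3 special tokens: <|begin_of_text|>,
    <|start_header_id|>/<|end_header_id|> around each role name,
    and <|eot_id|> to terminate each turn. The string ends with an
    open assistant header so the model generates the reply next.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

The generation should be stopped on `<|eot_id|>` so the model does not run past the end of its turn.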

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32,768 tokens
License: llama3.3
Source: Hugging Face