Sao10K/70B-L3.3-mhnnn-x1

Sao10K/70B-L3.3-mhnnn-x1 is a 70-billion-parameter language model with a 32,768-token context length, developed by Sao10K. It is fine-tuned on a data composition similar to that of 'Freya', with a focus on creative output across a range of tasks. The model performs well at completion tasks, text adventures, amoral assistant roles, and general instruction following, trading occasional inconsistency for more creative responses.

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32,768 tokens
License: llama3.3
Source: Hugging Face
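
The listing does not include a usage snippet, so the following is a minimal sketch of querying the model through an OpenAI-compatible chat endpoint. The `base_url` and `api_key` values are placeholders (assumptions, not taken from this page); substitute your hosting provider's endpoint and credentials.

```python
# Minimal sketch: call Sao10K/70B-L3.3-mhnnn-x1 via an OpenAI-compatible API.
# The endpoint URL and API key below are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # assumed endpoint; use your provider's URL
    api_key="YOUR_API_KEY",                 # placeholder credential
)

response = client.chat.completions.create(
    model="Sao10K/70B-L3.3-mhnnn-x1",
    messages=[
        {"role": "system", "content": "You are a creative text-adventure narrator."},
        {"role": "user", "content": "Begin a short adventure in a ruined observatory."},
    ],
    max_tokens=512,   # well within the 32,768-token context window
    temperature=0.9,  # a higher temperature suits the model's creative focus
)

print(response.choices[0].message.content)
```

A higher temperature is shown here because the model is described as favoring creative responses; for stricter instruction following, a lower value may be more appropriate.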
