BruhzWater/Sapphira-L3.3-70b-0.1

BruhzWater/Sapphira-L3.3-70b-0.1 is a 70-billion-parameter language model based on the Llama 3 architecture, merged with deepcogito-v2-preview-llama-70B as the base model. It was produced with a Multi-SLERP merge that combines BruhzWater's Apocrypha-L3.3-70b-0.3 and Serpents-Tongue-L3.3-70b-0.3. The model is tuned toward storytelling and roleplay, with an emphasis on narrative coherence, and its 32768-token context length supports extended conversational and long-form generation.
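
The exact merge recipe is not reproduced here; the sketch below only illustrates the spherical linear interpolation (SLERP) step that Multi-SLERP builds on, applied per parameter tensor. The state-dict names and the 0.5 blend factor are assumptions for illustration, not the actual configuration, and real merges of this kind are typically run with dedicated tooling rather than hand-rolled code.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two parameter vectors.
    omega = torch.arccos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    if omega.abs() < eps:
        # Near-parallel tensors: fall back to plain linear interpolation.
        out = (1 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        out = (torch.sin((1 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape).to(a.dtype)

# Illustrative only: blend corresponding parameters from the two donor models halfway.
# state_a / state_b would be the state dicts of Apocrypha-L3.3-70b-0.3 and
# Serpents-Tongue-L3.3-70b-0.3 (hypothetical variable names).
# merged = {name: slerp(0.5, state_a[name], state_b[name]) for name in state_a}
```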

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32768
Source: Hugging Face
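
A minimal inference sketch using Hugging Face transformers. The sampling settings, device placement, and the assumption that the repository ships a Llama 3-style chat template are illustrative choices, not settings recommended by the model author; running a 70B checkpoint also requires substantial GPU memory or offloading.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BruhzWater/Sapphira-L3.3-70b-0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [
    {"role": "system", "content": "You are a vivid, coherent storyteller."},
    {"role": "user", "content": "Continue the tale of a lighthouse keeper who hears a knock at midnight."},
]

# Build the prompt from the tokenizer's chat template (assumed to be present).
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.8, top_p=0.95)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```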