KaraKaraWitch/BlenderCartel-llama33-70B-Pt1

KaraKaraWitch/BlenderCartel-llama33-70B-Pt1 is a 70-billion-parameter language model merge built with the SCE merge method on top of deepcogito/cogito-v2-preview-llama-70B. The merge integrates capabilities from multiple Llama 3.1 and Llama 3.3 variants, with an emphasis on prose style, creative writing, roleplaying, and narrative generation. It is intended as a versatile foundation for complex text-generation tasks and supports a 32,768-token context length.

Status: Warm · Visibility: Public · Parameters: 70B · Quantization: FP8 · Context length: 32768 · Source: Hugging Face
