Delta-Vector/Hamanasu-Magnum-QwQ-32B

Delta-Vector/Hamanasu-Magnum-QwQ-32B is a 32.8 billion parameter language model fine-tuned from Delta-Vector/Hamanasu-QwQ-V2-RP. It is optimized to replicate the prose style of Claude models such as Opus and Sonnet, making it well suited to traditional roleplay scenarios. The model was trained for two epochs on 8x H100 GPUs and supports a context length of 131,072 tokens.
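As a sketch of how a roleplay prompt might be assembled for this model: QwQ derives from the Qwen2.5 line, which uses the ChatML turn format, so this model likely expects the same template (an assumption; in practice, `tokenizer.apply_chat_template` from the model's own tokenizer is the authoritative source). The helper name below is hypothetical.

```python
def build_chatml_prompt(messages):
    """Format a list of {"role", "content"} dicts into a ChatML string.

    ChatML wraps each turn in <|im_start|>role ... <|im_end|> markers and
    ends with an open assistant turn for the model to complete.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a roleplay partner in a fantasy tavern."},
    {"role": "user", "content": "Describe the scene as I enter."},
])
print(prompt)
```

With a long 131,072-token window, multi-turn roleplay histories can be appended to `messages` without aggressive truncation.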

- Status: Warm
- Visibility: Public
- Parameters: 32.8B
- Precision: FP8
- Context length: 131,072 tokens
- Source: Hugging Face
