ZeusLabs/Chronos-Platinum-72B

ZeusLabs/Chronos-Platinum-72B is a 72.7-billion-parameter language model based on the Qwen 2.5 architecture, fine-tuned for two epochs on the Chronos Divergence dataset. With a context length of 131,072 tokens, it excels at creative writing tasks such as roleplay and story writing, in addition to general assistant use. The model uses the ChatML instruct template and is tuned for nuanced conversational and narrative generation.

Status: Warm
Visibility: Public
Parameters: 72.7B
Quantization: FP8
Context length: 131072
Source: Hugging Face
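
Because the model follows the ChatML instruct template, prompts can be assembled with the tokenizer's bundled chat template. The sketch below assumes the standard Hugging Face transformers workflow; the message contents and generation settings are illustrative, not recommended values.

```python
# Minimal sketch: loading the model from Hugging Face and formatting a
# ChatML-style prompt with the tokenizer's chat template. Generation
# parameters here are placeholder assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZeusLabs/Chronos-Platinum-72B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Open a scene in a rain-soaked noir city."},
]

# apply_chat_template adds the ChatML role markers (<|im_start|>/<|im_end|>)
# and the generation prompt expected by the model.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```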