Retreatcost/Shisa-K-sakurization

Retreatcost/Shisa-K-sakurization is an experimental 12-billion-parameter language model merge, based on the Shisa-K-12B architecture and enhanced with a LoRA adapter from PocketDoc/Dans-SakuraKaze-V1.0.0-12b. The model is designed to strengthen roleplaying capability, supports a 32768-token context length, and is optimized for generating creative, immersive roleplay scenarios with a focus on character interaction and narrative depth.
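The merge described above folds a LoRA adapter into the base model's weights. As a minimal conceptual sketch of how such a merge works, the snippet below applies a low-rank update to a weight matrix with NumPy; the `merge_lora` helper, shapes, and `alpha`/`rank` values are illustrative assumptions, not taken from this model (in practice a library such as `peft` performs the merge on the real checkpoints).

```python
import numpy as np

def merge_lora(W, A, B, alpha=16, rank=8):
    # LoRA stores a low-rank weight update: delta = (alpha / rank) * B @ A,
    # where A has shape (rank, in_features) and B has shape (out_features, rank).
    # Merging adds this delta permanently into the base weight matrix W.
    scale = alpha / rank
    return W + scale * (B @ A)

# Toy dimensions for illustration only.
rng = np.random.default_rng(0)
out_f, in_f, r = 6, 4, 2
W = rng.standard_normal((out_f, in_f))   # base weight
A = rng.standard_normal((r, in_f))       # LoRA down-projection
B = rng.standard_normal((out_f, r))      # LoRA up-projection

W_merged = merge_lora(W, A, B, alpha=16, rank=r)
print(W_merged.shape)  # (6, 4) — same shape as W, so inference cost is unchanged
```

Because the merged matrix has the same shape as the original, the resulting model runs at the base model's speed with no adapter overhead at inference time.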

Visibility: Public
Parameters: 12B
Precision: FP8
Context length: 32768 tokens
License: apache-2.0