Delta-Vector/Rei-24B-KTO

Delta-Vector/Rei-24B-KTO is a 24-billion-parameter language model with a 32,768-token context length, developed by Delta-Vector. It is fine-tuned for creative writing and roleplay, with the aim of replicating the prose style of Anthropic's Claude models. Training followed a two-step process: Supervised Fine-Tuning (SFT) on the PaintedFantasy dataset, followed by KTO (Kahneman-Tversky Optimization) to improve coherence and instruction following.
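The KTO step mentioned above optimizes a prospect-theoretic objective: each completion is labeled simply "desirable" or "undesirable" (no paired preferences), and the loss pushes the policy's implied reward above or below a reference point accordingly. The sketch below illustrates the per-example loss from the KTO paper; it is an assumption-laden illustration, not the actual training code for this model, and `z0` is fixed at zero here whereas the paper estimates it from the batch.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def kto_loss(policy_logp: float, ref_logp: float, desirable: bool,
             beta: float = 0.1, lambda_d: float = 1.0,
             lambda_u: float = 1.0, z0: float = 0.0) -> float:
    """Per-example KTO loss (illustrative sketch, not this model's code).

    policy_logp / ref_logp: log-probability of the completion under the
    policy and the frozen reference model. z0 is the KL reference point,
    fixed to 0.0 here for simplicity (an assumption).
    """
    # Implied reward: scaled log-ratio between policy and reference.
    reward = beta * (policy_logp - ref_logp)
    if desirable:
        # Value rises (loss falls) as the reward exceeds the reference point.
        value = lambda_d * sigmoid(reward - z0)
        return lambda_d - value
    else:
        # Value rises (loss falls) as the reward drops below the reference point.
        value = lambda_u * sigmoid(z0 - reward)
        return lambda_u - value
```

Raising the policy's log-probability on a desirable completion lowers the loss, while lowering it on an undesirable completion does the same, which is how KTO can sharpen coherence without paired preference data.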

Status: Warm
Visibility: Public
Parameters: 24B
Precision: FP8
Context length: 32768
Source: Hugging Face
