SlerpE/WoonaV1.2-9b
SlerpE/WoonaV1.2-9b is a 9-billion-parameter model based on Gemma 9B IT, trained on augmented synthetic Russian-language data derived from the My Little Pony: Friendship Is Magic fandom wiki. The model specializes in canonical knowledge of the My Little Pony universe in Russian, outperforming larger models such as GPT-4o and Gemini 1.5 Pro on specialized benchmarks for this domain. It is primarily intended for tasks that require a deep understanding of MLP lore, such as assisting fanfiction writers, retrieving canonical information, and serving as a foundation for role-playing scenarios.