aixonlab/Eurydice-24b-v2

Eurydice 24b v2 by Aixon Lab is a 24-billion-parameter causal language model built on Mistral 3.1, with a 32768-token context length. It is designed for multi-role conversations, with strengths in contextual understanding, creativity, natural dialogue, and storytelling, and it handles general natural language processing tasks such as text generation and question answering.

Status: Warm
Visibility: Public
Parameters: 24B
Quantization: FP8
Context length: 32768 tokens
License: apache-2.0
Model page: Hugging Face
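
A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub as aixonlab/Eurydice-24b-v2 and loads through the standard transformers AutoModelForCausalLM / AutoTokenizer classes; the chat template call, sampling settings, and example prompt below are illustrative rather than documented defaults for this model.

```python
# Sketch: loading Eurydice 24b v2 from the Hugging Face Hub and running a
# multi-role (system + user) conversation. Assumes the repo id resolves and
# that the tokenizer ships a chat template; adjust dtype/devices to your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aixonlab/Eurydice-24b-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 24B model generally needs reduced precision to fit in memory
    device_map="auto",           # spread layers across available GPUs/CPU (requires accelerate)
)

# A multi-role conversation, formatted with the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Continue the story of a lighthouse keeper who hears a voice in the fog."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```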
