ReDiX/Artemide-3.5

ReDiX/Artemide-3.5 is a 3.82-billion-parameter instruction-tuned language model developed by ReDiX, built on Microsoft's Phi-3.5-mini-instruct and served with a 4096-token context length. It was fine-tuned on a high-quality Italian and English multi-turn conversation dataset and shows strong results on Italian-language benchmarks. The model targets conversational AI tasks that require proficiency in both Italian and English.

Parameters: 3.82B (4B class)
Tensor type: BF16
Context length: 4096 tokens
License: MIT
Hosted on Hugging Face
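
Below is a minimal usage sketch, assuming the model loads through the standard transformers AutoModelForCausalLM workflow and inherits the chat template of its Phi-3.5-mini-instruct base; the repo id comes from this listing, while the prompt and generation settings are illustrative:

```python
# Minimal sketch: loading ReDiX/Artemide-3.5 with Hugging Face transformers.
# Assumes the standard AutoModelForCausalLM / chat-template workflow applies.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ReDiX/Artemide-3.5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",
)

# Illustrative Italian prompt; the chat template comes from the tokenizer config.
messages = [
    {"role": "user", "content": "Spiegami in breve cos'è un modello linguistico."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same messages list can mix Italian and English turns, since the fine-tuning data covers multi-turn conversations in both languages.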