SeacomSrl/SeaQwen2-0.5B

SeacomSrl/SeaQwen2-0.5B is a 0.5-billion-parameter language model developed by Toti Riccardo, fine-tuned from Qwen2-0.5B. The model is adapted specifically for Italian-language tasks and was trained on the Seacom/rag-data dataset. It targets general language understanding and generation in Italian, making it suitable for applications that need localized linguistic capabilities.
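Since this is a standard `transformers`-compatible causal language model, it can presumably be loaded and queried as below. This is a minimal sketch, assuming the checkpoint is available on the Hugging Face Hub under the `SeacomSrl/SeaQwen2-0.5B` ID; the Italian prompt and the generation parameters are illustrative, not taken from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "SeacomSrl/SeaQwen2-0.5B"  # model ID from this card

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model weights from the Hugging Face Hub."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
    return tokenizer, model

def generate(prompt: str, tokenizer, model, max_new_tokens: int = 64) -> str:
    """Generate a continuation for an Italian-language prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    # Downloading the ~0.5B-parameter weights happens on first use.
    tokenizer, model = load_model()
    print(generate("Qual è la capitale d'Italia?", tokenizer, model))
```

For chat-style use, wrapping the prompt with the tokenizer's chat template (if the fine-tune defines one) would generally give better results than raw completion.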

Parameters: 0.5B
Tensor type: BF16
Context length: 32768
License: apache-2.0