dicta-il/DictaLM-3.0-24B-Thinking
dicta-il/DictaLM-3.0-24B-Thinking is a 24-billion-parameter reasoning chat model developed by Dicta, initialized from Mistral-Small-3.1-24B-Base-2503. The model excels at Hebrew language processing, setting new state-of-the-art performance for its weight class in both the base and chat configurations. Its primary differentiator is a 'thinking block' mechanism: before generating its final output, the model internally formulates a response strategy, which enhances its reasoning capabilities. The model also supports tool calling for integration with external APIs.
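A minimal usage sketch, assuming the model follows the standard Hugging Face transformers chat interface; the model card should be consulted for the authoritative prompt format and thinking-block delimiters:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dicta-il/DictaLM-3.0-24B-Thinking"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 24B parameters is roughly 48 GB in bf16; quantize for smaller GPUs
    device_map="auto",
)

# Hebrew prompt: "Explain briefly why the sky is blue."
messages = [{"role": "user", "content": "הסבר בקצרה מדוע השמיים כחולים."}]

inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=1024)

# The decoded output is expected to contain the model's internal thinking block
# (its response strategy) followed by the final answer.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```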