unsloth/Phi-4-mini-instruct

unsloth/Phi-4-mini-instruct is a 3.8 billion parameter, decoder-only Transformer model developed by Microsoft, with Unsloth's bug fixes applied and optimized for efficient fine-tuning. It features a 131,072-token context length and a 200K-token vocabulary, excels at reasoning tasks (particularly math and logic), and is designed for memory/compute-constrained and latency-bound environments. The model was trained on synthetic and filtered public data with a focus on high-quality, reasoning-dense content, and it supports broad multilingual commercial and research use.

Visibility: Public
Parameters: 3.8B
Precision: BF16
Context length: 131,072 tokens
License: MIT
Source: Hugging Face
