RafikContractzlab/Mike_V1_SFT

RafikContractzlab/Mike_V1_SFT is a 3.8-billion-parameter instruction-tuned language model developed by RafikContractzlab. It is designed for general language understanding and generation, with a 32,768-token context window for processing long inputs. Its primary strength is instruction following across a wide range of applications, making it suitable for diverse conversational and text-based use cases.

Status: Cold
Visibility: Public
Parameters: 3.8B
Precision: BF16
Context length: 32,768 tokens
Hosted on: Hugging Face
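
Below is a minimal usage sketch for loading the model through the Hugging Face transformers library. The repository ID and BF16 precision come from the card above; the chat-template usage, example prompt, and generation settings are assumptions based on the typical interface for instruction-tuned checkpoints, not details confirmed by this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RafikContractzlab/Mike_V1_SFT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Build a single-turn instruction prompt; this assumes the tokenizer ships
# a chat template, which is common for instruction-tuned models but not
# stated on this card.
messages = [{"role": "user", "content": "Summarize the water cycle in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The 32,768-token context allows long inputs; max_new_tokens here is an
# arbitrary value for a quick test.
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```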
