Artples/L-MChat-Small

Artples/L-MChat-Small is a 3-billion-parameter language model created by Artples, built as a SLERP merge of rhysjones/phi-2-orange-v2 and Weyaxi/Einstein-v4-phi2. The model explores how well smaller merged architectures perform, offering a compact option with a 2048-token context length. It is designed for general chat applications and shows competitive results on various benchmarks, including an average score of 63.14 on the Open LLM Leaderboard.
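SLERP (spherical linear interpolation) merging blends two models' weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of each parent's weights better than plain averaging. The sketch below illustrates the core interpolation formula on small NumPy arrays; it is a simplified illustration of the idea, not the actual implementation used to produce this model (merge tools such as mergekit apply this per-tensor with per-layer interpolation factors).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Illustrative sketch of the SLERP merge idea; not the exact code
    used to build L-MChat-Small.
    """
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)
    # Angle between the two flattened weight vectors.
    cos_omega = np.dot(v0_f, v1_f) / (np.linalg.norm(v0_f) * np.linalg.norm(v1_f))
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if omega < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: blend two small "weight matrices" halfway between the parents.
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, a, b)
```

At t=0 the result is the first parent's weights and at t=1 the second's; intermediate values of t trade off between the two along the sphere.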

Status: Warm (Public)
Parameters: 3B
Precision: BF16
Context length: 2048 tokens
License: MIT
Hosted on Hugging Face