nnpy/Nape-0
nnpy/Nape-0 is a 1.1 billion parameter Llama-based causal language model developed by nnpy, currently in an early training phase. Despite its small size, the model aims to exhibit diverse capabilities, with a context length of 2048 tokens. It is designed for general language understanding and generation tasks, serving as a foundation for further development and fine-tuning.
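Since the model is a standard Llama-based causal LM, it can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal example, assuming the checkpoint is published on the Hugging Face Hub under the repository ID `nnpy/Nape-0`; adjust the ID or generation settings as needed.

```python
# Minimal sketch: loading and sampling from Nape-0 with transformers,
# assuming the checkpoint is available on the Hub as "nnpy/Nape-0".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nnpy/Nape-0"  # assumed Hub repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

# Keep prompt + generated tokens within the 2048-token context window.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is in an early training phase, outputs may be rough; it is intended primarily as a base for further fine-tuning rather than direct deployment.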