TinyLlama/TinyLlama-1.1B-Chat-v0.6
TinyLlama/TinyLlama-1.1B-Chat-v0.6 is a 1.1 billion parameter chat model from the TinyLlama project, built on the Llama 2 architecture and tokenizer for broad compatibility with the Llama ecosystem. It was fine-tuned following the Zephyr training recipe: first supervised fine-tuning on a variant of the UltraChat dataset, then further alignment with DPO on the UltraFeedback dataset. Its compact size makes it suitable for chat applications with tight compute and memory budgets.
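Models trained with the Zephyr recipe expect conversations rendered in the Zephyr-style chat format (`<|system|>`, `<|user|>`, `<|assistant|>` role markers terminated by `</s>`). The sketch below shows what that prompt layout looks like; it is an illustrative approximation, since the authoritative template ships with the model's tokenizer and should be applied via `tokenizer.apply_chat_template` in practice.

```python
def build_prompt(messages):
    """Render a chat as a Zephyr-style prompt string.

    Illustrative approximation of the TinyLlama chat format; the
    exact template is defined by the model's tokenizer config.
    """
    parts = []
    for msg in messages:
        # Each turn: a role marker, the content, and an end-of-sequence token.
        parts.append(f"<|{msg['role']}|>\n{msg['content']}</s>\n")
    # Trailing assistant marker cues the model to generate a reply.
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_prompt([
    {"role": "system", "content": "You are a friendly chatbot."},
    {"role": "user", "content": "What is the capital of France?"},
])
print(prompt)
```

For real inference, load the model with `AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v0.6")` and `AutoModelForCausalLM`, and let `apply_chat_template` produce the prompt rather than hand-rolling it.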