alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2
alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2 is a 1.1-billion-parameter language model fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0 on the generator dataset, with the aim of strengthening its reasoning capabilities. It operates with a 2048-token context length, making it suited to focused, small-scale reasoning tasks within a compact model footprint.
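Because the model is fine-tuned from TinyLlama-1.1B-Chat-v1.0, prompts should follow the base model's Zephyr-style chat template, in which each turn is tagged with its role and terminated by `</s>`. A minimal sketch of that formatting is below (an assumption based on the base model's template; in practice `tokenizer.apply_chat_template` from `transformers` handles this for you, and the example question is purely illustrative):

```python
def format_chat_prompt(messages):
    """Format chat messages in the Zephyr-style template used by
    TinyLlama-1.1B-Chat-v1.0: each turn is wrapped in a <|role|>
    marker and terminated by </s>."""
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}</s>\n")
    # Leave the assistant tag open so the model generates from here.
    parts.append("<|assistant|>\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a careful step-by-step reasoner."},
    {"role": "user", "content": "If a train travels 60 km in 45 minutes, what is its speed in km/h?"},
]
prompt = format_chat_prompt(messages)
print(prompt)
```

The resulting string can be passed directly to a `transformers` text-generation pipeline loaded with this model; keeping the total prompt under the 2048-token context limit is the caller's responsibility.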