alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2

alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2 is a 1.1-billion-parameter language model fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0 on a generator dataset, with the aim of enhancing its reasoning capabilities. With a 2048-token context length, it is suited to focused, small-scale reasoning tasks within a compact model footprint.
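A minimal sketch of how a prompt for this model might be assembled, assuming it inherits the Zephyr-style chat template of its base model TinyLlama/TinyLlama-1.1B-Chat-v1.0 (`<|role|>` markers with `</s>` turn separators); the commented `transformers` calls show one way generation could be run once the weights are downloaded:

```python
# Build a chat prompt in the Zephyr-style template assumed to be used by
# TinyLlama-1.1B-Chat-v1.0 and its fine-tunes (an assumption, not confirmed
# by this card): each turn is "<|role|>\n{content}</s>\n", and the prompt
# ends with "<|assistant|>\n" to cue the model's reply.

def build_prompt(messages):
    """Format a list of {"role": ..., "content": ...} dicts into a prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|{m['role']}|>\n{m['content']}</s>\n")
    parts.append("<|assistant|>\n")  # generation cue for the assistant turn
    return "".join(parts)

if __name__ == "__main__":
    messages = [
        {"role": "system", "content": "You reason step by step."},
        {"role": "user", "content": "If a train covers 60 km in 1.5 hours, what is its average speed?"},
    ]
    prompt = build_prompt(messages)
    print(prompt)

    # With transformers installed and the weights available, generation could
    # look like this (sketch; keep prompt + output within the 2048-token context):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # repo = "alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2"
    # tok = AutoTokenizer.from_pretrained(repo)
    # model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype="bfloat16")
    # ids = tok(prompt, return_tensors="pt").input_ids
    # out = model.generate(ids, max_new_tokens=256)
    # print(tok.decode(out[0], skip_special_tokens=True))
```

In practice, `tokenizer.apply_chat_template` is the more robust route, since it reads the template stored with the tokenizer rather than hard-coding it.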

Parameters: 1.1B
Precision: BF16
Context length: 2048 tokens
License: apache-2.0