Chickaboo/ChickaQ

ChickaQ is a 0.6-billion-parameter language model, merged from Qwen/Qwen1.5-0.5B-Chat and vilm/Quyen-SE-v0.1 using the TIES method. It is designed for general language tasks, leveraging its compact size for efficient deployment while retaining a substantial 32,768-token context length, and offers a balanced performance profile for applications that need a smaller yet capable language model.
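Since the description names the TIES merge method, a minimal merge config sketch in the style of mergekit (a common tool for TIES merges) may help illustrate how such a merge is specified. The density/weight values, the choice of base model, and the use of mergekit itself are assumptions for illustration, not details taken from the actual merge:

```yaml
# Hypothetical mergekit-style TIES merge config (values are illustrative)
models:
  - model: Qwen/Qwen1.5-0.5B-Chat
    parameters:
      density: 0.5   # assumed fraction of parameters kept per model
      weight: 0.5    # assumed merge weight
  - model: vilm/Quyen-SE-v0.1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Qwen/Qwen1.5-0.5B-Chat  # assumed base; not confirmed by the card
dtype: bfloat16                      # matches the BF16 precision listed below
```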

Parameters: 0.6B
Precision: BF16
Context length: 32768 tokens
License: MIT