Chickaboo/ChickaQ-Large

ChickaQ-Large is a 1.8-billion-parameter language model in the ChickaQ family, created by Chickaboo. It is a merge of pre-trained language models: Qwen/Qwen1.5-1.8B-Chat combined with vilm/Quyen-Mini-v0.1 as the base, using the TIES merge method. The model supports a 32768-token context length, making it suitable for applications that require processing longer sequences.
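For context, the TIES merge method named above works in three steps per parameter tensor: trim each fine-tuned model's task vector (its delta from the base) to the largest-magnitude entries, elect a sign per entry, and average only the values that agree with the elected sign. Below is a minimal illustrative sketch of that idea on plain Python lists; it is not the actual merge pipeline (which would typically run over full model weights, e.g. with a tool like mergekit), and the function name and `density` parameter are chosen here for illustration.

```python
def ties_merge(base, finetuned, density=0.5):
    """Illustrative TIES merge of several parameter vectors into `base`.

    base      -- list of floats (base model parameters)
    finetuned -- list of fine-tuned parameter lists, same length as base
    density   -- fraction of largest-magnitude delta entries kept per model
    """
    n = len(base)
    k = max(1, int(density * n))
    trimmed = []
    for ft in finetuned:
        delta = [f - b for f, b in zip(ft, base)]
        # Trim: keep only the k largest-magnitude entries of the task vector
        top = set(sorted(range(n), key=lambda i: abs(delta[i]), reverse=True)[:k])
        trimmed.append([delta[i] if i in top else 0.0 for i in range(n)])
    merged = list(base)
    for i in range(n):
        col = [t[i] for t in trimmed]
        # Elect sign: sign of the summed (magnitude-weighted) deltas
        s = 1.0 if sum(col) >= 0 else -1.0
        # Disjoint merge: average only entries agreeing with the elected sign
        agree = [v for v in col if v != 0 and (v > 0) == (s > 0)]
        if agree:
            merged[i] += sum(agree) / len(agree)
    return merged
```

For example, merging two toy task vectors with `density=0.5` keeps each model's two strongest deltas, resolves the sign conflict at each position, and averages the surviving values.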

Parameters: 1.8B
Tensor type: BF16
Context length: 32768