duohuang/test

duohuang/test is a 3.2-billion-parameter language model with a 32,768-token context length. Developed by duohuang, it is built on a standard transformer architecture. Its specific capabilities and differentiators are not documented, suggesting it may be a general-purpose or experimental model.

Status: Warm
Visibility: Public
Parameters: 3.2B
Weight precision: BF16
Context length: 32768 tokens
Hosted on: Hugging Face
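One practical figure can be derived from the card's parameter count and precision: BF16 stores each parameter in 2 bytes, so the weights alone occupy roughly 6.4 GB before accounting for activations, KV cache, or optimizer state. A minimal sketch of that arithmetic (the function name is illustrative, not part of any library):

```python
def bf16_weight_bytes(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights; BF16 uses 2 bytes per parameter."""
    return n_params * bytes_per_param

params = 3.2e9  # 3.2B parameters, as listed on the card
gb = bf16_weight_bytes(params) / 1e9  # bytes -> gigabytes (decimal GB)
print(f"~{gb:.1f} GB for weights in BF16")  # prints "~6.4 GB for weights in BF16"
```

This is only the storage for weights; serving the full 32,768-token context adds KV-cache memory on top of this figure.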