hitonet/hito-1.7b

Hito 1.7B by Hitonet is a 1.7-billion-parameter language model with a 32K context window, trained specifically to resist cognitive biases and perform structured thinking. It excels at reasoning and mathematical tasks, with reported 100% accuracy on counting, math, and reasoning benchmarks, and outperforms larger models on certain cognitive tests. The model is designed for applications that require reliable, bias-resistant logical processing.

Status: Cold
Visibility: Public
Parameters: 2B
Precision: BF16
Context length: 40960
License: apache-2.0
Source: Hugging Face