M4-ai/tau-1.8B

M4-ai/tau-1.8B is a 1.8-billion-parameter language model based on Qwen1.5-1.8B and further pre-trained on the UltraTextbooks-2.0 dataset. The model is optimized for machine learning, mathematics, and coding tasks: educational question answering, text summarization, educational content generation, code understanding, and mathematical problem solving. With a 32,768-token context length, it is well suited to applications in educational technology and research.

Status: Warm
Visibility: Public
Parameters: 1.8B
Precision: BF16
Context length: 32,768 tokens
License: tongyi-qianwen-research
Source: Hugging Face
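
Below is a minimal sketch of loading the model for text generation with Hugging Face Transformers. The prompt, sampling parameters, and device settings are illustrative assumptions, not values recommended by the model authors.

```python
# Minimal sketch: load M4-ai/tau-1.8B and generate a completion.
# Sampling settings and the example prompt are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "M4-ai/tau-1.8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",           # place weights on GPU if one is available
)

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a completion; adjust max_new_tokens and temperature to taste.
output_ids = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens, skipping the prompt.
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

For long documents, keep the combined prompt and completion within the 32,768-token context window noted above.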
