PrimeIntellect/Qwen3-1.7B-Wordle-SFT

PrimeIntellect/Qwen3-1.7B-Wordle-SFT is a 1.7-billion-parameter language model, fine-tuned from PrimeIntellect/Qwen3-1.7B and optimized specifically for playing the game Wordle. With a context length of 40,960 tokens, the model targets tasks that require strategic word generation and pattern recognition under game-specific constraints. Its primary purpose is to demonstrate supervised fine-tuning for game-playing AI, particularly Wordle.
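Below is a minimal usage sketch, assuming the model exposes the standard transformers causal-LM and chat-template interface inherited from its Qwen3 base. The Wordle game-state prompt is purely illustrative; the exact prompt format used during fine-tuning is not documented here.

```python
# Minimal usage sketch (assumes the standard transformers causal-LM
# interface; the Wordle prompt below is illustrative, not the exact
# format used during fine-tuning).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PrimeIntellect/Qwen3-1.7B-Wordle-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

# Hypothetical game state: one previous guess with per-letter feedback.
messages = [
    {
        "role": "user",
        "content": (
            "You are playing Wordle. Previous guess: CRANE -> "
            "C (absent), R (present), A (correct), N (absent), E (absent). "
            "Suggest the next 5-letter guess."
        ),
    },
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```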

Parameters: 2B
Precision: BF16
Context length: 40,960 tokens
License: apache-2.0