dphn/dolphin-2.9.2-qwen2-7b
Dolphin 2.9.2 Qwen2 7B is a 7.6-billion-parameter language model developed by Eric Hartford, Lucas Atkins, Fernando Fernandes, and Cognitive Computations, built on the Qwen2-7B base model. It supports a 128k context length and was fine-tuned with a 16k sequence length. The model targets instruction-following, conversational, and coding tasks, and includes initial agentic abilities and function calling. It is uncensored and highly compliant with requests, so an external alignment layer (for example, a system prompt or moderation filter) is recommended before exposing it as a service.
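A minimal sketch of how such an alignment layer might look when calling the model through an OpenAI-compatible chat-completions endpoint: the system message carries the behavioral constraints that the uncensored model does not enforce on its own. The payload structure follows the common chat-completions convention; the exact fields accepted by any particular host are an assumption.

```python
# Sketch: building a chat-completions payload for dphn/dolphin-2.9.2-qwen2-7b.
# The system prompt here stands in for the external alignment layer the model
# card recommends; the field names assume an OpenAI-compatible API.
import json


def build_request(user_message: str) -> dict:
    """Return a chat request with a safety-oriented system prompt."""
    return {
        "model": "dphn/dolphin-2.9.2-qwen2-7b",
        "messages": [
            {
                "role": "system",
                # Alignment layer: the model itself is highly compliant,
                # so refusal behavior must be supplied by the caller.
                "content": "You are a helpful assistant. Decline requests "
                           "for harmful or illegal content.",
            },
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 512,
    }


payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the serving endpoint; how the host exposes function calling (e.g. a `tools` field) varies by provider.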