Ppoyaa/MythoNemo-L3.1-70B-v1.0
Ppoyaa/MythoNemo-L3.1-70B-v1.0 is a 70-billion-parameter language model fine-tuned from nvidia/Llama-3.1-Nemotron-70B-Instruct, with a 32,768-token context length. The model is optimized for roleplaying and story writing, and it retains strong general intelligence, instruction following, and reasoning while excelling at creative text generation.
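The card does not include a usage snippet; below is a minimal inference sketch, assuming standard Hugging Face transformers support for Llama-3.1-based checkpoints. The chat messages, sampling settings, and dtype/device choices are illustrative assumptions, not settings from the model card.

```python
# Minimal sketch: load the checkpoint and generate a creative-writing reply.
# Assumes a standard transformers + torch environment; a 70B model typically
# needs multiple GPUs or a quantized setup to fit in memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ppoyaa/MythoNemo-L3.1-70B-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; adjust to your hardware
    device_map="auto",           # spread the weights across available devices
)

# Illustrative roleplay/story prompt.
messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Write the opening scene of a fantasy adventure."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```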