Ppoyaa/MythoNemo-L3.1-70B-v1.0

Ppoyaa/MythoNemo-L3.1-70B-v1.0 is a 70-billion-parameter language model fine-tuned from nvidia/Llama-3.1-Nemotron-70B-Instruct, with a 32,768-token context length. It is optimized for roleplaying and story writing, and retains strong instruction-following and reasoning ability while excelling at creative text generation.

Parameters: 70B
Precision: FP8
Context length: 32,768 tokens
License: llama3.1