migtissera/Tess-2.0-Llama-3-70B-v0.2
Tess-2.0-Llama-3-70B-v0.2 by migtissera is a 70-billion-parameter general-purpose large language model, fine-tuned from Meta's Llama-3-70B base model with an 8192-token context length. This iteration, v0.2, underwent an additional uncensoring step, making it more compliant with user instructions. It is optimized for general tasks and code generation, and was trained on a high-quality, uncensored dataset curated following LIMA principles.
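
The snippet below is a minimal sketch of loading and querying the model with the Hugging Face transformers library. The dtype, device placement, sampling parameters, and plain-text prompt are illustrative assumptions and are not taken from the model card; a 70B model generally requires multiple GPUs or quantization to run.

```python
# Minimal sketch: load the model and generate a completion with transformers.
# Assumptions: bfloat16 weights, automatic device placement via accelerate,
# and a plain-text placeholder prompt (the model's expected prompt format is
# not specified here).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "migtissera/Tess-2.0-Llama-3-70B-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype
    device_map="auto",           # spreads the 70B weights across available GPUs
)

prompt = "Write a Python function that reverses a string."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```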