migtissera/Tess-2.0-Llama-3-8B
Tess-2.0-Llama-3-8B is an 8-billion-parameter general-purpose large language model developed by migtissera, fine-tuned from the Meta-Llama-3-8B base model. It was trained on an uncensored, high-quality dataset of approximately 100K code and general-purpose samples. Because the training data contains few refusals, the model tends to follow instructions consistently, making it suitable for a wide range of conversational and generative AI tasks.
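The sketch below shows one way to run the model for a conversational task with the Hugging Face transformers library. It is a minimal example, not an official recipe: the plain "SYSTEM/USER/ASSISTANT" prompt string follows the convention used by earlier Tess releases and is an assumption here; if the repository ships a chat template, tokenizer.apply_chat_template is the safer choice. The dtype and device settings are likewise illustrative.

```python
# Minimal inference sketch (assumptions: GPU available, bf16 weights,
# and that the prompt format below matches what the model was trained on).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "migtissera/Tess-2.0-Llama-3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits on a single ~24 GB GPU
    device_map="auto",
)

# Assumed Tess-style prompt format; verify against the repository's chat template.
prompt = (
    "SYSTEM: You are a helpful assistant.\n"
    "USER: Write a Python function that reverses a string.\n"
    "ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Sampling parameters such as temperature and top_p are reasonable defaults for general chat use and can be tuned per task.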