ModelsLab/Llama-3.1-8b-Uncensored-Dare

ModelsLab/Llama-3.1-8b-Uncensored-Dare is an 8-billion-parameter language model from ModelsLab, created by merging several Llama-3.1-8B-Instruct-Uncensored and Llama-3-8B-Lexi-Uncensored variants with the DARE TIES merge method. The model targets uncensored instruction-following tasks and aims to combine the strengths of its constituent models. It supports a 32,768-token context length, making it suitable for applications that need extensive conversational memory or processing of longer texts.
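DARE TIES works in two stages: DARE (drop-and-rescale) randomly drops a fraction of each fine-tuned model's parameter deltas and rescales the survivors, and TIES then resolves sign conflicts between models before averaging. The following is a minimal NumPy sketch of that idea on flat parameter vectors; the function name, drop probability, and averaging details are illustrative assumptions, not ModelsLab's actual merge recipe.

```python
import numpy as np

def dare_ties_merge(base, finetuned, drop_p=0.5, seed=0):
    """Toy DARE-TIES merge over flat parameter vectors (illustrative only)."""
    rng = np.random.default_rng(seed)
    deltas = []
    for ft in finetuned:
        delta = ft - base
        # DARE: randomly drop deltas with probability drop_p,
        # rescale survivors by 1/(1 - drop_p) to preserve expectation
        mask = rng.random(delta.shape) >= drop_p
        deltas.append(delta * mask / (1.0 - drop_p))
    deltas = np.stack(deltas)
    # TIES: elect a per-parameter sign by summed magnitude,
    # keep only deltas that agree with the elected sign
    sign = np.sign(np.sum(deltas, axis=0))
    agree = np.sign(deltas) == sign
    kept = np.where(agree, deltas, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = kept.sum(axis=0) / counts
    return base + merged_delta
```

In practice merges like this one are produced with a merge toolkit operating on full model checkpoints rather than toy vectors; the sketch only shows the per-parameter arithmetic.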

Status: Warm
Visibility: Public
Parameters: 8B
Quantization: FP8
Context length: 32768 tokens
License: apache-2.0
Available on: Hugging Face
