Nexesenex/Llama_3.x_70b_Doberman_V1
Nexesenex/Llama_3.x_70b_Doberman_V1 is a 70-billion-parameter language model created by Nexesenex, merged with the Model Stock method using SentientAGI/Dobby-Unhinged-Llama-3.3-70B as the base model. The merge folds in NousResearch/Hermes-3-Llama-3.1-70B and Nexesenex/Llama_3.x_70b_Smarteaz_V1, and the resulting model supports a 32,768-token context length. It is intended for general-purpose language tasks, combining the strengths of its constituent models.
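
Because the model follows the standard Llama 3.x chat format, it can be loaded and prompted with Hugging Face transformers in the usual way. The snippet below is a minimal sketch rather than an official example: the bfloat16 dtype and device_map="auto" placement are assumptions, and running a 70B model in practice requires multiple GPUs or quantized weights.

```python
# Minimal usage sketch (assumed settings, not from the model card):
# load Nexesenex/Llama_3.x_70b_Doberman_V1 and generate a short reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nexesenex/Llama_3.x_70b_Doberman_V1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; adjust or quantize to fit available hardware
    device_map="auto",           # shard the 70B weights across available devices
)

# Build a chat prompt using the model's built-in Llama 3.x chat template.
messages = [{"role": "user", "content": "Summarize the Model Stock merge method in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```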