KaraKaraWitch/Llama-MiraiFanfare-2-3.3-70B

KaraKaraWitch/Llama-MiraiFanfare-2-3.3-70B is a 70-billion-parameter language model based on the Llama architecture, created by merging EVA-LLaMA-3.33-70B-v0.1 and Mirai-3.0-70B using the TIES merge method. TIES resolves parameter conflicts between the merged models by trimming low-magnitude weight deltas and electing a dominant sign per parameter, so the result retains behavior from both parents. With a 32,768-token context length, the model is suited to tasks that require long-range contextual understanding, such as extended roleplay or long-document processing.
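The actual merge recipe for this model is not published here, so the following mergekit configuration is only an illustrative sketch of a TIES merge between the two named parents; the repository paths, base model, weights, and densities are all assumptions, not the author's settings.

```yaml
# Hypothetical mergekit config for a TIES merge of the two parent models.
# Repo paths, base_model, weight, and density values are illustrative guesses.
models:
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
    parameters:
      weight: 0.5    # relative contribution of this parent
      density: 0.5   # fraction of weight deltas kept after trimming
  - model: KaraKaraWitch/Llama-Mirai-3.0-70B
    parameters:
      weight: 0.5
      density: 0.5
merge_method: ties
base_model: meta-llama/Llama-3.3-70B-Instruct  # assumed common ancestor
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-dir`; the `density` parameter controls how aggressively each parent's task vector is sparsified before sign election.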

- Visibility: Public
- Parameters: 70B
- Weight precision: FP8
- Context length: 32,768 tokens
- Hosted on: Hugging Face