Entropicengine/Luminatium-L3-8b

Entropicengine/Luminatium-L3-8b is an 8-billion-parameter language model created by Entropicengine, built by merging Sao10K/L3-8B-Stheno-v3.2 and Sao10K/L3-8B-Lunaris-v1 using the SLERP (spherical linear interpolation) method. The merge is intended to combine the strengths of its two base models, offering balanced performance across a range of tasks. It supports a context length of 8192 tokens, making it suitable for applications requiring moderate amounts of context.
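For context on the merge method: SLERP blends two weight tensors along the arc between them on a hypersphere rather than along the straight line used by plain weight averaging, which better preserves each model's weight geometry. The following is a minimal sketch of the core operation only, not Entropicengine's actual merge pipeline (real merges are typically run with a tool such as mergekit, applying per-layer interpolation factors); the function name and the interpolation factor `t` are illustrative.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the arc
    on the hypersphere instead of the chord used by linear averaging.
    """
    v0_flat = v0.flatten().float()
    v1_flat = v1.flatten().float()
    # Angle between the two weight vectors.
    cos_omega = torch.dot(v0_flat, v1_flat) / (v0_flat.norm() * v1_flat.norm() + eps)
    omega = torch.arccos(cos_omega.clamp(-1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    scale0 = torch.sin((1 - t) * omega) / sin_omega
    scale1 = torch.sin(t * omega) / sin_omega
    merged = scale0 * v0_flat + scale1 * v1_flat
    return merged.reshape(v0.shape).to(v0.dtype)
```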

Parameters: 8B
Quantization: FP8
Context length: 8192 tokens
License: llama3
Availability: Public, hosted on Hugging Face
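A minimal inference sketch using the transformers library, assuming the model is published on the Hugging Face Hub under the repository ID in the title and that its tokenizer ships a chat template, as Llama 3 derivatives typically do; the prompt and sampling settings are illustrative, not recommended defaults.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Entropicengine/Luminatium-L3-8b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Write a short scene set in a lighthouse."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```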