DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-MADNESS
DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-MADNESS is a 12-billion-parameter language model with a 32,768-token context length, released in full-precision safetensors format so that quantized versions can be generated from it. It is a 'Class 1' model, meaning that specific parameter and sampler settings play a critical role in how well it performs, both for its intended use cases and for uses beyond its original design. Its primary purpose is to serve as a flexible base that can be tuned extensively through the parameter and sampler settings of the AI/LLM application running it.
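
As a rough sketch of what loading the full-precision weights and applying explicit sampler settings might look like with Hugging Face transformers (the sampler values below are illustrative placeholders, not the card's recommended settings, check the model card's parameter guidance before serious use):

```python
# Minimal sketch: load the full-precision safetensors checkpoint and generate
# with explicit sampler settings. Values here are placeholders, not the
# card's recommended configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-MADNESS"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # cast down from full precision to fit typical GPU memory
    device_map="auto",
)

prompt = "Write the opening paragraph of a gothic short story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampler settings matter for a 'Class 1' model; tune these per the model card.
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    repetition_penalty=1.05,
)

# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```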