nbeerbower/mistral-nemo-gutenberg-12B-v4

nbeerbower/mistral-nemo-gutenberg-12B-v4 is a 12-billion-parameter language model fine-tuned by nbeerbower from TheDrummer/Rocinante-12B-v1 on the jondurbin/gutenberg-dpo-v0.1 dataset. It targets text generation and understanding with a 32,768-token context length and is suited to tasks requiring nuanced language processing.
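A minimal sketch of how one might guard a prompt against the 32,768-token context window before generation. The whitespace "tokenizer" and the `max_new_tokens` budget here are illustrative placeholders, not part of the model's actual tokenization.

```python
CONTEXT_LENGTH = 32768  # context window stated for this model

def fits_in_context(token_ids, max_new_tokens=512, context_length=CONTEXT_LENGTH):
    # Prompt tokens plus the generation budget must fit in the window.
    return len(token_ids) + max_new_tokens <= context_length

# Stub tokenization (whitespace split) for illustration only; a real
# check would use the model's own tokenizer.
prompt_tokens = "Write a short story about the sea.".split()
print(fits_in_context(prompt_tokens))
```

In practice the same check would be run on the IDs returned by the model's tokenizer, reserving whatever `max_new_tokens` value is passed to generation.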

Parameters: 12B
Tensor type: FP8
Context length: 32,768 tokens
License: apache-2.0