Fizzarolli/L3.1-70b-glitz-v0.2
Fizzarolli/L3.1-70b-glitz-v0.2 is an experimental 70-billion-parameter Llama 3.1 model, fine-tuned on a mix of publicly available Claude synthetic data and SystemChat. Although the training run was not completed, the model may still be of interest to developers exploring large language models. It targets general language tasks, drawing on its large parameter count and 32,768-token context length for long-context understanding and generation.