qingy2024/Lorenzo-8B-Merge

Lorenzo-8B-Merge is an 8-billion-parameter language model published by qingy2024. It is a merge of existing pre-trained models, combined so that the resulting checkpoint can draw on the strengths of each constituent. With a context length of 32,768 tokens, it is intended for general language understanding and generation tasks, including longer inputs.
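
As a sketch of how a merged checkpoint like this is typically used, the following loads the model through the Hugging Face `transformers` library and runs a short generation. This assumes the repository id `qingy2024/Lorenzo-8B-Merge` resolves on the Hugging Face Hub to a standard causal language model; the prompt and generation settings are purely illustrative.

```python
# Minimal usage sketch: assumes the repo is available on the Hugging Face Hub
# and ships a standard causal-LM checkpoint (not verified here).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "qingy2024/Lorenzo-8B-Merge"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # requires `accelerate`; spreads layers across devices
)

prompt = "Summarize the idea of model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```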

Status: Warm
Visibility: Public
Parameters: 8B
Quantization: FP8
Context length: 32,768 tokens
Source: Hugging Face
