amazon/MistralLite

amazon/MistralLite is a 7-billion-parameter language model fine-tuned from Mistral-7B-v0.1 by AWS Contributors. It significantly enhances long-context processing, supporting up to 32K tokens through an adapted rotary position embedding and a larger sliding-window attention. The model excels at long-context retrieval and question answering, making it well suited to summarization and QA over lengthy documents.
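As a sketch of typical usage, MistralLite's model card suggests wrapping the input in a `<|prompter|> ... </s><|assistant|>` prompt template before generation. The helper below builds that template; the loading and generation calls use the standard Hugging Face transformers API but are commented out, since they download the full 7B checkpoint (roughly 14 GB) and generally need a GPU.

```python
def format_mistrallite_prompt(question: str) -> str:
    """Wrap a question in MistralLite's suggested prompt template
    (per the model card on Hugging Face)."""
    return f"<|prompter|>{question}</s><|assistant|>"

# Generation with Hugging Face transformers (commented out: the
# checkpoint download is large and inference wants a GPU):
#
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("amazon/MistralLite")
# model = AutoModelForCausalLM.from_pretrained("amazon/MistralLite",
#                                              device_map="auto")
# inputs = tok(format_mistrallite_prompt("Summarize the document above."),
#              return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))

print(format_mistrallite_prompt("What is MistralLite?"))
```

For long-context tasks, the document to summarize or query is placed inside the `<|prompter|>` section of the same template.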

Public
Parameters: 7B
Quantization: FP8
Context length (deployment): 4096
License: apache-2.0
Hugging Face
