PatronusAI/glider

PatronusAI/glider is a 4-billion-parameter language model developed by Patronus AI and fine-tuned from Microsoft's Phi-3.5-mini-instruct. It is designed as a general-purpose evaluator: it judges texts, conversations, and RAG setups against user-defined pass criteria and scoring rubrics. The model was trained on a diverse dataset spanning 183 metrics and 685 domains, including finance and medicine. It supports a maximum sequence length of 8192 tokens, with tested support up to 12,000 tokens. Its primary strength is producing detailed, explainable evaluations of AI outputs.
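In practice, the user-defined criteria and rubric are assembled into a single judge prompt that is sent to the model. A minimal sketch of that assembly step is below; the section tags and field names are illustrative assumptions, not the official prompt template GLIDER was trained with (consult the model card for that):

```python
def build_judge_prompt(task_input: str, model_output: str,
                       pass_criteria: str, rubric: str) -> str:
    """Assemble an evaluation prompt from user-defined criteria and a rubric.

    NOTE: the section tags below are hypothetical, for illustration only;
    the model's actual training template may differ.
    """
    return (
        "Analyze the response against the pass criteria and rubric, "
        "then give a score with an explanation.\n\n"
        f"<INPUT>\n{task_input}\n</INPUT>\n\n"
        f"<OUTPUT>\n{model_output}\n</OUTPUT>\n\n"
        f"<PASS_CRITERIA>\n{pass_criteria}\n</PASS_CRITERIA>\n\n"
        f"<RUBRIC>\n{rubric}\n</RUBRIC>"
    )

prompt = build_judge_prompt(
    task_input="What is the capital of France?",
    model_output="The capital of France is Paris.",
    pass_criteria="The answer names the correct capital city.",
    rubric="1: incorrect or missing answer. 5: correct, concise answer.",
)
print(prompt)
```

The resulting string would then be passed to the model (e.g. via the `transformers` chat interface), which returns a score plus a written explanation.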

Parameters: 4B
Precision: BF16
License: cc-by-nc-4.0