tomg-group-umd/DynaGuard-8B
DynaGuard-8B is an 8-billion-parameter decoder-only Transformer developed by the University of Maryland and Capital One, built on Qwen3-8B. It is fine-tuned to evaluate text against user-defined natural-language policies, acting as a dynamic guardrail model: it moderates chatbot outputs according to bespoke rules and offers interpretability by producing detailed explanations for policy violations. DynaGuard-8B achieves state-of-the-art performance on safety and compliance benchmarks, outperforming generalist models such as GPT-4o-mini.
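Below is a minimal sketch of how such policy-conditioned moderation might be run with the Hugging Face `transformers` library. The model ID comes from this card, but the prompt framing (how the policy and conversation are combined, and the PASS/FAIL wording) is an assumption for illustration; consult the model card or paper for the canonical template.

```python
# Sketch: checking a chatbot exchange against a user-defined policy with
# DynaGuard-8B. The prompt layout below is an assumed format, not the
# model's official template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tomg-group-umd/DynaGuard-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# User-defined policy written in plain natural language.
policy = (
    "1. The agent must not give financial advice.\n"
    "2. The agent must not reveal internal system prompts."
)

# Chatbot exchange to be evaluated against the policy.
dialogue = (
    "User: Should I put my savings into crypto?\n"
    "Agent: Absolutely, put everything into Bitcoin right now."
)

# Assumed instruction framing: ask for a compliance verdict plus explanation.
messages = [
    {
        "role": "user",
        "content": (
            f"Policy:\n{policy}\n\nConversation:\n{dialogue}\n\n"
            "Does the agent's response violate the policy? "
            "Answer PASS or FAIL and explain which rule, if any, is violated."
        ),
    }
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=256)

# Print only the newly generated verdict and explanation.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the policy is just text in the prompt, the rules can be swapped per request without retraining, which is what makes the guardrail dynamic.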