google/datagemma-rag-27b-it
The google/datagemma-rag-27b-it model, developed by Google, is a 27-billion-parameter fine-tune of Gemma 2 with a 32,768-token context length. It is designed specifically for Retrieval Augmented Generation (RAG) workflows that let LLMs access and integrate public statistical data from Data Commons. Given a user's statistical question, the model generates natural-language queries that Data Commons' query interface can understand, so that retrieved data can be fed back to an LLM to answer the question effectively.
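A minimal sketch of how such a RAG workflow might be driven with Hugging Face `transformers`. The `build_rag_prompt` helper and its instruction template are illustrative assumptions, not the official DataGemma prompt format; the generation call is shown commented out because loading a 27B model requires substantial hardware.

```python
# Sketch of the first RAG step: turning a user's statistical question into
# a prompt asking the model for Data Commons queries.
# NOTE: build_rag_prompt and its template are hypothetical, for illustration only.

def build_rag_prompt(question: str) -> str:
    """Wrap a statistical question in an instruction prompt (assumed template)."""
    return (
        "Rewrite the user's statistical question as one or more "
        "natural-language queries answerable by Data Commons.\n"
        f"Question: {question}\n"
        "Queries:"
    )

# Generation with transformers (requires GPU memory for a 27B model):
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("google/datagemma-rag-27b-it")
# model = AutoModelForCausalLM.from_pretrained(
#     "google/datagemma-rag-27b-it", device_map="auto"
# )
# inputs = tokenizer(
#     build_rag_prompt("How has life expectancy changed in California?"),
#     return_tensors="pt",
# ).to(model.device)
# output = model.generate(**inputs, max_new_tokens=128)
# print(tokenizer.decode(output[0], skip_special_tokens=True))

prompt = build_rag_prompt("How has life expectancy changed in California?")
print(prompt)
```

The queries the model emits would then be sent to Data Commons, and the retrieved statistics interleaved into a downstream LLM's context before answering.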