google/gemma-3-1b-it-qat-int4-unquantized
Gemma 3 1B IT QAT INT4 Unquantized is a 1 billion parameter instruction-tuned language model developed by Google DeepMind, part of the Gemma 3 family. The checkpoint was trained with Quantization Aware Training (QAT), which prepares the weights for int4 quantization with minimal quality loss relative to the bfloat16 baseline while substantially reducing memory requirements; the weights in this repository are provided unquantized, intended to be quantized to int4 by downstream tooling. The 1B variant is text-only, supports a 32K token context window, and is designed for text generation tasks such as question answering, summarization, and reasoning, making it suitable for resource-constrained environments.
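
As a rough sketch, the unquantized QAT checkpoint can be loaded like any other Gemma 3 instruction-tuned model. The snippet below assumes the Hugging Face transformers library with Gemma 3 support, access to the gated repository, and bfloat16 weights; it is illustrative rather than an official usage example.

```python
# Minimal text-generation sketch for the unquantized QAT checkpoint
# (assumes transformers with Gemma 3 support and accepted model license).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-3-1b-it-qat-int4-unquantized"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # QAT weights ship unquantized in bfloat16
    device_map="auto",
)

# Instruction-tuned checkpoints expect the Gemma chat template.
messages = [
    {"role": "user", "content": "Summarize why QAT helps low-memory deployment."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

To realize the memory savings this checkpoint is prepared for, the bfloat16 weights would typically be quantized to int4 by downstream tooling (for example, a quantization-aware runtime or conversion pipeline) rather than run as-is.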