tablegpt/TableGPT2-7B

TableGPT2-7B is a 7.6-billion-parameter decoder-only large language model developed by Zhejiang University and built on the Qwen2.5 architecture. It is tailored for data-intensive tasks, excelling at interpreting and analyzing tabular data, and is optimized for coding, data interpretation, and business-intelligence question answering. It accepts both text and tabular inputs and supports a context length of up to 131,072 tokens (this deployment serves a 32,768-token window).
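A minimal sketch of querying the model about a table: the DataFrame is serialized to CSV text inside the prompt, which is one common way to pass tabular input to a chat LLM. The prompt wording and the `table_to_prompt` helper are illustrative assumptions, not the official TableGPT2 prompt format; the generation calls follow the standard Hugging Face transformers chat-template API.

```python
# Sketch: asking TableGPT2-7B a question about a small table.
# The prompt layout below is an assumption; consult the model card
# on Hugging Face for the recommended prompt format.
import pandas as pd


def table_to_prompt(df: pd.DataFrame, question: str) -> str:
    """Serialize a DataFrame to CSV text and append the user's question."""
    return (
        "Given the following table:\n"
        f"{df.to_csv(index=False)}\n"
        f"Question: {question}"
    )


df = pd.DataFrame({"region": ["EU", "US"], "revenue": [120, 95]})
prompt = table_to_prompt(df, "Which region has higher revenue?")

# Generation with transformers (needs a GPU with enough memory for a 7B
# model; commented out so the sketch stays lightweight):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("tablegpt/TableGPT2-7B")
# model = AutoModelForCausalLM.from_pretrained(
#     "tablegpt/TableGPT2-7B", device_map="auto"
# )
# messages = [{"role": "user", "content": prompt}]
# ids = tok.apply_chat_template(
#     messages, add_generation_prompt=True, return_tensors="pt"
# )
# out = model.generate(ids.to(model.device), max_new_tokens=128)
# print(tok.decode(out[0][ids.shape[-1]:], skip_special_tokens=True))
```

CSV keeps the prompt compact; for wide tables, sampling a few representative rows instead of sending the whole table is a common way to stay within the context window.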

Status: Warm
Visibility: Public
Parameters: 7.6B
Quantization: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face