watt-ai/watt-tool-70B

watt-tool-70B is a 70 billion parameter language model developed by watt-ai, fine-tuned from Llama-3.3-70B-Instruct. It is specifically optimized for complex tool usage and multi-turn dialogue scenarios, achieving state-of-the-art performance on the Berkeley Function-Calling Leaderboard. The model excels at understanding user requests, selecting appropriate tools, and executing them across multiple conversational turns, making it well suited to AI workflow-building platforms.

Deployment: Warm
Access: Public
Parameters: 70B
Quantization: FP8
Context length: 32,768 tokens
License: apache-2.0
Weights: Hugging Face
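
The snippet below is a minimal sketch of how the model might be queried for tool calling with the Hugging Face transformers library. The get_weather tool schema, the sampling settings, and the assumption that the model's chat template accepts a tools argument (as the Llama 3.3 base template does) are illustrative and not taken from the model's documentation; a 70B model will also require multiple GPUs or offloading to load.

```python
# Minimal tool-calling sketch for watt-ai/watt-tool-70B (assumptions noted above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "watt-ai/watt-tool-70B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Hypothetical tool definition in OpenAI-style JSON schema; the exact format
# expected by this fine-tune's prompt template is an assumption.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

messages = [
    {"role": "user", "content": "What's the weather in Berlin right now?"}
]

# Render the conversation plus tool definitions into model input.
inputs = tokenizer.apply_chat_template(
    messages, tools=tools, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decode; the model is expected to emit a tool call for the request.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

In a multi-turn flow, the emitted tool call would be parsed, executed by the application, and appended to messages as a tool-result turn before generating again, so the model can ground its next reply in the tool's output.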