meta-llama/Llama-3.1-70B
Meta Llama 3.1 70B is a 70-billion-parameter instruction-tuned generative language model from Meta's Llama 3.1 collection. It uses an optimized transformer architecture with grouped-query attention (GQA) and a 128K-token context window. Optimized for multilingual dialogue use cases, it performs strongly on common industry benchmarks and supports commercial and research applications in multiple languages, including English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai.
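A minimal sketch of loading the model with the Hugging Face `transformers` text-generation pipeline. The generation parameters and prompt below are illustrative assumptions, not part of this card; the checkpoint is gated, so downloading it requires accepting Meta's license on the Hub, and the 70B weights need substantial GPU memory even in half precision.

```python
# Hypothetical usage sketch for meta-llama/Llama-3.1-70B via transformers.
# Assumes transformers, torch, and accelerate are installed and Hub access is granted.

MODEL_ID = "meta-llama/Llama-3.1-70B"

def build_pipeline(model_id: str = MODEL_ID):
    """Create a text-generation pipeline for the model.

    bfloat16 halves the memory footprint of the weights, and
    device_map="auto" shards them across available GPUs.
    """
    import torch
    from transformers import pipeline

    return pipeline(
        "text-generation",
        model=model_id,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

if __name__ == "__main__":
    generator = build_pipeline()
    out = generator(
        "The key idea of grouped-query attention is",
        max_new_tokens=64,
    )
    print(out[0]["generated_text"])
```

The `device_map="auto"` option (backed by `accelerate`) matters here: at 70B parameters the weights do not fit on a single consumer GPU, so they must be spread across devices or offloaded.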