meta-llama/Meta-Llama-3-70B
Meta-Llama-3-70B is a 70-billion-parameter large language model developed by Meta as part of the Llama 3 family. It uses an optimized transformer architecture with Grouped-Query Attention (GQA) and was pretrained on over 15 trillion tokens, with a knowledge cutoff of December 2023. Its instruction-tuned variant is optimized for dialogue use cases and general text generation, outperforming many open-source chat models on common industry benchmarks. The model is intended for commercial and research applications that require high performance in English.
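A minimal usage sketch, assuming the model is loaded through the Hugging Face transformers library (access to the gated meta-llama repository, the accelerate package for device placement, and sufficient GPU memory for the 70B weights are all assumed; this is illustrative, not the only way to run the model):

```python
# Sketch: text completion with the base Meta-Llama-3-70B model via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard layers across available GPUs (needs accelerate)
)

# This is the pretrained base model, so use plain text completion
# rather than chat-style prompts (those suit the Instruct variant).
prompt = "Grouped-Query Attention reduces inference memory by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```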