Mr-Bhaskar/fbt-llama3-8b
Mr-Bhaskar/fbt-llama3-8b is an 8-billion-parameter language model published by Mr-Bhaskar. Built on the Llama 3 architecture, it supports a context length of 8,192 tokens. It is a foundation model intended for general language understanding and generation, and serves as a base for further fine-tuning and application development.
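A minimal usage sketch with the Hugging Face `transformers` library is shown below, assuming the checkpoint is published in the standard format and that `transformers` and `torch` are installed; the prompt and decoding settings are illustrative only.

```python
# Hypothetical usage sketch for Mr-Bhaskar/fbt-llama3-8b (assumes the
# checkpoint loads via transformers' Auto classes like other Llama 3 models).

MODEL_ID = "Mr-Bhaskar/fbt-llama3-8b"
MAX_CONTEXT = 8192  # context length stated on the model card

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion for `prompt` using greedy decoding."""
    # Imports are deferred so the constants above can be inspected
    # without the heavy dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # halves memory vs. fp32 on supported GPUs
        device_map="auto",           # spread layers across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("The capital of France is"))
```

Because this is a base (not instruction-tuned) model, prompts should be phrased as text to be continued rather than as chat-style instructions.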