hyokwan/llama31_common
hyokwan/llama31_common is an 8-billion-parameter language model produced by continued pre-training of Meta's Llama-3.1-8B-Instruct, with a context length of 32,768 tokens. The model was trained specifically for the Fintech department of Korea Polytechnics and is intended for general language tasks, building on the Llama 3.1 foundation.
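A minimal usage sketch with the Hugging Face transformers library is shown below. The model ID and context length come from the description above; the prompt, generation parameters, and `device_map` setting are illustrative assumptions, not values published with the model.

```python
# Sketch: loading hyokwan/llama31_common with transformers.
# Assumes transformers and a PyTorch backend are installed;
# the first run downloads the 8B-parameter weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "hyokwan/llama31_common"
MAX_CONTEXT = 32768  # context length stated in the model description


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and return a completion for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt (Korean: "What is fintech?") — chosen to match the
    # model's Korean fintech training focus; any prompt works.
    print(generate("핀테크란 무엇인가요?"))
```

Because this is a continued pre-trained model rather than a from-scratch one, standard Llama 3.1 loading and generation code applies unchanged.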