electron271/graig-experiment-3
electron271/graig-experiment-3 is a 4-billion-parameter language model with a 40,960-token context length. Developed by electron271, it is an experimental release intended for private use rather than public deployment, focused on exploring the model's capabilities.