invisietch/L3.1-70Blivion-v0.1-rc1-70B

invisietch/L3.1-70Blivion-v0.1-rc1-70B is a 70-billion-parameter causal language model created by merging L3.1 Nemotron 70B and Euryale 2.2, with a 32,768-token context length. This release candidate has undergone a healing training step to further decensor it and to address issues arising from the merge. It is designed and optimized for creative writing and roleplay scenarios.

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32768 tokens
License: llama3.1
Source: Hugging Face
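
Below is a minimal usage sketch, assuming the Hugging Face repo id above, a standard Llama 3.1 chat template, and the stock transformers/torch loading path; the prompt, dtype, and sampling settings are illustrative only and not taken from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as listed above; adjust if the published repo name differs.
model_id = "invisietch/L3.1-70Blivion-v0.1-rc1-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; the hosted deployment uses FP8
    device_map="auto",           # spread the 70B weights across available GPUs
)

# Example creative-writing prompt formatted with the model's chat template.
messages = [
    {"role": "user", "content": "Write the opening paragraph of a noir short story."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```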
