sometimesanotion/Lamarck-14B-v0.7-Fusion

sometimesanotion/Lamarck-14B-v0.7-Fusion is an experimental 14.8-billion-parameter language model with a context length of up to 131,072 tokens, developed by sometimesanotion. The model is a multi-stage fusion merge that emphasizes strong prose generation and shows strong performance on GPQA and other reasoning benchmarks. It is designed for free-form creative writing and for exploring complex merge strategies.

Status: Warm
Visibility: Public
Parameters: 14.8B
Quantization: FP8
Context (as served): 32,768 tokens
License: apache-2.0
Model page: Hugging Face
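
A minimal usage sketch with Hugging Face transformers, assuming the repository ships a standard tokenizer and chat template (typical for Qwen2.5-based merges); the dtype and sampling settings below are illustrative, not taken from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id as listed above; everything else here is an assumed default setup.
model_id = "sometimesanotion/Lamarck-14B-v0.7-Fusion"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit your hardware; use a quantized runtime otherwise
    device_map="auto",
)

# Build a single-turn chat prompt using the repo's chat template.
messages = [
    {"role": "user", "content": "Write a short paragraph about tide pools."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```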