Mawdistical/Lured-Lapine-70B

Mawdistical/Lured-Lapine-70B is a 70-billion-parameter finetuned language model derived from allura-org/Bigger-Body-70b, designed for generating explicit content with a focus on feral instinct and obsessive themes. This English-only model is optimized for creative writing in specific niche genres, offering a distinct personality for narrative applications. It supports a 32,768-token context length, and its card provides recommended temperature and dynamic temperature settings for nuanced output control.

Status: Warm (Public)
Parameters: 70B
Quantization: FP8
Context length: 32,768 tokens
License: llama3.3
Source: Hugging Face
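
As a minimal sketch, the model can be loaded like any Hugging Face causal LM checkpoint with the `transformers` library. The sampling values below are illustrative placeholders, not the card's published recommendations, and dynamic temperature is typically configured in backends that support it (e.g. llama.cpp-based frontends) rather than through `generate()`.

```python
# Minimal sketch: loading Mawdistical/Lured-Lapine-70B with transformers.
# Sampling values are placeholders; consult the model card for its
# recommended temperature / dynamic temperature settings.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mawdistical/Lured-Lapine-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the 70B model across available GPUs
    torch_dtype="auto",  # use the checkpoint's stored precision
)

prompt = "Write the opening paragraph of a short story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,  # placeholder value, not the card's recommendation
)
print(tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
))
```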