Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.

a coder model finetuned for RP. because why not?

trained on a mixture of synthetic and natural RP data, plus storywriting/novel data from various sources (sugarquill, SCP, and miscellaneous novels), for 17-ish hours on 2x RTX 3090s rented from RunPod

quants: https://huggingface.co./Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF
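if you want to run a quant locally, here's a minimal llama-cpp-python sketch. the quant filename pattern is an assumption, not a confirmed file in the repo — check the GGUF repo's file list and substitute whichever quant actually exists:

```python
# pip install llama-cpp-python huggingface-hub
from llama_cpp import Llama

# download a quant straight from the GGUF repo and load it.
# the filename glob below is an assumption -- swap in a real file from the repo.
llm = Llama.from_pretrained(
    repo_id="Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF",
    filename="*Q4_K_M.gguf",
    n_ctx=8192,  # context length; adjust to taste / VRAM
)

# sampling settings here are placeholders, not tested recommendations
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "hi"}],
    max_tokens=128,
    temperature=0.8,
)
print(out["choices"][0]["message"]["content"])
```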

it's either overcooked or undercooked, and I can't tell which. regardless, thanks for giving it a shot.

use if you want:

  • lack of anatomical and spatial awareness
  • crazy mood swings
  • mean characters actually being mean (sometimes)
  • (occasionally) human-like prose
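
if you'd rather run the full-precision weights, here's a minimal transformers sketch. it assumes the model inherits Qwen2.5's ChatML-style chat template, and the prompt and sampling settings are placeholders, not tested recommendations:

```python
# pip install transformers accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hasnonname/Qwen2.5-Monte-7B-v0.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "describe the room you wake up in."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
# decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```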


model size: 7.62B params (FP16, safetensors)

base model: Qwen/Qwen2.5-7B