Settings (#2), opened by Evoc
What are the recommended settings for this model? ChatML context/instruct, or Alpaca?
ChatML appears to work fine. I think Alpaca is also fine based on the discussion mentioned, but I would personally stick to chat formats if possible :)
Thanks!
Evoc changed discussion status to closed
Konnect's Llam@ception works really well with this.
Here it is w/ the samplers I use: https://files.catbox.moe/aa1sfk.json
Whatever floats your boat. Though I have to say, that is an overly long system prompt for any model.