The 3bpw quant spams random words on default settings with oobabooga
Wait, my bad, this is for the normal version (not RP). Which PyTorch/exllamav2 versions are you using?
The RP version works normally. I was using the latest PyTorch & exllamav2 on a fresh Ubuntu VPS install yesterday (unfortunately I've already deleted it, and it's a pain to set everything up again, but you can probably check the latest available versions).
I noticed a closed topic with a similar problem that was solved by removing the BOS token. Unfortunately, I was not able to verify this, because I had already deleted that install and set up the RP version instead.
But I noticed that if the BOS token is removed from the RP version, the model becomes noticeably dumber and stops following instructions.
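For anyone else hitting this: the "remove BOS token" toggle only changes whether a single extra id is prepended to the prompt before generation. A minimal sketch of the difference (the BOS id and prompt ids below are made up for illustration, not taken from the actual tokenizer):

```python
BOS_ID = 1  # assumed BOS id for a Llama-style tokenizer (illustrative only)

def encode(prompt_ids, add_bos=True):
    """Mimic tokenizer behaviour: prepend the BOS id when add_bos is True."""
    return ([BOS_ID] + list(prompt_ids)) if add_bos else list(prompt_ids)

prompt = [3923, 374, 279]  # made-up token ids standing in for the user prompt

print(encode(prompt, add_bos=True))   # with BOS: [1, 3923, 374, 279]
print(encode(prompt, add_bos=False))  # without BOS: [3923, 374, 279]
```

So a model that was calibrated/trained expecting that leading BOS id can behave very differently when it is stripped, which would fit what I saw with the RP version.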
I see, it seems to be that issue. I'll add the info to the main page in the meantime. Not sure, but it may be related to the wikitext calibration dataset. Closing this for now!