Add jinja/minja chat template to tokenizer_config.json

#10
by ThiloteE - opened

llama.cpp recently added Jinja chat-template support via its minja parser, a lightweight Jinja implementation. I am proposing to add a proper chat_template to tokenizer_config.json to stay compatible with apps that rely on minja/jinja. This can be tested by quantizing the model to GGUF and using it in GPT4All. llama.cpp's minja parser is not yet perfect and currently drops the \n\n in the middle of {{- '[INST]' + system_message + '\n\n' + message['content'] + '[/INST]' }}, but that is a problem with the parser, not with the template.
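For illustration, here is a minimal sketch of a Mistral-style template containing the quoted [INST] line; the exact template proposed in this branch may differ. In tokenizer_config.json it would be stored under the "chat_template" key as a single JSON-escaped string.

```jinja
{#- Minimal sketch of a Mistral-style chat template (illustrative, not the exact template in this PR) -#}
{%- if messages[0]['role'] == 'system' -%}
    {%- set system_message = messages[0]['content'] -%}
    {%- set loop_messages = messages[1:] -%}
{%- else -%}
    {%- set system_message = '' -%}
    {%- set loop_messages = messages -%}
{%- endif -%}
{{- bos_token -}}
{%- for message in loop_messages -%}
    {%- if message['role'] == 'user' -%}
        {#- Prepend the system message to the first user turn -#}
        {%- if loop.first and system_message -%}
            {{- '[INST]' + system_message + '\n\n' + message['content'] + '[/INST]' -}}
        {%- else -%}
            {{- '[INST]' + message['content'] + '[/INST]' -}}
        {%- endif -%}
    {%- elif message['role'] == 'assistant' -%}
        {{- message['content'] + eos_token -}}
    {%- endif -%}
{%- endfor -%}
```

Rendered for a single system + user turn, this produces `[INST]<system>\n\n<user>[/INST]`, which is exactly the spot where the current minja parser drops the \n\n mentioned above.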

