what is the chat template?

#22
by Blannikus

Since we don't have access to tokenizer_config.json, I can't see the chat_template being used. Does anyone know?

I need to know this for use with llama.cpp.

prompt = (
    "<|begin_of_text|>"
    "<|start_header_id|>user<|end_header_id|>"
    f"{question}"  # the user's message
    "<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>"  # generation starts here
)
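
If it helps, here is a minimal sketch of feeding that assembled prompt through llama-cpp-python; the GGUF path, question, and generation parameters are placeholders, not anything specific to this model:

# Sketch: run the single-turn prompt with llama-cpp-python (model path is a placeholder).
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.gguf", n_ctx=4096)

question = "What is the capital of France?"
prompt = (
    "<|begin_of_text|>"
    "<|start_header_id|>user<|end_header_id|>"
    f"{question}"
    "<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>"
)

# Stop on <|eot_id|> so generation ends at the assistant's turn boundary.
out = llm(prompt, max_tokens=256, stop=["<|eot_id|>"])
print(out["choices"][0]["text"])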

follow_up_prompt = (
    "<|begin_of_text|>"  # start of prompt
    "<|start_header_id|>user<|end_header_id|>"  # past
    f"{question}"  # past
    "<|eot_id|>"  # past
    "<|start_header_id|>assistant<|end_header_id|>"  # past
    f"{response}"  # past
    "<|eot_id|>"  # past
    "<|start_header_id|>user<|end_header_id|>"  # new
    f"{follow_up_question}"  # new
    "<|eot_id|>"  # new
    "<|start_header_id|>assistant<|end_header_id|>"  # new
)
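
For what it's worth, the chat_template entry in tokenizer_config.json is just a Jinja string that builds this same kind of prompt from a list of role/content messages. A small Python helper doing the equivalent (a sketch matching the layout above, not the model's confirmed template) could look like this:

# Sketch: build a prompt from a messages list, mirroring the hand-assembled strings above.
def build_prompt(messages, add_generation_prompt=True):
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(f"<|start_header_id|>{m['role']}<|end_header_id|>")
        parts.append(m["content"])
        parts.append("<|eot_id|>")
    if add_generation_prompt:
        # Leave the assistant header open so the model continues from here.
        parts.append("<|start_header_id|>assistant<|end_header_id|>")
    return "".join(parts)

messages = [
    {"role": "user", "content": "First question"},
    {"role": "assistant", "content": "First answer"},
    {"role": "user", "content": "Follow-up question"},
]
print(build_prompt(messages))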
