When evaluating an instruction-tuned model such as Llama3.1-70B-Instruct offline with lm_eval, should we add `--apply_chat_template`?
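For context, the invocation being asked about might look like the sketch below. The `--apply_chat_template` flag is the one in question; the model path and task name here are only illustrative, not a recommendation either way:

```shell
# Hypothetical offline lm_eval run for an instruct model.
# The open question is whether --apply_chat_template should be included
# so prompts are wrapped in the model's chat template.
lm_eval \
  --model hf \
  --model_args pretrained=meta-llama/Llama-3.1-70B-Instruct \
  --tasks gsm8k \
  --batch_size auto \
  --apply_chat_template
```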