Why isn't the `model_max_length` set to 2048?
#32 opened by alvarobartt (HF staff)
Hi here 🤗
Apologies if I'm misunderstanding something, but why is the `model_max_length` within the `tokenizer_config.json` set to 1000000000000000019884624838656? Shouldn't it be 2048 as per the Zephyr paper? Does this have any unintended side effects? Is there any rationale behind it?
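For what it's worth, that specific number is not arbitrary: when `model_max_length` is missing from `tokenizer_config.json`, `transformers` falls back to its `VERY_LARGE_INTEGER` sentinel, defined as `int(1e30)`, and floating-point rounding of `1e30` yields exactly that value. A minimal sketch (the model id below is an assumption for illustration):

```python
# transformers' fallback for a missing model_max_length is VERY_LARGE_INTEGER,
# defined as int(1e30). Floating-point rounding of 1e30 produces exactly the
# value seen in the config:
print(int(1e30))  # -> 1000000000000000019884624838656

# Hypothetical workaround (model id assumed): override the limit at load time
# instead of relying on the config value.
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "HuggingFaceH4/zephyr-7b-beta", model_max_length=2048
# )
```

So the sentinel effectively means "no limit recorded", and the tokenizer will not truncate to 2048 unless you pass `model_max_length` (or `truncation`/`max_length`) yourself.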
Aha, I have the same question.