Can you describe how to extend the tokenizer from the old one, e.g. by adding vocab? (A rough sketch is included below.)
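
For reference, here is a minimal sketch of one common way to extend an existing tokenizer with new vocabulary using the Hugging Face transformers API; the checkpoint name and the example tokens are placeholders, not the procedure actually used for this model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "gpt2"  # placeholder checkpoint; substitute the actual base model

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# New tokens to add on top of the old vocabulary; purely illustrative examples.
new_tokens = ["<my_domain_term>", "<another_term>"]

# add_tokens returns the number of tokens actually added (duplicates are skipped).
num_added = tokenizer.add_tokens(new_tokens)
print(f"Added {num_added} tokens; new vocab size: {len(tokenizer)}")

# Resize the embedding matrix so the new token ids get embedding rows.
model.resize_token_embeddings(len(tokenizer))

tokenizer.save_pretrained("./extended-tokenizer")
model.save_pretrained("./extended-model")
```

The newly added embedding rows are randomly initialized, so some continued pretraining or fine-tuning is typically needed before the new tokens are useful.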