bloom7b1/tokenizer_config.json
{"unk_token": "<unk>", "eos_token": "</s>", "bos_token": "<s>", "pad_token": "<pad>", "name_or_path": "bigscience/tokenizer", "special_tokens_map_file": null, "tokenizer_class": "BloomTokenizerFast", "padding_side":"left"}