Hugging Face inference endpoints

#18
by AlexNevo - opened

Hi,
I would like to configure a Hugging Face inference endpoint to deploy this model. What would you recommend? Especially for the Max Input Length (per Query), Max Number of Tokens (per Query), Max Batch Prefill Tokens, and Max Batch Total Tokens settings, considering that my DDL is about 4500 tokens long:
[Attachment: image.png — screenshot of the endpoint configuration settings]
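(For reference, a minimal sketch of how the DDL token count can be checked with the model's tokenizer; the model id and file path below are placeholder assumptions, not taken from this thread.)

```python
# Sketch: counting DDL tokens with the model's tokenizer.
# The model id and file path are placeholder assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("defog/sqlcoder-7b-2")  # placeholder model id

with open("schema.sql") as f:  # hypothetical file containing the DDL
    ddl = f.read()

print(f"DDL length: {len(tokenizer.encode(ddl))} tokens")  # ~4500 in this case
```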

Defog.ai org

Hi @AlexNevo, you may refer to the Inference Endpoints documentation for how to set these parameters.
I believe you would need to increase the max input length, given your large DDL size.
https://huggingface.co./docs/inference-endpoints/index
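As a rough sketch (not an official recommendation): if the endpoint runs the standard text-generation-inference (TGI) container, these limits can also be set programmatically with `huggingface_hub.create_inference_endpoint`. The endpoint name, repository, instance choices, and token values below are illustrative assumptions; adjust them to the model's context window and your GPU memory.

```python
# Sketch: creating an Inference Endpoint with TGI limits sized for a ~4500-token DDL.
# All names, instance choices, and token values are illustrative assumptions.
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    "sqlcoder-endpoint",               # hypothetical endpoint name
    repository="defog/sqlcoder-7b-2",  # placeholder; use the model this thread is about
    framework="pytorch",
    task="text-generation",
    accelerator="gpu",
    vendor="aws",
    region="us-east-1",
    type="protected",
    instance_size="x1",                # pick a size whose VRAM fits the model + KV cache
    instance_type="nvidia-a10g",
    custom_image={
        "health_route": "/health",
        "url": "ghcr.io/huggingface/text-generation-inference:latest",
        "env": {
            "MODEL_ID": "/repository",
            # Room for the ~4500-token DDL plus the question:
            "MAX_INPUT_LENGTH": "6144",
            # Prompt plus generated SQL must fit here:
            "MAX_TOTAL_TOKENS": "8192",
            # Prefill batch must hold at least one full prompt:
            "MAX_BATCH_PREFILL_TOKENS": "6144",
            # Token budget across concurrent requests; bounded by GPU memory:
            "MAX_BATCH_TOTAL_TOKENS": "16384",
        },
    },
)
endpoint.wait()     # block until the endpoint is running
print(endpoint.url)
```

The same four values map to the Max Input Length (per Query), Max Number of Tokens (per Query), Max Batch Prefill Tokens, and Max Batch Total Tokens fields in the endpoint UI, so you can enter them there instead.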
