What would be the average inference time for this model using beam width = 4?

#31
by ashwin26 - opened

I am using this model on a database schema (on average, 20 tables with about 30 columns each), running on a single RTX 4090 GPU with 128 GB of RAM. Inference is taking a long time (7-10 minutes). The screenshot below shows how I am loading the model and running inference.
Any suggestions on how I can improve speed?

[Attached image: Screenshot 2024-04-09 233100.png — model loading and inference code]
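Since the attached screenshot isn't visible here, below is a minimal sketch of what the loading and generation code might look like with the usual speed fixes applied: fp16 weights instead of fp32, greedy decoding (`num_beams=1`) instead of beam width 4 (beam search cost grows roughly linearly with beam width), and a cap on `max_new_tokens` to bound worst-case latency. The model id `defog/sqlcoder-7b-2` and the exact generation settings are assumptions, not the poster's actual code:

```python
# Decoding settings: greedy (num_beams=1) is roughly 4x cheaper per step
# than beam width 4; capping max_new_tokens bounds worst-case latency.
GEN_KWARGS = dict(num_beams=1, do_sample=False, max_new_tokens=300)

def generate_sql(prompt: str, model_id: str = "defog/sqlcoder-7b-2") -> str:
    """Load the model in fp16 on the GPU and run a single generation.

    model_id is a placeholder -- substitute the actual checkpoint being used.
    Imports are deferred so the function body documents the full recipe.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # fp16: half the memory of fp32, much faster on a 4090
        device_map="auto",          # keep the weights on the GPU if they fit in 24 GB VRAM
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.inference_mode():    # skip autograd bookkeeping during generation
        out = model.generate(**inputs, **GEN_KWARGS)
    # Strip the prompt tokens and return only the newly generated text.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

If beam search is genuinely needed for output quality, the other two changes (fp16 and a token cap) still apply; beyond that, CUDA-enabled builds and keeping the model resident between queries (rather than reloading per request) are the usual first things to check.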

