Passing in data via function calls

#18
by thegamecat - opened

I see from the model card how functions (with parameters) can be selected and returned by the query, but what about calling a function to bring in data based on rules?

I have a function called list_dogs()

My prompt says something like:

  1. If the intent relates to dogs, call the function list_dogs()
  2. Do x with the list of dogs
     etc.

And I let the LLM know about the function's makeup via the function_definitions variable.

But list_dogs() never gets called.
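For context, my setup looks roughly like this. It's a minimal sketch: the dog names are made up, and the JSON schema shape mirrors the OpenAI-style tool format, which I'm assuming the model can be told to emit something similar to.

```python
import json

# Hypothetical data source -- stands in for my real list_dogs().
def list_dogs():
    return ["Rex", "Bella", "Fido"]

# OpenAI-style tool definition; I'm assuming a similar JSON schema
# can be described to the model in the system prompt.
function_definitions = [
    {
        "name": "list_dogs",
        "description": "Return the current list of dogs.",
        "parameters": {"type": "object", "properties": {}},
    }
]

# Dispatch table mapping tool names to callables.
TOOLS = {"list_dogs": list_dogs}

def handle_model_output(text):
    """If the model emitted a JSON tool call, run it; otherwise pass through."""
    try:
        call = json.loads(text)
    except json.JSONDecodeError:
        return text  # plain assistant text, not a tool call
    if not isinstance(call, dict):
        return text  # valid JSON, but not a tool-call object
    name = call.get("name")
    if name in TOOLS:
        return TOOLS[name](**call.get("parameters", {}))
    return text
```

So when the model decides to call the tool, I expect output like `{"name": "list_dogs", "parameters": {}}`, which the dispatcher turns into the actual list; that step just never fires.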

I use this approach with GPT, but I've never tried it with a self-hosted model, so forgive my absolute ignorance and naivety if that turns out to be the problem :)

Do you guys think it's because of the "return" or "newline" characters in your prompt, forcing it to skip list_dogs()?

I don't know how Llama 3.2 likes its prompts formatted, but this works fine with the GPT Assistants API.
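For what it's worth, Llama 3.x models are quite sensitive to their chat template, including those header and newline tokens. A sketch of what the raw prompt looks like (the special tokens match the published Llama 3 format; in practice you'd let tokenizer.apply_chat_template from transformers build this rather than hand-rolling it):

```python
# Sketch of the Llama 3-style chat format. The special tokens below match
# the published Llama 3 template; the exact system-prompt wording for
# advertising list_dogs() is my own assumption.
def build_prompt(system, user):
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "You have access to the function list_dogs(). "
    "If the intent relates to dogs, call it.",
    "Which dogs do we have?",
)
```

If the prompt you send doesn't reproduce those `\n\n` breaks after each header, the model may well ignore the tool instructions entirely.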

Do you think the 1B and 3B models support tools?
It's not in their model card, but for 3.3 and 3.1 it is described there.
