llama-cpp-python compatibility

#7
by enrico07 - opened

Is LLaVA-v1.6 34B compatible with the llama-cpp-python library? If so, should I use the same code that I would use for the LLaVA-v1.5 family, or do I need a different prompt format?

Owner

I don't know about that library, but it's probably wrapping llama.cpp, so it should be able to run inference on the llava-1.6 models.
However, whatever code/functions you use for inference need to access the new functionality.

llava-1.6 splits images into a grid, creates embeddings for each tile, and permutes those back into grid order.
In llama.cpp, clip.cpp/llava.cpp now contain the necessary APIs; you can see them being used in llava-cli.cpp.
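To make the grid idea concrete, here is a minimal Python sketch of the splitting step only; `split_into_grid` and the 336-pixel tile size are illustrative assumptions (336 matches the CLIP ViT-L/336 input size), not the actual clip.cpp preprocessing, which also selects an aspect ratio and pads tiles:

```python
from PIL import Image

def split_into_grid(image_path, tile_size=336):
    """Illustrative only: cut an image into fixed-size tiles, roughly the
    way llava-1.6 prepares an image before embedding each tile."""
    img = Image.open(image_path)
    tiles = []
    for top in range(0, img.height, tile_size):
        for left in range(0, img.width, tile_size):
            box = (left, top,
                   min(left + tile_size, img.width),
                   min(top + tile_size, img.height))
            tiles.append(img.crop(box))
    return tiles

# llava-1.6 embeds each tile (plus a downscaled copy of the whole image)
# and permutes the per-tile embeddings back into grid order before the LLM.
```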

@enrico07 Hi, I have the same question as you. I tested the llava 1.6 mistral 7b GGUF directly with my llava-1.5 code, but the model printed nothing. Do you have any ideas? Thank you very much!

@Wangderful I only tested LLaVA 1.6 34B and I don't have any problems. I'm using create_chat_completion() and Llava16ChatHandler() from the llama-cpp-python library.
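For anyone landing here, a minimal sketch of the pattern @enrico07 describes, assuming a recent llama-cpp-python that ships Llava16ChatHandler in llama_cpp.llama_chat_format; both GGUF filenames and the image URL are placeholders:

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava16ChatHandler

# clip_model_path points at the separate mmproj GGUF shipped with the model.
chat_handler = Llava16ChatHandler(clip_model_path="mmproj-model-f16.gguf")

llm = Llama(
    model_path="llava-v1.6-34b.Q4_K_M.gguf",  # placeholder filename
    chat_handler=chat_handler,
    n_ctx=4096,       # 1.6's grid embeddings use far more tokens than 1.5
    logits_all=True,  # the llava chat handlers need this in older releases
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        },
    ],
)
print(response["choices"][0]["message"]["content"])
```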

@enrico07 Did you use it with image URLs or local images? I found that when using local images with base64 encoding, it just hangs indefinitely.
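For local files, the usual approach is to pass the image as a base64 data URI in the same image_url field; this helper mirrors the one in the llama-cpp-python README, and local_image.png is a placeholder path:

```python
import base64

def image_to_base64_data_uri(file_path: str) -> str:
    # Wrap a local image as a data URI, which the image_url content
    # field accepts in place of an http(s) URL.
    with open(file_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return f"data:image/png;base64,{encoded}"

# Substitute the result for the remote URL in the messages above:
data_uri = image_to_base64_data_uri("local_image.png")  # placeholder path
```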
