Can you provide a GGUF version of this model, usable with Ollama locally?
This model architecture isn't supported by either Ollama or llama.cpp, so a GGUF conversion isn't currently possible.