Can you provide a GGUF model usable with Ollama locally

#1
by ryg81 - opened

Can you provide a GGUF model usable with Ollama locally?

The model isn't supported by Ollama or llama.cpp, so a usable GGUF can't be provided yet.
