What GPU size is required to run this? Is a 4090 possible, and does it support Ollama?

#5
by sminbb - opened

What GPU size is required to run this?
Is a 4090 possible, and does it support Ollama?

A 4090 should be good enough. Yes, Ollama would be helpful, since these are GGUF files; however, you will have to import the GGUF into Ollama.
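Importing a local GGUF into Ollama goes through a Modelfile. A minimal sketch, assuming a hypothetical filename for the downloaded quant:

```
# Modelfile — FROM points at the local GGUF (hypothetical filename)
FROM ./DeepSeek-V3-Q2_K.gguf
```

Then register and run it with `ollama create deepseek-v3 -f Modelfile` followed by `ollama run deepseek-v3`.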

Unsloth AI org

What GPU size is required to run this?
Is a 4090 possible, and does it support Ollama?

Yes, a 4090 is enough. You don't even need a GPU; a CPU with 48GB of RAM will be enough.

At the moment Ollama does not support it, as far as I'm aware, so you will need to use llama.cpp.
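With llama.cpp, a GGUF can be run directly from the command line. A minimal sketch, assuming a hypothetical quant filename; `-ngl` controls GPU offload:

```shell
# llama-cli ships with llama.cpp; -m selects the model file,
# -ngl offloads that many layers to the GPU (tune to fit 24GB VRAM),
# -p supplies the prompt. The filename here is a hypothetical example.
./llama-cli -m DeepSeek-V3-Q2_K.gguf -ngl 20 -p "Hello"
```

Layers that don't fit in VRAM stay in system RAM and run on the CPU, so the model still works, just more slowly.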

Wait... you're telling me that if I have, say, an

AMD Ryzen 7 5700X 8-Core Processor
Thread(s) per core: 2
Core(s) per socket: 8

RTX 4090

64GB DDR4 RAM

That I could run a DeepSeek V3 quant?

Unsloth AI org

Wait... you're telling me that if I have, say, an

AMD Ryzen 7 5700X 8-Core Processor
Thread(s) per core: 2
Core(s) per socket: 8

RTX 4090

64GB DDR4 RAM

That I could run a DeepSeek V3 quant?

Yes, that is correct, but it will probably be slow.
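The slowness follows from simple size arithmetic: even an aggressive quant of a model this large exceeds 64GB of RAM, so llama.cpp memory-maps the file and pages weights in from disk. A rough sketch (671B total parameters is DeepSeek V3's published size; the 1.58 bits-per-weight figure is an assumed example of a dynamic low-bit quant):

```python
# Back-of-envelope: GGUF file size ≈ parameter count × bits per weight / 8.
def quant_size_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate on-disk size of a quantized model, in GB."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# DeepSeek V3 has ~671B total parameters (MoE). At ~1.58 bits/weight
# the file is still well over 100 GB — far more than 64GB of RAM,
# hence disk-backed mmap and slow token generation.
print(quant_size_gb(671, 1.58))
```

Since only a subset of experts is active per token in an MoE model, mmap keeps generation feasible, but disk bandwidth becomes the bottleneck.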
