The original post: /r/selfhosted by /u/I_May_Say_Stuff on 2024-12-22 18:01:55.
Hey all,
I currently have an RTX 3070 Ti along with an Intel i7-12700K CPU and 64 GB of DDR4 memory in my main PC, and I run Ollama (along with OpenWebUI) via Docker on WSL2 on it. I have a few LLMs loaded in it and overall I'm fairly happy with it. It's functional, but I know it could be a little quicker if I invest in a better GPU.
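For anyone curious about the setup described above, a GPU-enabled Ollama container under WSL2 is typically started along these lines (a sketch, not necessarily OP's exact command; it assumes the NVIDIA Container Toolkit is installed, and the volume/port names are Ollama's documented defaults):

```shell
# Start Ollama with GPU passthrough (requires NVIDIA Container Toolkit).
# -v persists downloaded models; 11434 is Ollama's default API port.
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```

OpenWebUI then points at `http://localhost:11434` as its Ollama backend.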
My question is: with a budget of $1000, what GPU would you recommend to replace the RTX 3070 Ti, where the main purpose of the upgrade is better performance when running LLMs through Ollama?
For a little more context: the models I'm currently running are all Q5_K_M quants around the 7B and 8B parameter size, given the current hardware setup.
Thank you.