The original post: /r/privacy by /u/DonCABASH on 2024-05-27 20:13:05.
So I use AI when I code or need to organize an essay or a PowerPoint.
When I became more privacy conscious I stopped using ChatGPT, and the private alternative I found is to run an LLM locally, like Llama. But the issue with local LLMs is resources: I have an M2 Air with 16 GB of RAM, and running this kind of thing overheats my computer.
Is there a way to use AI without using too much of my computer's resources?
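For anyone weighing the same trade-off: a big lever on local resource usage is quantization (storing weights in fewer bits). A rough back-of-envelope sketch, assuming weights dominate memory and using an arbitrary ~20% overhead factor for the runtime and KV cache, shows why a 7B model at 16-bit barely fits in 16 GB while a 4-bit quantized version is comfortable:

```python
def model_memory_gb(n_params_billion: float,
                    bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate (GB) for loading an LLM's weights.

    overhead is a crude fudge factor for runtime buffers and the
    KV cache; real usage varies by runtime and context length.
    """
    bytes_for_weights = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

# A 7B-parameter model at fp16 vs. 4-bit quantization:
print(round(model_memory_gb(7, 16), 1))  # ~16.8 GB: too big for a 16 GB machine
print(round(model_memory_gb(7, 4), 1))   # ~4.2 GB: fits with room to spare
```

This is why people on low-RAM machines typically reach for 4-bit quantized builds of smaller models rather than full-precision ones.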