Hacker News

Question without judgement: why would I want to run an LLM locally? Say I'm building a SaaS app and connecting to Anthropic using the `ai` package. Would I want to cut over to ollama+something for local dev?
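One low-friction way to do the cutover (a sketch, assuming Ollama is serving its OpenAI-compatible endpoint on the default port; the `USE_LOCAL_LLM` flag and the model names are illustrative, not prescribed) is to resolve the backend from an environment flag so the rest of the app never changes:

```typescript
// Hypothetical config helper: pick a cloud or local backend per environment.
interface LLMConfig {
  baseURL: string;
  model: string;
}

function resolveLLMConfig(env: Record<string, string | undefined>): LLMConfig {
  if (env.USE_LOCAL_LLM === "1") {
    // Local dev: Ollama exposes an OpenAI-compatible API under /v1.
    return { baseURL: "http://localhost:11434/v1", model: "llama3.1" };
  }
  // Production: hosted provider (API key read elsewhere from env).
  return { baseURL: "https://api.anthropic.com/v1", model: "claude-sonnet" };
}

const cfg = resolveLLMConfig({ USE_LOCAL_LLM: "1" });
console.log(cfg.baseURL);
```

The point of routing through one resolver is that prompts, streaming, and error handling stay identical in dev and prod; only the endpoint and model string swap.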


Data privacy: some stuff, like all the personal notes I use with a RAG system, just doesn't need to be sent to some cloud provider to be data mined and/or have AI trained on it.


I’ve been thinking about the local notes use case, but know zero about doing it. Do you have a good setup you could point me at?
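Not a pointer to a packaged setup, but the retrieval core of a local-notes RAG loop is small enough to sketch. Assuming the embeddings come from a locally served model (e.g. `nomic-embed-text` via Ollama), retrieval is just cosine similarity over note vectors; the vectors here are hard-coded toys to keep the sketch self-contained:

```typescript
// Toy retrieval step for a local RAG setup. In a real setup the embedding
// vectors would come from a local embedding model, not be hard-coded.
type Note = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k notes most similar to the query vector.
function topK(query: number[], notes: Note[], k: number): Note[] {
  return [...notes]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

const notes: Note[] = [
  { text: "Dentist appointment Tuesday", embedding: [0.9, 0.1, 0.0] },
  { text: "Ideas for the garden", embedding: [0.1, 0.9, 0.2] },
  { text: "Tax receipts 2023", embedding: [0.0, 0.2, 0.9] },
];

// A query vector close to the "garden" note.
const hits = topK([0.2, 0.8, 0.1], notes, 1);
console.log(hits[0].text); // "Ideas for the garden"
```

The retrieved note texts then get pasted into the prompt of a local chat model, so nothing ever leaves the machine.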


For me it is consistency. I control the model and the software, so I know a local LLM will remain exactly the same until I want to change it.

It also avoids the trouble of a hosted LLM deciding to double its price overnight; costs are very predictable.


Lack of censorship.



