Question without judgement: why would I want to run an LLM locally? Say I'm building a SaaS app and connecting to Anthropic using the `ai` package. Would I want to cut over to ollama + something for local dev?
Data privacy -- some stuff, like all the personal notes I use with a RAG system, just doesn't need to be sent to some cloud provider to be data mined and/or have AI trained on it