Running Ollama & Claude-Class Models Locally
Moving from cloud-hosted models (such as Claude 3.5 Sonnet) to local execution via Ollama changes the development workflow significantly. The primary drivers are data privacy, freedom from network latency, and the elimination of per-token API costs.
Local vs. cloud: this guide compares DeepSeek Coder V2 running locally against Claude 3.5 Sonnet in the cloud.
First, install Ollama, which provides the local model runtime:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Next, pull DeepSeek Coder V2 (Lite):

```shell
ollama pull deepseek-coder-v2
```
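Once the model is pulled, you can sanity-check it outside the editor by calling Ollama's local REST API, which serves on port 11434 by default. Below is a minimal sketch in Python using only the standard library; the helper names and the example prompt are illustrative, not part of Ollama itself:

```python
import json
import urllib.request

# Ollama exposes a REST API on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "deepseek-coder-v2") -> dict:
    """Build a non-streaming generation request for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "deepseek-coder-v2") -> str:
    """POST the prompt to the local Ollama server and return the completion."""
    payload = build_payload(prompt, model)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response is a single JSON object with a
        # "response" field holding the generated text.
        return json.loads(resp.read())["response"]


# Example (requires the Ollama server to be running):
#   print(generate("Write a one-line Python function to reverse a string."))
```

With `stream` set to `False`, the server returns one JSON object instead of a stream of chunks, which keeps the client trivial for quick smoke tests.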
Finally, point the Continue extension in VS Code at the local model by adding an entry to its `models` list:

```json
"models": [
  {
    "title": "Local DeepSeek",
    "provider": "ollama",
    "model": "deepseek-coder-v2"
  }
]
```
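For context, Continue has historically read its JSON configuration from `~/.continue/config.json` (the exact path and format can vary by extension version, so treat this as an assumption). A minimal complete file containing the fragment above would look like:

```json
{
  "models": [
    {
      "title": "Local DeepSeek",
      "provider": "ollama",
      "model": "deepseek-coder-v2"
    }
  ]
}
```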