Ollama v0.18.2 Boosts Local Claude Code Speed, Refines OpenClaw Integration
Ollama released version 0.18.2 today, introducing performance improvements and fixes for its OpenClaw integration. The headline change is faster local AI code generation: the development team states that "Claude Code will now be faster when run locally, due to preventing cache breakages." The release also refines the OpenClaw setup flow, adding a preliminary check that npm and git are installed before the OpenClaw installation begins. Support for the "ollama launch openclaw --model
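The release notes do not show how the npm/git pre-check is implemented; a minimal POSIX shell sketch of that kind of check (the `require_tools` helper name is hypothetical, not Ollama's actual code) might look like:

```shell
#!/bin/sh
# require_tools TOOL... : succeed only if every named tool is on PATH.
# An installer could call this before doing any work, failing fast with
# a clear message instead of breaking partway through.
require_tools() {
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "error: $tool is required but not installed" >&2
      return 1
    fi
  done
  return 0
}

# How an OpenClaw-style installer might use it:
# require_tools npm git || exit 1
```

Using `command -v` keeps the check portable across shells, unlike `which`, whose behavior varies between platforms.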
Sources
- v0.18.2 - GitHub: ollama/ollama