🐧 PenguinPulse

Linux Graphics & Gaming News

Ollama v0.18.3 Integrates Local AI Models Directly with VS Code via Copilot

Ollama released version 0.18.3 yesterday, introducing direct integration with Microsoft Visual Studio Code through GitHub Copilot. The update lets developers select and use any local or cloud-based Ollama model directly within VS Code. According to the release notes, "Microsoft Visual Studio Code now directly integrates with Ollama via GitHub Copilot." This streamlines AI-assisted development considerably, giving easier access to local large language models for code completion, generation, and related tasks. The release also includes minor improvements, such as GLM parser enhancements for tool calls and OpenClaw integration improvements for gateway checks.
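For readers curious what sits underneath such an integration: the Ollama server exposes a local HTTP API (on port 11434 by default), and any client, editor plugin or otherwise, talks to it with plain JSON requests. The sketch below builds a request to Ollama's documented `/api/chat` endpoint; it is a minimal illustration of the local API, not the actual Copilot wiring, and the model name "llama3.2" is just an example of a locally pulled model.

```python
import json
import urllib.request

# Ollama's local server listens on http://localhost:11434 by default.
# The model name is illustrative; any model pulled locally would work.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Explain this function."}],
    "stream": False,  # request a single JSON reply instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With an Ollama server actually running, sending the request would
# return the model's reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])

print(req.full_url, req.get_method())
```

Setting `"stream": False` asks the server for one complete JSON object; editor integrations typically stream tokens instead so completions appear incrementally.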
