Qwen3.5 + OpenCode + Ollama: Easy Setup Guide for Local AI Coding Agent


Description:

Fahd Mirza demonstrates how to set up OpenCode — a fully open-source, provider-agnostic alternative to Claude Code — connected to a locally running Qwen3.5 35B mixture-of-experts model via Ollama. Unlike Claude Code, OpenCode is not tied to any single AI provider and works with Anthropic, OpenAI, Google, or local models out of the box, making it a flexible choice for developers who want agentic coding capabilities without cloud API dependencies.

The setup involves installing OpenCode via a single bash script, writing a minimal JSON config file that specifies the Ollama API endpoint and model name, and launching it from the terminal. Mirza shows the Ctrl+P model-switcher interface, which lets users switch between any locally available Ollama models mid-session. He then demos practical tasks: listing directory contents via a shell command, summarizing a Python file, and attempting to open an HTML file in the browser — illustrating both OpenCode’s full terminal access and its built-in tooling capabilities.
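The setup described above can be sketched as a short shell session. Note that the install URL, the config file path, the exact JSON schema, and the `qwen3.5:35b` model tag are all assumptions for illustration (the video does not spell them out verbatim); consult the OpenCode and Ollama documentation for the authoritative forms.

```shell
# Hypothetical setup sketch. Install URL, config schema, and model tag
# are assumptions, not taken verbatim from the video.

# 1. Install OpenCode via its one-line install script (commented out here):
#    curl -fsSL https://opencode.ai/install | bash

# 2. Write a minimal JSON config pointing OpenCode at the local Ollama
#    OpenAI-compatible endpoint (Ollama listens on port 11434 by default).
mkdir -p "$HOME/.config/opencode"
cat > "$HOME/.config/opencode/opencode.json" <<'EOF'
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": { "qwen3.5:35b": {} }
    }
  }
}
EOF

# 3. Launch the agent from any project directory:
#    opencode
```

Because OpenCode treats Ollama as just another OpenAI-compatible provider, swapping to a different local model is a one-line change in the `models` map, which is what makes the Ctrl+P model switcher work across everything Ollama has pulled.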

The video is aimed at developers already comfortable with Ollama who want to replace or complement Claude Code with a self-hosted, open-source agentic coding environment. Mirza includes a brief security note about the risks of giving a terminal agent unrestricted system access, and the Qwen3.5 35B model — running at around 23 GB — is presented as a capable local backbone for the kinds of multi-step file and code tasks OpenCode is designed to handle.


📺 Source: Fahd Mirza · Published March 16, 2026
🏷️ Format: Tutorial Demo
