
The open source
agentic harness

Run any model. Any provider. Your terminal.

Works with Llama, Qwen, DeepSeek, Mistral, Gemma, Phi

Inference Providers

Any provider.

Local runtimes, hosted inference, or a hybrid setup. Swap providers without changing how you work.

OpenRouter
Together AI
Ollama
Groq
Fireworks
Replicate
vLLM
SambaNova
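Provider swapping works because most of the services above (OpenRouter, Together AI, Groq, Fireworks, Ollama, vLLM) expose OpenAI-compatible chat endpoints: changing providers reduces to changing a base URL and a model name. A minimal sketch of that idea in Python; the endpoint URLs are the providers' public defaults, but the `chat_request` helper and its structure are illustrative, not Agent Open's actual API.

```python
# Sketch: one request shape, many interchangeable providers.
# URLs are the providers' documented OpenAI-compatible base paths.
PROVIDERS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "together":   "https://api.together.xyz/v1",
    "groq":       "https://api.groq.com/openai/v1",
    "fireworks":  "https://api.fireworks.ai/inference/v1",
    "ollama":     "http://localhost:11434/v1",  # local runtime
    "vllm":       "http://localhost:8000/v1",   # local vLLM server default
}

def chat_request(provider: str, model: str, prompt: str) -> dict:
    """Build the payload a harness would POST to {base_url}/chat/completions."""
    base_url = PROVIDERS[provider]
    return {
        "url": f"{base_url}/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Swapping providers changes the URL, not the shape of the request:
local = chat_request("ollama", "qwen2.5-coder", "fix this bug")
hosted = chat_request("groq", "llama-3.3-70b-versatile", "fix this bug")
```

The same pattern covers the hybrid setup mentioned above: route heavy tasks to a hosted provider and quick local iterations to Ollama or vLLM, with no change to the surrounding tooling.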

Why Agent Open

Not another wrapper.
A real harness.

01

Any Model

Local or hosted — Llama, Qwen, DeepSeek, Mistral. Your choice, always.

02

Any Provider

OpenRouter, Together, Ollama, vLLM. Plug into whatever you're already running.

03

Fully Open Source

No vendor lock-in. No telemetry. Fork it, extend it, own it.

04

Terminal Native

Built for how developers actually work. No browser tabs, no context switching.

Who it's for

Built for developers
who think for themselves.

01

Open Source Enthusiasts

Run frontier OSS models locally. Contribute back. Be part of a community that ships in the open.

02

Privacy-First Teams

Your code never leaves your machines. No cloud APIs required. Full control over your data.

03

Cost-Conscious Devs

Use cheap inference providers instead of $20/mo subscriptions. Same power, fraction of the cost.

The Thesis

"The best coding agents shouldn't require proprietary models or cloud lock-in."

Open-source models are catching up fast. Inference is becoming a commodity. The missing piece is a harness that treats every model as a first-class citizen — local or remote, large or small. That's what we're building.