Integration · LLM providers

30+ LLM providers, one toolbelt.

VulpineOS exposes a 36-tool MCP toolbelt that any model client can drive — Claude, GPT, Gemini, open-weights models, hosted aggregators, local runners. Bring your own keys; VulpineOS doesn't proxy your LLM traffic.

MCP-native · Bring your own keys · No proxy, no token surcharge
01 · How the toolbelt works

Every VulpineOS agent operates a Camoufox browser session through a Model Context Protocol server. The server exposes 36 typed tools: navigation, click, type, snapshot, find, verify, page-settled, fill-form, select-option, human-realism inputs, scripting, and more.

Any LLM client that speaks MCP can drive it. For clients that don't speak MCP natively, the Foxbridge translation layer exposes the same surface over Chrome DevTools Protocol.
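For clients that speak MCP directly, a tool invocation is a JSON-RPC 2.0 `tools/call` request. A minimal sketch of the envelope follows — the tool name `navigate` and its arguments are illustrative placeholders, not the runtime's verified schema; see the MCP Reference for the actual tool definitions.

```python
import json

def make_tool_call(call_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP `tools/call` request as a JSON-RPC 2.0 message string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: ask the toolbelt to navigate (tool name and argument are illustrative).
msg = make_tool_call(1, "navigate", {"url": "https://example.com"})
```

Any transport that can carry these messages to the server — stdio, SSE, or Foxbridge's CDP translation — can drive the same 36 tools.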

02 · Supported providers

The list is curated, not exhaustive — VulpineOS ships with 30+ providers pre-configured. Anything OpenAI-compatible can be wired up in seconds.

Frontier

The headline frontier-lab models, end to end.

  • Anthropic: Claude Sonnet, Opus, Haiku
  • OpenAI: GPT-4o, GPT-5, o-series
  • Google: Gemini 2.x, Gemini Pro, Gemini Flash
  • xAI: Grok
Open weights · hosted

Open-source models served by their original labs or hosts.

  • DeepSeek
  • Mistral
  • Cohere
  • Qwen
Inference-optimized

Speed-first hosts for latency-sensitive agent loops.

  • Groq
  • Cerebras
  • Together AI
  • Fireworks
Aggregators

Routers and marketplaces that fan out across many backends.

  • OpenRouter
  • HuggingFace
  • Replicate
  • Anyscale
Local

Run models on your own hardware. No API key required.

  • Ollama
  • LM Studio
  • vLLM
  • LocalAI
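"Anything OpenAI-compatible" in practice means swapping the base URL on a single client. A sketch of that routing, assuming the providers' publicly documented OpenAI-compatible endpoints (verify each against the provider's own docs before relying on it):

```python
# Map provider names to their OpenAI-compatible base URLs.
# Endpoints shown are the providers' publicly documented paths at time of
# writing; local runners like Ollama need no API key at all.
OPENAI_COMPATIBLE = {
    "groq": "https://api.groq.com/openai/v1",
    "together": "https://api.together.xyz/v1",
    "ollama": "http://localhost:11434/v1",  # local, no key required
}

def base_url(provider: str) -> str:
    """Return the OpenAI-compatible endpoint for a known provider."""
    try:
        return OPENAI_COMPATIBLE[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None
```

The same pattern is why aggregators and local runners slot in without code changes: the client stays constant and only the endpoint and key differ.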
03 · BYO keys, no proxy

VulpineOS doesn't sit between you and the model provider. You configure your own API keys in the runtime config, and the agent talks to the provider directly. We don't see, log, or charge for tokens — your billing is between you and the provider.

This is a deliberate choice. Hosted runtimes that proxy LLM calls add a third party to your data path. The VulpineOS open-source runtime keeps that path local, even when running fleets of hundreds of agents.

04 · Setup

Configuration lives in ~/.vulpineos/config.json or the panel's settings tab. Pick a provider, paste a key, choose a model. Every agent spawned afterward uses the configured stack. Per-agent overrides are supported when you want a specific model for a specific task.

The first-launch wizard walks through provider selection automatically — the runtime won't spawn agents until at least one provider is configured.
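As a rough sketch of what such a config might look like — every key and value below is illustrative, not the runtime's actual schema; consult the provider configuration docs for the real field names:

```json
{
  "provider": "anthropic",
  "model": "claude-sonnet",
  "api_key_env": "ANTHROPIC_API_KEY",
  "agents": {
    "research-agent": { "provider": "ollama", "model": "qwen2.5" }
  }
}
```

Referencing keys via environment variables rather than inlining them keeps the config file safe to commit or share.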

05 · Resources

  • Provider configuration

    docs.vulpineos.com — full list of supported providers and configuration paths.

  • MCP toolbelt reference

    docs.vulpineos.com/mcp-tools — every tool, every parameter.

  • OpenClaw integration

    /integrations/openclaw — the bundled agent loop that drives the toolbelt by default.

Drive Camoufox from any model.

Self-host the runtime and point your model client at the bundled MCP server. No vendor lock-in, no proxy, no token surcharge.
