Running your own AI? Archive it.

Self-hosted AI doesn't mean unregulated AI. Arc Bridge connects to your internal AI infrastructure and archives every interaction - the same compliance controls you'd get from an enterprise platform.

Self-hosted integrations

Arc Bridge adapts to your infrastructure: a proxy pattern for API-based tools, DB connectors for UI platforms.

LiteLLM

Live

Proxy integration - Arc Bridge sits in front of LiteLLM and captures all model calls across providers.

Ollama

Live

Proxy pattern - intercept local Ollama API traffic for archiving without touching your model setup.

Open WebUI

Live

DB connector pattern - archive conversations directly from Open WebUI's data layer.

vLLM

Coming Soon

Proxy pattern - transparent interception for high-performance vLLM inference servers.

Text Generation Inference

Coming Soon

Proxy pattern - archive HuggingFace TGI model interactions.

OpenRouter

Coming Soon

Gateway integration - capture routed model calls across multiple providers.

Two integration patterns

Proxy pattern

For API-based tools (Ollama, vLLM, LiteLLM). Arc Bridge acts as a transparent proxy - swap the endpoint URL and all traffic is captured.

Client → Arc Bridge → Ollama → Model
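The endpoint swap above can be sketched in a few lines. This is a minimal illustration, not Arc Bridge's actual configuration: the proxy host and port are placeholders, and the request shape is Ollama's standard /api/generate payload.

```python
# Sketch: routing an Ollama client through an archiving proxy.
# Only the base URL changes - the request body is untouched, which
# is what makes the proxy transparent to existing clients.

OLLAMA_DIRECT = "http://localhost:11434"    # Ollama's default API port
ARC_BRIDGE_PROXY = "http://localhost:8090"  # placeholder proxy address

def generate_request(base_url: str, model: str, prompt: str) -> dict:
    """Build an Ollama /api/generate call against the given endpoint."""
    return {
        "url": f"{base_url}/api/generate",
        "json": {"model": model, "prompt": prompt, "stream": False},
    }

direct = generate_request(OLLAMA_DIRECT, "llama3", "Hello")
proxied = generate_request(ARC_BRIDGE_PROXY, "llama3", "Hello")

# Identical payload, different endpoint: the only change clients see.
assert direct["json"] == proxied["json"]
assert direct["url"] != proxied["url"]
```

In practice the proxied request would be sent with any HTTP client (for example, `requests.post(**proxied)`), and the proxy forwards it to Ollama after recording it.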

DB connector pattern

For UI platforms (Open WebUI). Arc Bridge connects to the application's database and exports conversation records to the archive.

Open WebUI → DB ← Arc Bridge → Archive
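The connector side can be sketched as a read-and-flatten pass over the application's database. The schema below (a `chat` table holding a JSON blob of messages) is a simplified stand-in for illustration, not Open WebUI's exact schema - a real connector would adapt the query to the deployed version.

```python
import json
import sqlite3

def export_conversations(conn: sqlite3.Connection) -> list[dict]:
    """Read chat rows from the app database and flatten them into
    archive records. Assumes a simplified 'chat' table with a JSON
    column - adjust the query for your platform's actual schema."""
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT id, user_id, chat FROM chat").fetchall()
    return [
        {
            "conversation_id": row["id"],
            "user_id": row["user_id"],
            "messages": json.loads(row["chat"])["messages"],
        }
        for row in rows
    ]

# Demo against an in-memory fixture database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chat (id TEXT, user_id TEXT, chat TEXT)")
conn.execute(
    "INSERT INTO chat VALUES (?, ?, ?)",
    ("c1", "u1", json.dumps({"messages": [{"role": "user", "content": "hi"}]})),
)
records = export_conversations(conn)
```

Because the connector only reads from the database, the UI platform keeps running untouched while records flow to the archive.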

Your models, your infrastructure, our archive

Keep full control of your AI stack. Arc Bridge just makes sure the records are there when you need them.