Multi-format LLM proxy — translates OpenAI Chat, OpenAI Responses & Anthropic Messages. Works as a Claude Code LLM gateway with full model picker support.
Updated May 4, 2026 - Python
A bridge connecting OpenAI-compatible APIs with tools that expect an Ollama API.
Zero-dependency proxy to run Claude Code via your GitHub Copilot subscription. Translates Anthropic Messages API ↔ OpenAI Chat Completions.
Localhost reverse proxy that bridges OpenAI Codex CLI with Z.ai's GLM Coding Plan API
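Several of the proxies listed above translate between the Anthropic Messages format and OpenAI Chat Completions. A minimal sketch of the request-side translation, assuming plain-dict payloads and a hypothetical `anthropic_to_openai` helper (not from any of the repos above):

```python
def anthropic_to_openai(req: dict) -> dict:
    """Translate an Anthropic Messages request body into OpenAI Chat format."""
    messages = []
    # Anthropic carries the system prompt as a top-level "system" field;
    # OpenAI Chat expects it as the first message in the list.
    if "system" in req:
        messages.append({"role": "system", "content": req["system"]})
    for msg in req.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten text blocks
        # into a single string for the OpenAI "content" field.
        if isinstance(content, list):
            content = "".join(b["text"] for b in content if b.get("type") == "text")
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": req["model"],
        "messages": messages,
        "max_tokens": req.get("max_tokens", 1024),
    }
```

A full gateway would also map tool-use blocks, streaming events, and stop reasons in both directions; this covers only the basic text-message shape.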