feat: add OpenRouter provider support #71
Marius1311 left a comment:
Looks great, thanks for the contribution! Just a few changes.
    provider = "anthropic"
elif "/" in model and not model_lower.startswith("models/"):
    # OpenRouter commonly uses '<provider>/<model>' naming.
    provider = "openrouter"
This part I don't get - if it does NOT start with "models/", does that tell us anything?
@Marius1311 The not model_lower.startswith("models/") check exists because Gemini models use the models/ prefix (e.g. models/gemini-1.5-flash), which also contains a /. Without this guard, Gemini model names would incorrectly match as OpenRouter. I've added this to the comments in the code as well.
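To illustrate why the guard matters, here is a minimal sketch of the detection logic described above. The function name detect_provider and the exact prefix checks for the other providers are assumptions for illustration, not the PR's actual code; only the slash-plus-models/ guard mirrors the snippet under review.

```python
def detect_provider(model: str) -> str:
    """Guess the provider from a model identifier (illustrative sketch)."""
    model_lower = model.lower()
    if model_lower.startswith("gpt-"):
        return "openai"
    if model_lower.startswith("claude"):
        return "anthropic"
    if "/" in model_lower and not model_lower.startswith("models/"):
        # OpenRouter commonly uses '<provider>/<model>' naming, but Gemini
        # also uses '/' in qualified names like 'models/gemini-1.5-flash',
        # so the 'models/' guard keeps those from matching here.
        return "openrouter"
    if "gemini" in model_lower:
        return "gemini"
    raise ValueError(f"Cannot detect provider for model '{model}'")
```

Without the `not model_lower.startswith("models/")` clause, `models/gemini-1.5-flash` would hit the slash branch and be routed to OpenRouter.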
except openai.OpenAIError as e:
    raise e
logger.debug(
    "Structured parse failed for model '%s'. Falling back to JSON-mode query. Error: %s", model, str(e)
@mossishahi, I broadly understand that this is about falling back to JSON mode when structured output fails or isn't supported. What's the motivation for including it here?
@Marius1311 OpenRouter routes requests to many upstream providers (Anthropic, DeepSeek, etc.) that expose an OpenAI-compatible API but don't all support the .parse() structured-output endpoint. When .parse() fails with an OpenAIError, the original code re-raised and crashed — making those models unusable. The fallback catches that error, retries with a plain JSON-mode completion, and validates the response against the Pydantic schema. This way any OpenRouter model that can produce JSON works, even if it doesn't support strict structured outputs. For native OpenAI/Gemini/Anthropic providers, .parse() succeeds and the fallback is never reached.
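The fallback described above can be sketched as follows. This is a simplified stand-in, not the PR's actual code: parse_fn stands in for the structured .parse() call, json_fn for a plain JSON-mode completion, and validate for Pydantic validation; RuntimeError stands in for openai.OpenAIError so the sketch runs without the openai package.

```python
import json

def query_with_fallback(parse_fn, json_fn, validate):
    """Try a structured .parse()-style call; on failure, retry with a
    plain JSON-mode completion and validate the result (sketch)."""
    try:
        return parse_fn()
    except RuntimeError:  # the real code catches openai.OpenAIError here
        # e.g. logger.debug("Structured parse failed ... falling back")
        raw = json_fn()
        try:
            return validate(json.loads(raw))
        except json.JSONDecodeError:
            # Text-repair recovery path: strip the markdown code fences
            # some models wrap around their JSON, then retry once.
            cleaned = raw.strip().removeprefix("```json").removesuffix("```").strip()
            return validate(json.loads(cleaned))
```

The key design point is that the fallback only engages on error, so providers whose .parse() succeeds never pay for the extra round trip.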
self._base_url = "https://openrouter.ai/api/v1"

# Optional headers recommended by OpenRouter for request attribution.
import os
Avoid local imports if not necessary
@Marius1311 Good catch — moved import os to the module top-level to match the rest of the codebase.
assert provider_with_key._api_key == "test-key"


class TestOpenRouterProvider:
For this to work, I guess we would have to store an OpenRouter API key as a secret in the repo? I think that's how I do it for the other providers.
@Marius1311 Yes, exactly — the real-API tests are guarded by @pytest.mark.skipif(not os.getenv("OPENROUTER_API_KEY"), ...) and will be skipped unless the secret is configured. The unit tests (initialization, repr, factory pattern, error handling) work without any key. If you'd like to enable the real-API tests in CI, an OPENROUTER_API_KEY secret would need to be added to the repo (same pattern as the existing OPENAI_API_KEY, GEMINI_API_KEY, and ANTHROPIC_API_KEY secrets). Happy to leave that to your discretion.
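For reference, the guard pattern described above looks roughly like this. The test names and class body are illustrative placeholders, not the PR's actual tests; only the skipif-on-missing-env-var idea is from the discussion.

```python
import os

import pytest

# Skip real-API tests unless the OPENROUTER_API_KEY secret is exported
# into the CI environment (same pattern as the other provider keys).
requires_openrouter = pytest.mark.skipif(
    not os.getenv("OPENROUTER_API_KEY"),
    reason="OPENROUTER_API_KEY not configured",
)


class TestOpenRouterProvider:
    def test_repr(self):
        # Unit tests like this one need no API key and always run.
        ...

    @requires_openrouter
    def test_real_query(self):
        # Only runs when the repo secret is available.
        ...
```

With this arrangement, contributors without the secret still get a green unit-test run, and the real-API tests light up automatically once the secret is added.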
Add OpenRouter as a first-class LLM provider alongside OpenAI, Gemini, and Anthropic. OpenRouter aggregates models from many upstream providers behind a single API key, enabling access to a wide range of models (e.g. openai/gpt-4o-mini, anthropic/claude-3.5-sonnet, deepseek/deepseek-v3.2).

Changes:
- Register 'openrouter' in supported_providers and default_models.
- Add OpenRouterProvider (extends OpenAIProvider with custom base_url).
- Add OPENROUTER_API_KEY to APIKeyManager config.
- Auto-detect OpenRouter from slash-style model IDs (e.g. provider/model).
- Accept any listed OpenRouter model in test_query (catalog-based check).
- Add structured-output JSON fallback for models that don't support OpenAI's .parse() endpoint (with text-repair recovery path).
- Fix BaseAnnotator.query_llm signature to accept agent_description kwarg.
- Update README with OpenRouter setup and usage documentation.
- Add tutorial notebook: 110_openrouter_sample_annotation.ipynb.
- Extend tests for providers, API keys, model detection, and integration.

Made-with: Cursor
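Because OpenRouter exposes an OpenAI-compatible API, the subclassing described in the changelist can be sketched very compactly. The OpenAIProvider stand-in below is an assumption made so the sketch is self-contained; the real base class in the codebase will differ.

```python
import os


class OpenAIProvider:
    # Minimal stand-in for the existing base class (assumption: the real
    # class builds an OpenAI client from _api_key and _base_url).
    _base_url = None

    def __init__(self, api_key=None):
        self._api_key = api_key or os.getenv("OPENAI_API_KEY")


class OpenRouterProvider(OpenAIProvider):
    """Only the key source and base URL change; everything else is
    inherited from the OpenAI-compatible base provider."""

    def __init__(self, api_key=None):
        super().__init__(api_key=api_key or os.getenv("OPENROUTER_API_KEY"))
        self._base_url = "https://openrouter.ai/api/v1"
```

This mirrors the `assert provider_with_key._api_key == "test-key"` check in the test diff: an explicit key wins, and the environment variable is only a fallback.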