feat: add OpenRouter provider support #71

Open
mossishahi wants to merge 2 commits into quadbio:main from mossishahi:feat/openrouter-provider-support

Conversation

@mossishahi

Add OpenRouter as a first-class LLM provider alongside OpenAI, Gemini, and Anthropic. OpenRouter aggregates models from many upstream providers behind a single API key, enabling access to a wide range of models (e.g. openai/gpt-4o-mini, anthropic/claude-3.5-sonnet, deepseek/deepseek-v3.2).

Changes:

  • Register 'openrouter' in supported_providers and default_models.
  • Add OpenRouterProvider (extends OpenAIProvider with custom base_url).
  • Add OPENROUTER_API_KEY to APIKeyManager config.
  • Auto-detect OpenRouter from slash-style model IDs (e.g. provider/model).
  • Accept any listed OpenRouter model in test_query (catalog-based check).
  • Add structured-output JSON fallback for models that don't support OpenAI's .parse() endpoint (with text-repair recovery path).
  • Fix BaseAnnotator.query_llm signature to accept agent_description kwarg.
  • Update README with OpenRouter setup and usage documentation.
  • Add tutorial notebook: 110_openrouter_sample_annotation.ipynb.
  • Extend tests for providers, API keys, model detection, and integration.
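The provider itself is thin: OpenRouter exposes an OpenAI-compatible API, so the new class reuses the OpenAI client logic with a different base URL. A minimal sketch of that idea (class and attribute names are illustrative, not the actual implementation):

```python
# Sketch only: OpenRouterProvider reuses OpenAI-compatible client setup,
# pointed at OpenRouter's endpoint. Names here are assumptions.
class OpenAIProvider:
    def __init__(self, api_key: str, base_url: str = "https://api.openai.com/v1"):
        self._api_key = api_key
        self._base_url = base_url


class OpenRouterProvider(OpenAIProvider):
    def __init__(self, api_key: str):
        # Single OpenRouter key unlocks models from many upstream providers.
        super().__init__(api_key, base_url="https://openrouter.ai/api/v1")
```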

@mossishahi mossishahi force-pushed the feat/openrouter-provider-support branch 2 times, most recently from 221ca47 to dcc6bf6 Compare March 23, 2026 19:16
@Marius1311 (Member) left a comment:

Looks great, thanks for the contribution! Just a few changes.

```python
    provider = "anthropic"
elif "/" in model and not model_lower.startswith("models/"):
    # OpenRouter commonly uses '<provider>/<model>' naming.
    provider = "openrouter"
```
@Marius1311 (Member) commented:

This part I don't get - if it does NOT start with "models/", does that tell us anything?

@mossishahi (Author) replied:

@Marius1311 The `not model_lower.startswith("models/")` check exists because Gemini model names use the `models/` prefix (e.g. `models/gemini-1.5-flash`), which also contains a `/`. Without this guard, Gemini model names would incorrectly match as OpenRouter. I've added this explanation to the code comments as well.
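To make the ordering concrete, the detection logic can be sketched as a standalone function (the function name and the exact prefix checks are assumptions, not the real code):

```python
def detect_provider(model: str) -> str:
    """Guess the LLM provider from a model identifier (illustrative sketch)."""
    model_lower = model.lower()
    if model_lower.startswith(("gpt", "o1")):
        return "openai"
    if model_lower.startswith(("gemini", "models/")):
        # Gemini IDs like 'models/gemini-1.5-flash' contain '/' too,
        # so they must be resolved before the OpenRouter slash check.
        return "gemini"
    if model_lower.startswith("claude"):
        return "anthropic"
    if "/" in model:
        # OpenRouter commonly uses '<provider>/<model>' naming.
        return "openrouter"
    raise ValueError(f"Could not detect a provider for model '{model}'")
```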

```python
except openai.OpenAIError as e:
    raise e
logger.debug(
    "Structured parse failed for model '%s'. Falling back to JSON-mode query. Error: %s", model, str(e)
)
```
@Marius1311 (Member) commented:

@mossishahi, I broadly understand that this is about falling back to JSON mode when structured output fails or isn't supported. What's the motivation for including it here?

@mossishahi (Author) replied:

@Marius1311 OpenRouter routes requests to many upstream providers (Anthropic, DeepSeek, etc.) that expose an OpenAI-compatible API but don't all support the .parse() structured-output endpoint. Previously, when .parse() failed with an OpenAIError, the code re-raised and crashed, making those models unusable. The fallback catches that error, retries with a plain JSON-mode completion, and validates the response against the Pydantic schema. This way any OpenRouter model that can produce JSON works, even if it doesn't support strict structured outputs. For the native OpenAI, Gemini, and Anthropic providers, .parse() succeeds and the fallback is never reached.
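The shape of that fallback can be approximated with a stdlib-only sketch. In the real code the caught exception is openai.OpenAIError and validation uses a Pydantic model; here the two calls and the key-based check are stand-ins so the control flow is visible:

```python
import json


def query_with_json_fallback(structured_call, json_call, required_keys):
    """Try the structured-output path; on a provider error, retry with a
    plain JSON-mode completion and validate the raw text (illustrative).
    Real code catches openai.OpenAIError and validates via Pydantic."""
    try:
        return structured_call()
    except Exception:
        raw = json_call().strip()
        # Text-repair recovery: some models wrap their JSON in markdown fences.
        raw = raw.removeprefix("```json").removesuffix("```").strip()
        data = json.loads(raw)
        missing = [k for k in required_keys if k not in data]
        if missing:
            raise ValueError(f"JSON response missing fields: {missing}")
        return data
```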

```python
self._base_url = "https://openrouter.ai/api/v1"

# Optional headers recommended by OpenRouter for request attribution.
import os
```
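For context, the attribution headers OpenRouter documents are `HTTP-Referer` and `X-Title`. A hedged sketch of how they might be assembled (the header names follow OpenRouter's docs; the env-var and function names are assumptions):

```python
import os


def openrouter_attribution_headers(env=None):
    """Build the optional attribution headers OpenRouter recommends.
    Env-var names here are illustrative, not the actual configuration."""
    env = os.environ if env is None else env
    headers = {}
    if env.get("OPENROUTER_SITE_URL"):
        headers["HTTP-Referer"] = env["OPENROUTER_SITE_URL"]
    if env.get("OPENROUTER_APP_NAME"):
        headers["X-Title"] = env["OPENROUTER_APP_NAME"]
    return headers
```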
@Marius1311 (Member) commented:

Avoid local imports if not necessary

@mossishahi (Author) replied:

@Marius1311 Good catch; I've moved `import os` to the module top level to match the rest of the codebase.

```python
assert provider_with_key._api_key == "test-key"


class TestOpenRouterProvider:
```
@Marius1311 (Member) commented:

For this to work, I guess we would have to store an OpenRouter API key as a secret in the repo? I think that's how I do it for the other providers.

@mossishahi (Author) replied:

@Marius1311 Yes, exactly — the real-API tests are guarded by @pytest.mark.skipif(not os.getenv("OPENROUTER_API_KEY"), ...) and will be skipped unless the secret is configured. The unit tests (initialization, repr, factory pattern, error handling) work without any key. If you'd like to enable the real-API tests in CI, an OPENROUTER_API_KEY secret would need to be added to the repo (same pattern as the existing OPENAI_API_KEY, GEMINI_API_KEY, and ANTHROPIC_API_KEY secrets). Happy to leave that to your discretion.
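The guard pattern described looks roughly like this (marker and test names are illustrative):

```python
import os

import pytest

# Real-API tests are skipped unless the secret is present in the environment.
requires_openrouter_key = pytest.mark.skipif(
    not os.getenv("OPENROUTER_API_KEY"),
    reason="OPENROUTER_API_KEY not configured",
)


@requires_openrouter_key
def test_openrouter_real_query():
    # Would hit the live OpenRouter API; body omitted in this sketch.
    ...
```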

@mossishahi mossishahi force-pushed the feat/openrouter-provider-support branch from 02b5095 to a6a219f Compare March 24, 2026 10:23