How to access an LLM that is secured by an API GW with an API_KEY in Strands? #1695
rmunjuluri started this conversation in General
Replies: 1 comment
Is this API Gateway a proxy in front of the LLM? Does it expose the same API shape as OpenAI or Gemini? The answer depends on that. From Strands' perspective, the SDK would be calling the API Gateway; whatever happens behind it is largely irrelevant. So the question is whether the API shape and authorization match one of the existing model providers. If so, you can use the corresponding model provider and point it at the gateway's URL and API key. If not, you'd need to create a custom model provider.
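To make the first case concrete, here is a minimal sketch assuming the gateway is OpenAI-compatible and that Strands' OpenAI model provider forwards `client_args` (such as `api_key` and `base_url`) to the underlying OpenAI client. The gateway URL, environment variable name, and `model_id` below are placeholders, not values from this thread.

```python
import os

# Placeholder gateway settings; substitute your own API Gateway URL and key.
gateway_config = {
    "api_key": os.environ.get("GATEWAY_API_KEY", "dummy-key"),
    "base_url": "https://my-apigw.example.com/v1",  # hypothetical endpoint
}

try:
    # Assumed Strands API: OpenAIModel passes client_args through to the
    # OpenAI client, so base_url/api_key redirect traffic to the gateway.
    from strands import Agent
    from strands.models.openai import OpenAIModel

    model = OpenAIModel(
        client_args=gateway_config,
        model_id="gpt-4o",  # whatever model the gateway fronts
    )
    agent = Agent(model=model)
    # agent("Hello through the gateway")  # requires a live gateway
except ImportError:
    # strands is not installed here; gateway_config above is still the
    # shape of what you would pass to the provider.
    pass
```

If the gateway's shape or auth scheme (for example, an `x-api-key` header instead of a bearer token) does not match any built-in provider, the custom-model-provider route mentioned above is the fallback.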
We have an LLM from Gemini/OpenAI etc. that is secured behind an API Gateway and must be accessed with an API_KEY. This works fine for developing custom coding agents, sending JSON directly, and so on. How do I use such an LLM as one of the model providers, with the secured API_KEY and the URL exposed by the API Gateway? Any help is appreciated.
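For reference, the "sending JSON directly" setup described above typically looks like the sketch below: an OpenAI-style chat-completions POST to the gateway, authenticated with the key. The URL, header name, and key are hypothetical placeholders; the request is built but not sent, since that requires a live gateway.

```python
import json
import urllib.request

# Hypothetical gateway endpoint and key; replace with your own values.
url = "https://my-apigw.example.com/v1/chat/completions"
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hello"}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "x-api-key": "YOUR_API_KEY",  # or Authorization: Bearer ..., per gateway
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req)  # not executed here; needs a reachable gateway
```

Whether a built-in Strands provider can replace this hand-rolled call comes down to whether the gateway keeps this OpenAI-compatible shape end to end.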