Pre-checks
What problem does this solve?
Azure OpenAI provides these credentials:
AZURE_OPENAI_API_KEY
AZURE_OPENAI_ENDPOINT
AZURE_API_VERSION
AZURE_DEPLOYMENT
The client library (for example, openai or litellm) then combines these values and calls the service at a URL such as https://AZURE_OPENAI_ENDPOINT/openai/deployments/AZURE_DEPLOYMENT/chat/completions?api-version=AZURE_API_VERSION, or uses the OpenAI-compatible base URL https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/.
However, when I enter the Base URL as https://YOUR_RESOURCE_NAME.openai.azure.com/v1, the system does not understand it and the request fails with a 404 against the path https://YOUR_RESOURCE_NAME.openai.azure.com/v1/chat/completions:
2026-04-29 10:58:20 | INFO | aa96d7be-1cc | app.core.middleware:dispatch:41 - <-- GET /api/agents/ 200 0.022s
2026-04-29 10:58:20 | INFO | aa96d7be-1cc | logging:callHandlers:1762 - 172.19.0.5:46060 - "GET /api/agents/?tenant_id=63181cbe-c021-428f-96d8-1ff23836fb37 HTTP/1.1" 200
2026-04-29 10:58:25 | INFO | 48030a80-10b | app.core.middleware:dispatch:28 - --> POST /api/enterprise/llm-test [client: 172.19.0.5]
2026-04-29 10:58:26 | INFO | 48030a80-10b | logging:callHandlers:1762 - HTTP Request: POST https://YOUR_RESOURCE_NAME.openai.azure.com/v1/chat/completions "HTTP/1.1 404 Resource Not Found"
2026-04-29 10:58:26 | INFO | 48030a80-10b | app.core.middleware:dispatch:41 - <-- POST /api/enterprise/llm-test 200 0.282s
2026-04-29 10:58:26 | INFO | 48030a80-10b | logging:callHandlers:1762 - 172.19.0.5:46068 - "POST /api/enterprise/llm-test HTTP/1.1" 200
Proposed solution
My solution was to insert the /openai path segment so the final URL resolves correctly. It was confusing at first when I omitted it. I can see in the code that you are customizing the OpenAI-compatible setup; you could follow the parameters that LiteLLM or LangChain use for Azure.
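To make the difference concrete, here is a small sketch contrasting the two URL shapes Azure accepts. The function names and the example values are mine for illustration, not taken from the codebase:

```python
def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    """Classic deployment-style chat completions URL."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )


def azure_v1_base_url(resource: str) -> str:
    """OpenAI-compatible base URL for Azure.

    The "/openai" segment is required: a base URL of just
    https://<resource>.openai.azure.com/v1 makes the client request
    /v1/chat/completions, which Azure answers with 404 Resource Not Found,
    as shown in the logs above.
    """
    return f"https://{resource}.openai.azure.com/openai/v1/"
```

Passing the second form as the Base URL (e.g. to an OpenAI-compatible client's base_url parameter) avoids the 404, since the client appends chat/completions under /openai/v1/.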
Resources:
https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/responses
https://learn.microsoft.com/en-us/azure/foundry/openai/latest
https://docs.litellm.ai/docs/providers/azure/
https://docs.litellm.ai/docs/providers/azure/azure_responses
https://docs.langchain.com/oss/python/integrations/llms/azure_openai
Willing to contribute?