Error 400: Bad Request with GPT-o1 on Azure AI Foundry
I just deployed GPT-o1 in Azure AI Foundry, and it works fine in a simple playground environment. However, when I try to use it in a Prompt Flow, the following error is returned:
OpenAI API hits BadRequestError: Error code: 400 - {'error': {'code': 'BadRequest', 'message': 'Model {modelName} is enabled only for api versions 2024-12-01-preview and later'}} [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]
This seems to happen in all Prompt Flow configurations. I have already tested multiple standard template flows (including the basic chat template) and validated my setup with other models such as GPT-4o and DeepSeek R1.
Azure OpenAI Service
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-22T16:04:15.24+00:00 The error message explicitly indicates that the GPT-o1 model requires API version 2024-12-01-preview or later, but the Prompt Flow you're using is calling an older API version, which is not supported for GPT-o1.
Steps to fix it:
Step 1: Update the API Version in Prompt Flow
In Azure AI Foundry Prompt Flow, you need to explicitly set the correct OpenAI API version. Here’s how:
1. Open your Prompt Flow YAML or Flow Configuration.
2. Locate the section where the model is defined. It will look something like:
```yaml
model:
  provider: azure-openai
  model_name: gpt-o1
  deployment_name: your-deployment-name
  api_version: 2023-06-01-preview  # outdated
```
3. Update the api_version to:
```yaml
api_version: 2024-12-01-preview  # correct version for GPT-o1
```
Step 2: Save and Rerun Your Flow
· Save your changes and re-run the Prompt Flow.
· This should resolve the 400 Bad Request error.
· GPT-o1 has advanced capabilities and requires features introduced with the 2024-12-01-preview API version.
· Prompt Flow or your deployment might default to an older API version if not explicitly set.
· Always check __Azure OpenAI API version compatibility__ for new models.
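To see where the API version actually enters the request, it helps to remember that Azure OpenAI carries it as a query parameter on the endpoint URL. A minimal sketch (resource and deployment names are placeholders):

```python
# Sketch: Azure OpenAI passes api-version as a query parameter on the
# deployment's chat/completions URL. All names below are placeholders.
def chat_completions_url(resource: str, deployment: str, api_version: str) -> str:
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

print(chat_completions_url("my-resource", "o1", "2024-12-01-preview"))
```

If the URL your flow ultimately calls carries an older `api-version`, you will hit exactly this 400 regardless of what the flow YAML says.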
Hope this helps, do let me know if you have any further queries.
Thank you!
-
Christian Mittler • 5 Reputation points
2025-04-23T14:22:47.7933333+00:00 Thank you very much for your quick answer @Prashanth Veeragoni !
I tried your solution in different ways, but it still returned the same error. The flow.dag.yaml for the basic chatting prompt flow looks like this:
```yaml
id: template_chat_flow
name: Template Chat Flow
inputs:
  chat_history:
    type: list
    is_chat_input: false
    is_chat_history: true
  question:
    type: string
    is_chat_input: true
outputs:
  answer:
    type: string
    reference: ${chat.output}
    is_chat_output: true
nodes:
- name: chat
  type: llm
  source:
    type: code
    path: chat.jinja2
  inputs:
    deployment_name: o1
    temperature: 0.7
    top_p: 1
    max_tokens: 256
    question: ${inputs.question}
    chat_history: ${inputs.chat_history}
  provider: AzureOpenAI
  connection: ai-myconnection12356789_aoai
  api: chat
  module: promptflow.tools.aoai
  use_variants: false
node_variants: {}
environment:
  python_requirements_txt: requirements.txt
```
I already tried inserting api_version: 2024-12-01-preview in different locations, the most probable seemed to be the following:
```yaml
inputs:
  deployment_name: o1
  api_version: 2024-12-01-preview
  temperature: 0.7
```
I also tried the latest API version 2025-03-01-preview, sadly to no avail.
-
Saideep Anchuri • 6,690 Reputation points • Microsoft External Staff
2025-04-23T16:47:11.28+00:00 Here are some steps:
- Ensure that the API version you're using (`2024-12-01-preview` or `2025-03-01-preview`) is actually supported for your Azure OpenAI deployment.
- Verify that `connection: ai-myconnection12356789_aoai` is correctly configured and has the necessary permissions to access Azure OpenAI.
- Ensure that the structure of your `flow.dag.yaml` file is correct. It should have the necessary fields defined properly, and the `id` field should not be empty.
- Azure AI Foundry provides logs that can help identify the root cause. Look for specific error messages related to `InferencingClientCallFailed`.
Kindly refer to the links below: https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/flow-develop#create-and-develop-your-prompt-flow
https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/prompt-flow-troubleshoot
Thank You.
-
Christian Mittler • 5 Reputation points
2025-04-24T12:04:00.9733333+00:00 Thank you very much @Saideep Anchuri !
I checked everything and the o1 model seems to automatically generate endpoints for an older API version, as displayed in these parameters from the target URI: ?api-version=2025-01-01-preview. There does not seem to be an option in Azure AI Foundry to change this, is there anything I am missing?
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-24T17:23:29.0366667+00:00 Thanks for the detailed follow-up. You’re on the right track, and I can see exactly why it’s still failing.
In Prompt Flow, the api_version is not set inside the inputs block of the node.
Instead, it must be specified inside the connection configuration — because API versioning is controlled at the Azure OpenAI connection level, not per-node.
Solution: Update the Azure OpenAI Connection settings
Here's exactly how to do it:
Step 1: Locate your connection
Your connection is listed as:
connection: ai-myconnection12356789_aoai
Go to:
· Azure AI Studio → Prompt Flow
· In the left sidebar, click "Connections"
· Find your connection named ai-myconnection12356789_aoai
Step 2: Update the API version in the connection
· Open the connection details.
· Click Edit (or recreate if needed).
· In the API Version field, enter:
2024-12-01-preview
This step is critical. The Prompt Flow engine pulls the API version from the connection, not from the flow DAG YAML.
Step 3: Save and re-run the flow
Once the connection is updated:
· Go back to your Prompt Flow, and run it again.
· It should now be able to use GPT-o1 successfully.
Why This Works
Azure AI Foundry and Prompt Flow load credentials and config (like API version, endpoint, etc.) from the linked AzureOpenAI connection, not from the node-level YAML settings. That’s why injecting api_version into the node doesn’t take effect.
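For local Prompt Flow development, the connection definition itself can carry the API version. A sketch of what such a connection YAML might look like, assuming the `azure_open_ai` connection type of the `promptflow` SDK (all names and values are placeholders):

```yaml
# connection.yaml — hypothetical values; create with:
#   pf connection create --file connection.yaml
name: ai-myconnection12356789_aoai
type: azure_open_ai
api_key: "<your-api-key>"
api_base: "https://<your-resource-name>.openai.azure.com/"
api_version: "2024-12-01-preview"
```

Whether the hosted Azure AI Foundry UI exposes the same `api_version` field is a separate question, as the follow-up below shows.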
Hope this helps.
Thank you!
-
Christian Mittler • 5 Reputation points
2025-04-25T09:19:41.06+00:00 Thank you for your answer @Prashanth Veeragoni !
I checked your suggestions, but there is no menu item called "Connections", as shown in the first screenshot:
There is a menu item called "Connected resources" in the Management Center, but the edit dialog for these connections has no option to specify the API version, as shown in the second screenshot:
This problem does not seem to be documented anywhere yet, could it be that it is actually a bug with the AI Foundry platform?
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-25T16:54:42.5633333+00:00 Thanks for sharing the screenshots and detailed background — this issue is indeed subtle and not well documented yet. You're encountering this error:
Error 400: Bad Request ... 'Model {modelName} is enabled only for API versions 2024-12-01-preview and later'
Even after specifying the correct api_version in the flow.dag.yaml, it still fails when triggered via Prompt Flow in Azure AI Foundry, while working fine in the Playground. This hints that Prompt Flow is not yet properly passing or honoring the api_version setting, particularly for newer models like GPT-o1.
Root Cause:
· The api_version for Azure OpenAI is not automatically inferred from the model version.
· The current Prompt Flow UI or YAML handling does not yet support injecting api_version at the required layer (the connection level).
· In the “Connected Resources” (screenshot 2), there is no field for api_version, which confirms that the Azure AI Foundry's Prompt Flow implementation is hardcoding or defaulting the API version, likely to a version older than 2024-12-01-preview.
Solution:
You cannot currently fix this through Prompt Flow UI alone, but you can override it via Python code using the @tool decorator or via a custom component where you manually call the OpenAI chat API and pass the correct API version.
Use a Custom Python Tool to Call Azure OpenAI API
Here’s a code snippet that creates a callable node that explicitly sets api_version.
```python
from promptflow import tool
from openai import AzureOpenAI

@tool
def chat_with_gpt_o1(question: str, chat_history: list) -> str:
    client = AzureOpenAI(
        api_key="YOUR_API_KEY",  # Use secrets in production
        api_version="2024-12-01-preview",
        azure_endpoint="https://<your-resource-name>.openai.azure.com/",
    )

    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    for chat in chat_history:
        messages.append({"role": chat["role"], "content": chat["content"]})
    messages.append({"role": "user", "content": question})

    # Note: some reasoning models reject temperature/top_p and expect
    # max_completion_tokens instead of max_tokens; drop or rename these
    # parameters if the service returns a 400 for them.
    response = client.chat.completions.create(
        model="o1",  # or your deployment name
        messages=messages,
        temperature=0.7,
        max_tokens=256,
        top_p=1,
    )
    return response.choices[0].message.content
```
Then use this tool in your Prompt Flow.
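Wiring the tool into the DAG could then look roughly like this (the file name `chat_with_gpt_o1.py` and node name are assumptions, not from the original flow):

```yaml
# Hypothetical node definition replacing the llm node in flow.dag.yaml:
nodes:
- name: chat
  type: python
  source:
    type: code
    path: chat_with_gpt_o1.py
  inputs:
    question: ${inputs.question}
    chat_history: ${inputs.chat_history}
```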
Hope this helps.
Thanks.
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-28T16:34:57.69+00:00 Following up to see if the above suggestion was helpful. If you have any further queries, do let me know.
Thank you!
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-29T17:00:25.94+00:00 We haven't heard back from you on the last response, so I'm checking in to see whether you have a resolution yet. If you do, please share it with the community, as it may help others. Otherwise, we will follow up with more details and try to help.
Thank you!
-
Christian Mittler • 5 Reputation points
2025-04-30T08:18:38.25+00:00 Thank you very much @Prashanth Veeragoni !
Your solution solved the problem, although it should only be a temporary workaround until the underlying issue is resolved.
However, another issue has appeared: I currently cannot deploy any Prompt Flows at all. The platform returns the following error:
The request is invalid. Status: 400 (The request is invalid.) ErrorCode: BadRequest
Content:
```json
{
  "error": {
    "code": "BadRequest",
    "message": "The request is invalid.",
    "details": [
      {
        "code": "InferencingClientCallFailed",
        "message": "{\"error\":{\"code\":\"Validation\",\"message\":\"{\\\"errors\\\":{\\\"\\\":[\\\"This endpoint has not been created successfully or is in deleting provisioning state. Please recreate endpoint and then try again to create deployment.\\\"]},\\\"type\\\":\\\"https://tools.ietf.org/html/rfc9110#section-15.5.1\\\",\\\"title\\\":\\\"One or more validation errors occurred.\\\",\\\"status\\\":400,\\\"traceId\\\":\\\"00-21faad9a571c440e86211a3746584f41-2ce498690ea2653f-01\\\"}\"}}",
        "details": [],
        "additionalInfo": []
      }
    ],
    "additionalInfo": [
      { "type": "ComponentName", "info": { "value": "managementfrontend" } },
      { "type": "Correlation", "info": { "value": { "operation": "21faad9a571c440e86211a3746584f41", "request": "f41f8996739e347c" } } }
    ]
  }
}
```
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-04-30T18:47:33.53+00:00 This new error clearly points to a platform-side issue with the Azure AI Endpoint — not the Prompt Flow itself.
The key part of the error is: "This endpoint has not been created successfully or is in deleting provisioning state. Please recreate endpoint and then try again to create deployment."
What This Means:
· The endpoint backing your Prompt Flow is either:
o Still provisioning,
o Failed during creation,
o Or is stuck in a deleting state and not yet cleaned up.
Prompt Flow is trying to deploy or call a managed endpoint, but the endpoint is not usable.
Solution Steps:
Step 1: Go to Azure Portal → Azure Machine Learning → Endpoints
· Check if the endpoint is:
o Stuck (status = deleting, creating, or failed)
o Or deleted but still listed
Step 2: Try Deleting the Broken Endpoint
· Manually delete the problematic endpoint.
· If it is stuck in "deleting" and cannot be deleted manually:
o Wait 15–20 minutes — Azure may take time to finish cleanup.
o Alternatively, use Azure CLI to force the deletion:
```shell
az ml online-endpoint delete --name <your-endpoint-name> --resource-group <your-rg> --workspace-name <your-ws>
```
Step 3: Recreate the Endpoint
· Once it's fully deleted, redeploy your Prompt Flow or model.
· Make sure the deployment target is healthy (compute instance, AKS, etc.)
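The triage in the steps above can be sketched as a small helper. This is illustrative only; the state names mirror common Azure ML provisioning states, and the mapping is an assumption, not an official API:

```python
# Illustrative only: map an endpoint's provisioning state (as reported by
# the portal or `az ml online-endpoint show`) to a suggested next step.
def next_step(provisioning_state: str) -> str:
    state = provisioning_state.lower()
    if state == "succeeded":
        return "redeploy"              # endpoint healthy, retry the deployment
    if state in ("creating", "deleting"):
        return "wait"                  # let Azure finish cleanup, then re-check
    if state in ("failed", "canceled"):
        return "delete-and-recreate"   # force-delete, then recreate
    return "inspect"                   # unknown state: check the activity log

print(next_step("Deleting"))  # wait
```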
Hope this helps.
Thank you!
-
Prashanth Veeragoni • 4,030 Reputation points • Microsoft External Staff
2025-05-02T17:47:53.33+00:00 Following up to see if the above suggestion was helpful. If you have any further queries, do let me know.
Thank you!