Error 400: Bad Request with GPT-o1 on Azure AI Foundry

Christian Mittler · 2025-04-22T14:18:45.43+00:00

I just deployed GPT-o1 in Azure AI Foundry. It works fine in a simple playground environment, but when I try to use it in a Prompt Flow, the following error is returned:

OpenAI API hits BadRequestError: Error code: 400 - {'error': {'code': 'BadRequest', 'message': 'Model {modelName} is enabled only for api versions 2024-12-01-preview and later'}} [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]

This happens in every Prompt Flow configuration I have tried: I tested multiple standard template flows (including the basic chat template) and confirmed that the same flows work with other models such as GPT-4o and DeepSeek R1.
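For comparison, a direct call to the same deployment outside Prompt Flow does not hit this error when a newer API version is pinned. This is only a minimal sketch; the endpoint, key, and deployment name are placeholders, and the version string is taken from the error message above:

```python
# Minimal sketch (placeholder endpoint, key, and deployment name).
# Calls the o1 deployment directly with the API version the error message asks for.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder endpoint
    api_key="<your-api-key>",                                    # placeholder key
    api_version="2024-12-01-preview",  # version named in the 400 error
)

response = client.chat.completions.create(
    model="<your-o1-deployment-name>",  # placeholder deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

So the problem appears to be that the Prompt Flow connection is using an older API version than the one the o1 deployment requires, rather than an issue with the deployment itself.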

Azure OpenAI Service
