Azure OpenAI batch jobs (GPT-4o and GPT-4o-mini) get status Failed with no errors on large requests
I'm creating batch jobs with gpt-4o and gpt-4o-mini on Global Batch deployments, using API version 2024-10-21.
Everything works fine when the JSONL file has at most 540 lines. However, when I send one with 1,000 lines, using the same logic and structure, the jobs reach AI Foundry but immediately fail without logging any errors; only the input_file is available.
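For reference, each line in the JSONL file follows the standard batch request shape; the deployment name below is just a placeholder for my Global Batch deployment:

```jsonl
{"custom_id": "task-001", "method": "POST", "url": "/chat/completions", "body": {"model": "my-gpt-4o-mini-batch", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "..."}]}}
```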
On the client side, the call fails with a Bad Request (System.ClientModel.ClientResultException: Service request failed. Status: 400 (Bad Request)).
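A minimal sketch of how I can catch the exception and dump the raw response body to get more detail than the generic message (CreateBatchJob is a hypothetical stand-in for the existing upload-and-create code):

```csharp
using System;
using System.ClientModel;
using System.ClientModel.Primitives;

try
{
    CreateBatchJob(); // hypothetical stand-in for the existing file upload + batch creation
}
catch (ClientResultException ex)
{
    // The raw 400 response body usually carries a JSON error object
    // saying exactly what the service rejected.
    PipelineResponse? response = ex.GetRawResponse();
    Console.WriteLine($"Status: {response?.Status}");
    Console.WriteLine(response?.Content.ToString());
}
```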
I submit the large file in exactly the same way as the 540-line files.
Does anyone know why this happens, given that the documented maximum number of requests per file is 100,000?
The file size barely reaches 1 MB, so that shouldn't be the issue either...
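In case it's relevant, a minimal sketch of how the failed batch can be retrieved directly to look for an errors object (resource endpoint, batch id, and key handling are placeholder values):

```csharp
using System;
using System.Net.Http;

// Placeholder values; the real endpoint and batch id come from my application.
var endpoint = "https://my-resource.openai.azure.com";
var batchId = "batch_abc123";
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY") ?? "";

using var http = new HttpClient();
http.DefaultRequestHeaders.Add("api-key", apiKey);

// A failed batch record should carry an "errors" collection explaining the failure.
var json = await http.GetStringAsync($"{endpoint}/openai/batches/{batchId}?api-version=2024-10-21");
Console.WriteLine(json);
```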