Azure AI Foundry stuck at running

Sangmin Lee 0 Reputation points
2025-05-01T14:04:07.8+00:00

I have a project with several prompt flows. Sometimes they work, and everything definitely worked fine in the beginning. After a while, though, a run gets stuck at "Running", and even if I leave it for around an hour it is still sitting at the LLM node.

When it works, it doesn't take more than 30 seconds.

Each flow is one LLM and one Python script to return the output in JSON format.

I've tried creating new projects and it's the same result.

What can I do?

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.
3,416 questions

1 answer

  1. Suwarna S Kale 2,511 Reputation points
    2025-05-01T15:13:58.86+00:00

    Hello Sangmin Lee,

    Thank you for posting your question in the Microsoft Q&A forum. 

    Prompt flows in Azure AI Foundry that intermittently hang in the "Running" state for extended periods suggest an underlying systemic issue rather than an isolated code problem. Since each flow consists of just one LLM call and one Python script, and since they previously worked reliably, the degradation points to throttling, resource-allocation constraints, or backend service instability. 

    First, check the Azure status dashboard for regional outages or degraded performance in AI services such as Azure OpenAI or Azure Machine Learning. Microsoft also imposes rate limits on the LLM APIs, which can cause queuing delays. If the status page shows no issues, review your application logs in Azure Monitor to identify failed or throttled requests. 

    Next, validate your Python script’s error handling. If the LLM response is malformed or times out, the script might hang indefinitely while waiting. Implement explicit timeouts and retry logic to prevent indefinite stalls. Additionally, test with a simplified flow (temporarily remove the Python step) to isolate whether the delay originates from the LLM node or the script. 
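    As a minimal sketch of the timeout-and-retry idea, you could wrap the LLM call in a helper like the one below. The function name `call_with_retries` and its parameters are illustrative, not part of any Azure SDK; adapt the exception types to whatever your client actually raises (e.g. the SDK's timeout/rate-limit errors).

    ```python
    import time

    def call_with_retries(fn, *, timeout_s=30, max_retries=3, backoff_s=2):
        """Call fn(timeout=...) and retry on timeouts with exponential backoff.

        fn is assumed to accept a `timeout` keyword and raise TimeoutError
        (or a transient error you map to it) when the call stalls.
        """
        for attempt in range(max_retries):
            try:
                return fn(timeout=timeout_s)
            except TimeoutError:
                if attempt == max_retries - 1:
                    raise  # give up after the last attempt instead of hanging
                time.sleep(backoff_s * (2 ** attempt))  # 2s, 4s, 8s, ...
    ```

    The key point is that the flow fails fast with a clear error after a bounded number of attempts, instead of sitting in "Running" forever.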

    If the above answer helped, please do not forget to "Accept Answer", as this may help other community members facing a similar issue. Your contribution to the Microsoft Q&A community is highly appreciated. 

