Model Deployment Error – “Must specify a chain type in config” while serving LLM-based Chatbot via Unity Catalog

Muhammad Salman Khan
2025-04-22T06:26:47.56+00:00

Hello Community,

I’m encountering an issue while deploying a simple LLM-based chatbot on Azure Databricks using Unity Catalog. The model works perfectly within the notebook — the retriever, prompt chain, and response generation all function correctly during development.

However, when I register the model in Unity Catalog and attempt to deploy it via Databricks Model Serving, the model status remains “Not Ready” and eventually fails with the following error:

“An error occurred while loading the model: Must specify a chain type in config.”

Here are some relevant details:

- The model registers successfully in Unity Catalog.
- The notebook used to log the model runs without error.
- Loading the same model back from Unity Catalog (even within the same notebook) throws the same "chain type" error.
- I've tried explicitly defining and passing the model config (including the chain type), but it makes no difference.
- In the serving logs, the failure ultimately surfaces as a timeout during sub-entity creation on the endpoint update.
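For context on where the message comes from: this wording matches LangChain's chain-loading check, which refuses any serialized chain config that lacks a `_chain_type` key — typically because the chain class that was logged does not support serialization of its type. The sketch below is a simplified illustration of that check (an assumption about the mechanism, not the actual library source; the function name and config keys are illustrative):

```python
# Simplified sketch (illustrative, not LangChain's real source) of the check
# that produces "Must specify a chain type in config": the loader rejects any
# serialized config that is missing a "_chain_type" key.

def load_chain_from_config(config: dict) -> str:
    """Mimic the loader's type check; the real loader dispatches to a chain class."""
    if "_chain_type" not in config:
        raise ValueError("Must specify a chain type in config.")
    return config["_chain_type"]

# A config serialized from a chain that exposes its type loads fine:
print(load_chain_from_config({"_chain_type": "stuff_documents_chain"}))
# → stuff_documents_chain

# A config missing the key reproduces the serving-time error:
try:
    load_chain_from_config({"llm": {}})
except ValueError as err:
    print(err)
# → Must specify a chain type in config.
```

If that is the cause, the model loads fine in the notebook only because the live Python object is still in memory there; the error only appears once serving (or a fresh `load_model` call) has to reconstruct the chain from its serialized config.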

I've searched for similar issues online and found several unanswered threads. I would appreciate guidance on how to resolve this and get the model serving successfully.

Please have a look at this loom for further details: https://www.loom.com/share/13f1742310634d9aaa8bff1282e0a777?sid=eeb85327-36b1-4ba4-b1e6-aae816c01c14
