Delta Lake change data feed on Synapse notebook

2025-05-02T06:06:55.0233333+00:00

Hi Team,
Could you please help us? We are trying to enable change data feed (CDF) on an existing Delta table, but it is not working. We used the steps below to enable it:

spark.sql("SET spark.databricks.delta.properties.defaults.enableChangeDataFeed = true")
spark.sql("ALTER TABLE delta.`abfss://******@storage.dfs.core.windows.net/test/incremental_test/` SET TBLPROPERTIES (delta.enableChangeDataFeed = true)")
df_changes = spark.read.format("delta") \
    .option("readChangeData", "true") \
    .load(f'abfss://******@storage.dfs.core.windows.net/test/incremental_test/')

**AnalysisException**: No startingVersion or startingTimestamp provided for CDC read.
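For reference, this is a minimal sketch of the read with an explicit starting point, which is what I understand the error is asking for (the path is a placeholder and the version number is hypothetical):

path = "abfss://<container>@<storage>.dfs.core.windows.net/test/incremental_test/"

# Find the version of the ALTER TABLE commit that enabled change data feed.
spark.sql(f"DESCRIBE HISTORY delta.`{path}`").select("version", "operation").show()

# A change-data read seems to require a startingVersion or startingTimestamp,
# and the starting version should be at or after the commit that enabled CDF.
df_changes = (
    spark.read.format("delta")
    .option("readChangeData", "true")
    .option("startingVersion", 5)  # hypothetical: replace with the version found above
    .load(path)
)
df_changes.show()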

After running the ALTER statement, the _change_type, _commit_version, and _commit_timestamp columns are still not being created in the Delta table. Can you please help me with the process and share detailed steps on how to enable change data feed, along with a screenshot of the table once it is enabled?
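From what I understand, the _change_type, _commit_version, and _commit_timestamp columns are never added to the table's own schema; they only appear in the output of a change-data read. A small sketch to verify the property actually took effect (the path is a placeholder):

path = "abfss://<container>@<storage>.dfs.core.windows.net/test/incremental_test/"

# Should list delta.enableChangeDataFeed -> true in the properties map.
spark.sql(f"DESCRIBE DETAIL delta.`{path}`").select("properties").show(truncate=False)

# A plain read of the table does not show the CDF columns.
spark.read.format("delta").load(path).printSchema()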

Can you please provide the whole flow as example code: create a dummy DataFrame, write it out as Delta, enable change data feed, and then read the change data feed (ideally without passing a starting version or starting timestamp)? Working sample code would help a lot!
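In case it helps frame the ask, this is the kind of end-to-end sketch I am after, written against a placeholder path (my understanding from the error above is that the reader does insist on a startingVersion or startingTimestamp, so the sketch passes version 0):

# Session default: only affects tables created after this point.
spark.conf.set("spark.databricks.delta.properties.defaults.enableChangeDataFeed", "true")

path = "abfss://<container>@<storage>.dfs.core.windows.net/test/cdf_demo/"

# 1. Dummy data written as a new Delta table; CDF is on because of the default above.
spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"]) \
    .write.format("delta").mode("overwrite").save(path)

# 2. A follow-up change so the feed has something to return.
spark.createDataFrame([(3, "c")], ["id", "value"]) \
    .write.format("delta").mode("append").save(path)

# 3. Read the change feed from version 0 (the table was created with CDF enabled,
#    so every version is covered).
changes = (
    spark.read.format("delta")
    .option("readChangeData", "true")
    .option("startingVersion", 0)
    .load(path)
)
changes.show()  # rows carry _change_type, _commit_version, _commit_timestamp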

Thanks in advance!

Azure Synapse Analytics
