ADF issues when extracting Snowflake data into Azure Blob Storage

Yi Han Wong
2025-04-29T17:17:06.04+00:00

Hi all - I have a pipeline that copies data from Snowflake into Azure Blob Storage, authenticated via a SAS token.

I have successfully run the pipeline in the past; however, after adding new source datasets I am getting the following error message:

ErrorCode=SnowflakeExportCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Snowflake Export Copy Command operation failed,Source=Microsoft.DataTransfer.Connectors.Snowflake,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to execute the query command during read operation.,Source=Microsoft.DataTransfer.Connectors.GenericAdoNet,''Type=Apache.Arrow.Adbc.C.CAdbcDriverImporter+ImportedAdbcException,Message=[Snowflake] 100089 (42501): Failed to access remote file: access denied. Please check your credentials,Source=Apache.Arrow.Adbc,'

Has anyone seen this before and can offer any helpful suggestions? I tried setting the following parameters in the source and tested it on a small table, but the pipeline is still failing:

[Screenshot: Copy activity source settings, with the Storage integration field empty and marked as Failed]


2 answers

  1. Chandra Boorla (Microsoft External Staff)
    2025-04-29T17:39:03.42+00:00

    @Yi Han Wong

    Thanks for sharing the details and screenshot.

    The error message you're seeing, "[Snowflake] 100089 (42501): Failed to access remote file: access denied. Please check your credentials", indicates that Snowflake was unable to access the Azure Blob Storage location during the export. From your screenshot, I can see that the "Storage integration" field is empty and marked as "Failed", which is likely the root cause.

    Here are a few suggestions to resolve the issue:

    Reconfigure the Storage Integration

    In the Source tab of your Copy activity, under Storage integration, select or create a Linked Service pointing to your Azure Blob Storage.

    Make sure it's using the correct SAS token, and that the token has Write, List, and Read permissions.

    After selecting the linked service, click Test connection to ensure it's valid.
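
    If the Storage integration field in the source is meant to reference a Snowflake storage integration (rather than the linked service itself), a minimal sketch of defining one on the Snowflake side is shown below; the integration name, tenant ID, storage account, and container are placeholders to replace with your own values:

    -- Hypothetical names/IDs; needs the CREATE INTEGRATION privilege (ACCOUNTADMIN by default).
    CREATE STORAGE INTEGRATION azure_blob_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'AZURE'
      ENABLED = TRUE
      AZURE_TENANT_ID = '<tenant-id>'
      STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/<container>/');
    -- Then run DESC STORAGE INTEGRATION azure_blob_int; and grant the listed
    -- multi-tenant app access to the storage account.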

    Check the SAS Token

    Ensure that the SAS token used in the Blob Storage linked service meets the following (a quick Snowflake-side check is sketched after this list):

    • Has not expired
    • Includes the necessary permissions: Read, Write, and List
    • Is valid for the correct container and storage account
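
    One way to test the token from the Snowflake side is to point a throwaway stage at the container with the same SAS token and list it; if the token has expired or is missing permissions, this fails with the same 100089 access-denied error. The account, container, and token below are placeholders:

    -- A temporary stage exists only for the current session, so nothing is left behind.
    CREATE TEMPORARY STAGE sas_check
      URL = 'azure://<account>.blob.core.windows.net/<container>'
      CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');
    -- Listing requires the List permission on the SAS token.
    LIST @sas_check;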

    Retry with a Small Dataset

    Once the storage integration is configured correctly, try running the pipeline again with a smaller table to validate the fix.

    Check Snowflake Role Permissions

    Verify that your Snowflake role has access to the new tables/schemas you’ve added.

    Lack of SELECT privileges could prevent Snowflake from reading the source data, even before the copy operation starts.
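
    For reference, a check along these lines could look like the sketch below; the role, database, and schema names are placeholders:

    -- Inspect what the role used by ADF can currently access.
    SHOW GRANTS TO ROLE my_adf_role;
    -- Grant read access on the newly added objects if it is missing.
    GRANT USAGE ON DATABASE my_db TO ROLE my_adf_role;
    GRANT USAGE ON SCHEMA my_db.my_schema TO ROLE my_adf_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA my_db.my_schema TO ROLE my_adf_role;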

    Review Azure Blob Firewall Settings

    If network rules are enabled on your storage account, make sure the necessary IP ranges or integration runtimes are whitelisted.

    Also confirm that "Allow access from Azure services" is enabled if applicable.
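
    Note that the export (a COPY INTO a blob location) runs from Snowflake itself, not from the ADF integration runtime, so a firewalled storage account may also need to allow Snowflake's network. On an Azure-hosted Snowflake account you can retrieve the subnet IDs to allow with the system function below; the exact network setup varies by region and deployment, so treat this as a starting point:

    -- Returns JSON that, on Azure, includes "snowflake-vnet-subnet-ids" to add
    -- under the storage account's virtual network rules.
    SELECT SYSTEM$GET_SNOWFLAKE_PLATFORM_INFO();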

    Validate the Snowflake Stage Configuration

    If you're using a named Snowflake stage:

    • Confirm it’s configured like:
    CREATE OR REPLACE STAGE my_stage
    URL='azure://<account>.blob.core.windows.net/<container>'
    CREDENTIALS=(AZURE_SAS_TOKEN='<sas-token>');
    
    • The SAS token should match the one used in your ADF linked service.
    • The stage must point to the correct container and have write access.
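
    Once the stage exists, a tiny manual export (stage and path names below are placeholders) can confirm write access independently of ADF; an expired or under-permissioned SAS token reproduces the same access-denied error here:

    -- Unload a single row to the stage to verify the token can write to the container.
    COPY INTO @my_stage/adf_write_test/
      FROM (SELECT CURRENT_TIMESTAMP() AS ts)
      FILE_FORMAT = (TYPE = CSV)
      OVERWRITE = TRUE;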

    I hope this information helps. Please do let us know if you have any further queries.

    Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.

    Thank you.


  2. Yi Han Wong
    2025-04-30T12:33:43.7333333+00:00

    After some troubleshooting, I was able to resolve this: the error was caused by an expired SAS token. After generating a new token, the pipeline ran successfully. Interestingly, it completed successfully even with the Storage Integration field left blank.

