Azure.Storage.DataMovement SDK not working with File Share to Blob transfer

Dheeraj Awale 21 Reputation points
2025-04-14T10:47:04.4966667+00:00

I am trying to transfer data from an Azure File Share to an Azure Blob container using the Azure.Storage.DataMovement library. The code runs without any issues and the SDK even logs 'Data transfer completed successfully.', but the target blob container does not contain anything.

Below is the exact code I am using:

private async Task TransferFilesFromFileShareToBlob(string fileShareConnectionString, string fileShareName, string blobStorageConnectionString, string blobContainerName)
{
    // Instantiate clients for the Azure File Share and Blob Storage.
    var shareClient = new ShareClient(fileShareConnectionString, fileShareName);
    var rootDirectoryClient = shareClient.GetRootDirectoryClient();

    // Ensure the Blob container exists.
    BlobContainerClient blobContainerClient = new(blobStorageConnectionString, blobContainerName);
    var response = await blobContainerClient.CreateIfNotExistsAsync();

    // Configure TransferManager options if necessary.
    var transferManagerOptions = new TransferManagerOptions
    {
        // For example, set a concurrent transfer limit:
        MaximumConcurrency = 8,
    };

    // Create an instance of TransferManager.
    var transferManager = new TransferManager(transferManagerOptions);

    // Enumerate files in the root of the file share.
    await foreach (var item in rootDirectoryClient.GetFilesAndDirectoriesAsync())
    {
        if (item.IsDirectory)
        {
            string directoryName = item.Name;
            _logger.LogInformation($"Processing directory: {directoryName}");

            var subDirectoryClient = rootDirectoryClient.GetSubdirectoryClient(directoryName);
            await foreach (var subDirfile in subDirectoryClient.GetFilesAndDirectoriesAsync())
            {
                if (!subDirfile.IsDirectory)
                {
                    string fileName = subDirfile.Name;
                    _logger.LogInformation($"Transferring file: {fileName}");

                    // Get the source file client.
                    var sourceFileClient = subDirectoryClient.GetFileClient(fileName);
                    var sourceResource = ShareFilesStorageResourceProvider.FromClient(sourceFileClient);

                    //var destinationBlobClient = blobContainerClient.GetBlobClient(Path.Join(directoryName, fileName));
                    BlockBlobClient blockBlobClient = blobContainerClient.GetBlockBlobClient(fileName);
                    //if(await blockBlobClient.ExistsAsync() == false)
                    //{
                    //    using var emptyContent = new MemoryStream([]);
                    //    await blockBlobClient.UploadAsync(emptyContent);
                    //}

                    var destinationResource = BlobsStorageResourceProvider.FromClient(blockBlobClient);

                    var transfer = await transferManager
                                 .StartTransferAsync(sourceResource, destinationResource);
                    await transfer.WaitForCompletionAsync();

                    _logger.LogInformation($"File '{fileName}' transferred successfully.");
                }
            }
        }
    }

    _logger.LogInformation("Data transfer completed successfully.");
}

Here is how I fetch secrets and some relevant imports:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using Azure.Storage.DataMovement;
using Azure.Storage.DataMovement.Blobs;
using Azure.Storage.DataMovement.Files.Shares;
using Azure.Storage.Files.Shares;

string fileShareConnectionString = Environment.GetEnvironmentVariable("AzureFileShareConnectionString");
string fileShareName = Environment.GetEnvironmentVariable("FileShareName");
string blobStorageConnectionString = Environment.GetEnvironmentVariable("AzureBlobStorageConnectionString");
string blobContainerName = Environment.GetEnvironmentVariable("BlobContainerName");

2 answers

  1. Vinod Kumar Reddy Chilupuri 3,750 Reputation points Microsoft External Staff
    2025-04-14T13:57:54.6366667+00:00

    Hi Dheeraj Awale,

    It seems that your code is set up correctly for transferring files from an Azure File Share to an Azure Blob container using the Azure.Storage.DataMovement library. However, if the transfer reports success but the target blob container remains empty, there are a few things worth checking that may resolve the issue.

    1. File Paths

    Make sure the paths used for the destination blobs are correct. When creating the BlockBlobClient, include the directory structure in the blob name; if only the file name is used, every file ends up in the root of the container and the original folder structure from the file share is lost. To keep files in the intended paths within the blob container, use:

    BlockBlobClient blockBlobClient = blobContainerClient.GetBlockBlobClient($"{directoryName}/{fileName}");
    

    This ensures files are placed inside the appropriate directory structure.
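
    To verify where the files actually landed, you can list the container with the directory name as a prefix after the transfer. This is a minimal sketch that reuses the blobContainerClient and directoryName variables from your code:

    // Optional check: list blobs under the virtual directory to confirm where the files landed.
    await foreach (var blobItem in blobContainerClient.GetBlobsAsync(prefix: $"{directoryName}/"))
    {
        _logger.LogInformation($"Found blob: {blobItem.Name}");
    }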

    2. Permissions

    Verify that the service principal or identity used for authentication has the necessary permissions on both storage accounts. For the destination, the required role is typically "Storage Blob Data Contributor". Without the appropriate access rights, the transfer may not succeed even if the SDK reports that the operation completed.
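
    If you move to identity-based authentication instead of connection strings, that role assignment must be granted to the identity running the code. A minimal sketch, assuming Microsoft Entra ID authentication via the Azure.Identity package (the account and container names below are placeholders):

    // Minimal sketch: authenticate with DefaultAzureCredential instead of a connection string.
    // Requires the Azure.Identity package; replace the placeholder account/container names.
    using Azure.Identity;

    var credential = new DefaultAzureCredential();
    var blobContainerClient = new BlobContainerClient(
        new Uri("https://<storage-account>.blob.core.windows.net/<container>"),
        credential);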

    3. Logging

    While you have implemented logging, it is important to ensure that any exceptions or errors occurring during the transfer process are captured. Enhanced logging can provide insights into issues that may not be apparent from the success message alone, facilitating more effective troubleshooting.

    try
    {
        var transfer = await transferManager.StartTransferAsync(sourceResource, destinationResource);
        await transfer.WaitForCompletionAsync();
    }
    catch (Exception ex)
    {
        // Pass the exception object so the stack trace is captured, not just the message.
        _logger.LogError(ex, $"Error during file transfer: {ex.Message}");
    }
    

    4. TransferManager Configuration

    Confirming that the TransferManager is configured correctly is another important consideration. Depending on your specific use case, certain options may need to be adjusted to optimize performance or accommodate particular scenarios, such as handling large files or managing high concurrency.
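
    For example, per-transfer options can be tuned for large files. The following is a rough sketch assuming a recent 12.x version of Azure.Storage.DataMovement; the TransferOptions property names may differ slightly in older preview builds, so check the version you have installed:

    // Rough sketch (recent 12.x package assumed): tune transfer sizes for large files.
    var transferOptions = new TransferOptions
    {
        InitialTransferSize = 32 * 1024 * 1024,      // size of the first request per file
        MaximumTransferChunkSize = 32 * 1024 * 1024, // size of subsequent chunks
    };

    var transfer = await transferManager.StartTransferAsync(sourceResource, destinationResource, transferOptions);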

    5. Existing Blobs

    It is also worth checking for existing blobs in the destination container. If blobs with the same names already exist and the transfer is not set to overwrite them, those items can be skipped, so the new data never shows up where you expect it. Implement logic to handle existing blobs, whether by overwriting or skipping them, so that the intended files end up in the container.
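
    For example, building on the ExistsAsync call that is already commented out in your code, you could log when a destination blob is already present before starting the transfer:

    // Log whether the destination blob already exists, so overwritten or skipped
    // items are easier to spot when the container contents look wrong afterwards.
    if (await blockBlobClient.ExistsAsync())
    {
        _logger.LogWarning($"Blob '{fileName}' already exists in container '{blobContainerName}'.");
    }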

    6. Debugging

    Consider adding more detailed logging around the transfer process. This can help capture the state of the transfer, including any intermediate steps or failures, which can be invaluable for diagnosing issues.
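
    One concrete option is to inspect the transfer status after waiting for completion. This sketch assumes the 12.x GA API surface, where the status object exposes State, HasFailedItems, and HasSkippedItems (the names differ slightly in preview builds):

    // Inspect the final status instead of assuming success (12.x GA API surface assumed).
    await transfer.WaitForCompletionAsync();

    var status = transfer.Status;
    _logger.LogInformation(
        $"Transfer state: {status.State}, failed items: {status.HasFailedItems}, skipped items: {status.HasSkippedItems}");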

    These are the key areas to investigate when a transfer reports success but the blob container remains empty. Working through them should help you identify and resolve the underlying problem.

    Transfer data with the Data Movement library
    Use extension methods for BlobContainerClient

    Hope the above suggestions help! Please let us know if you have any further queries.

    Please consider clicking "Accept the answer" wherever the information provided helps you, as this can be beneficial to other community members.


  2. Venkatesan S 1,625 Reputation points Microsoft External Staff
    2025-04-17T08:49:08.27+00:00

    Hi @Dheeraj Awale

    I am trying to transfer data from azure file share to azure blob container by using Azure.Storage.DataMovement library. Code runs fine without any issues. SDK also notifies 'Data transfer completed successfully.' but target blob container does not contain anything.

    In the Azure Data Movement library, service-to-service transfers are performed through resource URIs. If the necessary permissions (for example, a SAS token) are not granted on the source URI, the file will not be copied to the destination path.

    You can use the code below, which transfers files from a File Share to a Blob container using the Data Movement library:

    Code:

    using Azure.Storage;
    using Azure.Storage.Blobs;
    using Azure.Storage.DataMovement;
    using Azure.Storage.DataMovement.Blobs;
    using Azure.Storage.DataMovement.Files.Shares;
    using Azure.Storage.Files.Shares;
    using Azure.Storage.Files.Shares.Models;
    using Azure.Storage.Sas;
    
    
    class Program
    {
        static async Task Main(string[] args)
        {
            string storageAccountName = "xxxx";
            string storageAccountKey = "xxxxx";
            string fileShareName = "xxx";
            string blobContainerName = "xxxx";
    
            string sasToken = GetAccountSasToken(storageAccountName, storageAccountKey);
    
            Uri fileShareUriWithSas = new Uri($"https://{storageAccountName}.file.core.windows.net/{fileShareName}{sasToken}");
            Uri blobContainerUriWithSas = new Uri($"https://{storageAccountName}.blob.core.windows.net/{blobContainerName}{sasToken}");
    
            var shareClient = new ShareClient(fileShareUriWithSas);
            var rootDirClient = shareClient.GetRootDirectoryClient();
    
            var blobContainerClient = new BlobContainerClient(blobContainerUriWithSas);
    
            var transferManager = new TransferManager(new TransferManagerOptions {
                MaximumConcurrency = 8,
            });
    
            await CopyDirectoryRecursiveAsync(rootDirClient, blobContainerClient, "", transferManager, sasToken);
    
            Console.WriteLine("All files and folders copied.");
        }
    
        static async Task CopyDirectoryRecursiveAsync(
            ShareDirectoryClient dirClient,
            BlobContainerClient blobContainerClient,
            string relativePath,
            TransferManager transferManager,
            string sasToken)
        {
            var fileProvider = new ShareFilesStorageResourceProvider();
            var blobProvider = new BlobsStorageResourceProvider();
    
            await foreach (ShareFileItem item in dirClient.GetFilesAndDirectoriesAsync())
            {
                string currentPath = string.IsNullOrEmpty(relativePath)
                    ? item.Name
                    : $"{relativePath}/{item.Name}";
    
                if (item.IsDirectory)
                {
                    var subDirClient = dirClient.GetSubdirectoryClient(item.Name);
                    await CopyDirectoryRecursiveAsync(subDirClient, blobContainerClient, currentPath, transferManager, sasToken);
                }
                else
                {
                    Console.WriteLine($"Copying: {currentPath}");
    
                    var fileClient = dirClient.GetFileClient(item.Name);
                    Uri sourceUri = new Uri($"{fileClient.Uri}");
                    Console.WriteLine(sourceUri);
                    Uri destUri = new Uri($"{blobContainerClient.GetBlobClient(currentPath).Uri}");
    
                    StorageResource source = await fileProvider.FromFileAsync(sourceUri);
                    StorageResource destination = await blobProvider.FromBlobAsync(destUri);
    
                    var transfer = await transferManager.StartTransferAsync(source, destination);
                    await transfer.WaitForCompletionAsync();
    
                    Console.WriteLine($"{currentPath} - Status: {transfer.Status.state}");
                }
            }
        }
    
        static string GetAccountSasToken(string accountName, string accountKey)
        {
            var credential = new StorageSharedKeyCredential(accountName, accountKey);
    
            var sasBuilder = new AccountSasBuilder
            {
                Services = AccountSasServices.Files | AccountSasServices.Blobs,
                ResourceTypes = AccountSasResourceTypes.Service | AccountSasResourceTypes.Container | AccountSasResourceTypes.Object,
                ExpiresOn = DateTimeOffset.UtcNow.AddHours(2),
                Protocol = SasProtocol.Https
            };
    
            sasBuilder.SetPermissions(AccountSasPermissions.Read | AccountSasPermissions.List | AccountSasPermissions.Write | AccountSasPermissions.Create);
    
            return "?" + sasBuilder.ToSasQueryParameters(credential).ToString();
        }
    }
    

    The above code recursively copies files from an Azure File Share to an Azure Blob Storage container using the Azure Data Movement library. It sets up SAS tokens for authentication and uses a TransferManager to handle the transfer with specified concurrency. The program iterates through directories and files in the source File Share, copying each to the destination Blob container.

    Output:

    Copying: sample/axxxxf
    https://venkat326123.file.core.windows.net/share1/sample/aml.gif?sv=2025-05-05&ss=bf&srt=sco&spr=https&se=2025-04-17T10%3A34%3A12Z&sp=rwlc&sig=redacted
    sxxxx - Status: Completed
    ......  
    ......
    Copying: xxxxx
    https://venkat326123.file.core.windows.net/share1/winlibxxxx?sv=2025-05-05&ss=bf&srt=sco&spr=https&se=2025-04-17T10%3A34%3A12Z&sp=rwlc&sig=redacted
    xxxxx - Status: Completed
    All files and folders copied.
    


    Hope this answer helps! Please let us know if you have any further queries; I'm happy to assist you further.

    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.

