APPLIES TO: Azure CLI ml extension v2 (current), Python SDK azure-ai-ml v2 (current)
In this article, you learn how to connect to Azure data storage services with Azure Machine Learning datastores.
Prerequisites
- An Azure subscription. If you don't have one, create a free account before you begin.
- An Azure Machine Learning workspace.
Note
Machine Learning datastores do not create the underlying storage account resources. Instead, they link an existing storage account for Machine Learning use. Machine Learning datastores aren't required. If you have access to the underlying data, you can use storage URIs directly.
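For instance, a job input can point at blob storage by URI, with no datastore registration involved. The following is a minimal sketch; the storage URI, environment name, and compute name are placeholder assumptions, not values from this article:

from azure.ai.ml import MLClient, Input, command

ml_client = MLClient.from_config()

# The input path is a direct storage URI (hypothetical account, container, and file),
# not an azureml://datastores/... reference.
job = command(
    command="head ${{inputs.raw_data}}",
    inputs={
        "raw_data": Input(
            type="uri_file",
            path="https://mytestblobstore.blob.core.windows.net/data-container/titanic.csv",
        )
    },
    environment="my-environment@latest",  # hypothetical registered environment
    compute="cpu-cluster",                # hypothetical compute target
)

ml_client.create_or_update(job)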
Create an Azure Blob datastore

The following example creates a credential-less Blob datastore that relies on identity-based data access. Fill in your own values:
from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="",  # name the datastore will have in the workspace
    description="",
    account_name="",  # storage account name
    container_name=""  # blob container name
)

ml_client.create_or_update(store)
To save an account key with the datastore instead, use AccountKeyConfiguration:

from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="blob_protocol_example",
    description="Datastore pointing to a blob container using https protocol.",
    account_name="mytestblobstore",
    container_name="data-container",
    protocol="https",
    credentials=AccountKeyConfiguration(
        account_key="aaaaaaaa-0b0b-1c1c-2d2d-333333333333"
    ),
)

ml_client.create_or_update(store)
To authenticate with a SAS token, use SasTokenConfiguration:

from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="blob_sas_example",
    description="Datastore pointing to a blob container using SAS token.",
    account_name="mytestblobstore",
    container_name="data-container",
    credentials=SasTokenConfiguration(
        sas_token="?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
    ),
)

ml_client.create_or_update(store)
To use the Azure CLI instead, create the following YAML file for a credential-less datastore (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: my_blob_ds # add your datastore name here
type: azure_blob
description: here is a description # add a datastore description here
account_name: my_account_name # add the storage account name here
container_name: my_container_name # add the storage container name here
Create the Machine Learning datastore in the Azure CLI:
az ml datastore create --file my_blob_datastore.yml
For account key authentication, create this YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: blob_example
type: azure_blob
description: Datastore pointing to a blob container.
account_name: mytestblobstore
container_name: data-container
credentials:
  account_key: aaaaaaaa-0b0b-1c1c-2d2d-333333333333
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_blob_datastore.yml
For SAS token authentication, create this YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: blob_sas_example
type: azure_blob
description: Datastore pointing to a blob container using SAS token.
account_name: mytestblobstore
container_name: data-container
credentials:
  sas_token: "?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_blob_datastore.yml
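With the datastore in place, jobs and data assets can reference files in it through a datastore URI of the form azureml://datastores/<datastore_name>/paths/<path>. A short sketch, assuming a titanic.csv file sits at the container root:

from azure.ai.ml import Input

# The workspace resolves access through the registered datastore,
# so no credentials appear in the job definition.
blob_input = Input(
    type="uri_file",
    path="azureml://datastores/my_blob_ds/paths/titanic.csv",
)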
Create an Azure Data Lake Storage Gen2 datastore

The following example creates a credential-less Data Lake Storage Gen2 datastore that relies on identity-based data access. Fill in your own values:
from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen2Datastore(
    name="",  # name the datastore will have in the workspace
    description="",
    account_name="",  # storage account name
    filesystem=""  # Gen2 filesystem (container) name
)

ml_client.create_or_update(store)
To authenticate with a service principal instead:

from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen2Datastore(
    name="adls_gen2_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen2.",
    account_name="mytestdatalakegen2",
    filesystem="my-gen2-container",
    credentials=ServicePrincipalCredentials(
        tenant_id="bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id="44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret="Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)

ml_client.create_or_update(store)
To use the Azure CLI instead, create this YAML file for a credential-less datastore (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_credless_example
type: azure_data_lake_gen2
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
For service principal authentication, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_example
type: azure_data_lake_gen2
description: Datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
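To confirm the registration, you can read the datastore definition back from the workspace; by default the SDK doesn't return saved secrets. A quick sketch:

from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

# Retrieve the datastore created above and inspect its settings.
store = ml_client.datastores.get("adls_gen2_example")
print(store.name, store.account_name, store.filesystem)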
Create an Azure Files datastore

The following example creates an Azure Files datastore that authenticates with an account key:
from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureFileDatastore(
    name="file_example",
    description="Datastore pointing to an Azure File Share.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=AccountKeyConfiguration(
        account_key="aaaaaaaa-0b0b-1c1c-2d2d-333333333333"
    ),
)

ml_client.create_or_update(store)
To authenticate with a SAS token instead:

from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureFileDatastore(
    name="file_sas_example",
    description="Datastore pointing to an Azure File Share using SAS token.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=SasTokenConfiguration(
        sas_token="?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
    ),
)

ml_client.create_or_update(store)
To use the Azure CLI instead, create this YAML file for account key authentication (update the values):
# my_files_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureFile.schema.json
name: file_example
type: azure_file
description: Datastore pointing to an Azure File Share.
account_name: mytestfilestore
file_share_name: my-share
credentials:
  account_key: aaaaaaaa-0b0b-1c1c-2d2d-333333333333
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_files_datastore.yml
For SAS token authentication, create this YAML file (update the values):
# my_files_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureFile.schema.json
name: file_sas_example
type: azure_file
description: Datastore pointing to an Azure File Share using a SAS token.
account_name: mytestfilestore
file_share_name: my-share
credentials:
  sas_token: "?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_files_datastore.yml
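You can also enumerate every datastore registered in the workspace, including the built-in defaults, as in this sketch:

from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

# List all datastores registered in the workspace.
for ds in ml_client.datastores.list():
    print(ds.name, ds.type)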
Create an Azure Data Lake Storage Gen1 datastore

The following example creates a credential-less Data Lake Storage Gen1 datastore that relies on identity-based data access. Fill in your own values:
from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen1Datastore(
    name="",  # name the datastore will have in the workspace
    store_name="",  # Data Lake Storage Gen1 account name
    description="",
)

ml_client.create_or_update(store)
To authenticate with a service principal instead:

from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen1Datastore(
    name="adls_gen1_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen1.",
    store_name="mytestdatalakegen1",
    credentials=ServicePrincipalCredentials(
        tenant_id="bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id="44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret="Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)

ml_client.create_or_update(store)
To use the Azure CLI instead, create this YAML file for a credential-less datastore (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_credless_example
type: azure_data_lake_gen1
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
For service principal authentication, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_example
type: azure_data_lake_gen1
description: Datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
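Note that each workspace also comes with a default datastore (workspaceblobstore) that's used when you don't name a datastore explicitly; a sketch to retrieve it:

from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

# Fetch the workspace default datastore (workspaceblobstore unless you change it).
default_store = ml_client.datastores.get_default()
print(default_store.name)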
Create a OneLake (Microsoft Fabric) datastore (preview)
This section describes various options to create a OneLake datastore. OneLake is part of Microsoft Fabric. At this time, Machine Learning supports connections to Microsoft Fabric lakehouse artifacts in the "Files" folder, which can include folders, files, and Amazon S3 shortcuts. For more information about lakehouses, see What is a lakehouse in Microsoft Fabric?.
OneLake datastore creation requires the following information from your Microsoft Fabric instance:
- Endpoint
- Workspace GUID
- Artifact GUID
You can retrieve these values from the "Properties" page of the lakehouse artifact in your Microsoft Fabric instance. The "Endpoint", "Workspace GUID", and "Artifact GUID" values appear in the "URL" and "ABFS path" fields; the sketch after the following list shows how they map:
- URL format: https://{your_one_lake_endpoint}/{your_one_lake_workspace_guid}/{your_one_lake_artifact_guid}/Files
- ABFS path format: abfss://{your_one_lake_workspace_guid}@{your_one_lake_endpoint}/{your_one_lake_artifact_guid}/Files
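To illustrate how the two formats relate, this short sketch pulls the endpoint and both GUIDs out of an ABFS path; the GUIDs are the sample placeholder values used later in this section:

# Format: abfss://{workspace_guid}@{endpoint}/{artifact_guid}/Files
abfs_path = (
    "abfss://bbbbbbbb-7777-8888-9999-cccccccccccc"
    "@msit-onelake.dfs.fabric.microsoft.com"
    "/cccccccc-8888-9999-0000-dddddddddddd/Files"
)

authority, _, rest = abfs_path.removeprefix("abfss://").partition("/")
workspace_guid, _, endpoint = authority.partition("@")
artifact_guid = rest.split("/", 1)[0]

print(endpoint)        # msit-onelake.dfs.fabric.microsoft.com
print(workspace_guid)  # bbbbbbbb-7777-8888-9999-cccccccccccc
print(artifact_guid)   # cccccccc-8888-9999-0000-dddddddddddd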
Create a OneLake datastore

The following example creates a credential-less OneLake datastore that relies on identity-based data access:
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = OneLakeDatastore(
    name="onelake_example_id",
    description="Datastore pointing to a Microsoft Fabric artifact.",
    one_lake_workspace_name="bbbbbbbb-7777-8888-9999-cccccccccccc",  # {your_one_lake_workspace_guid}
    endpoint="msit-onelake.dfs.fabric.microsoft.com",  # {your_one_lake_endpoint}
    artifact=OneLakeArtifact(
        name="cccccccc-8888-9999-0000-dddddddddddd/Files",  # {your_one_lake_artifact_guid}/Files
        type="lake_house"
    )
)

ml_client.create_or_update(store)
To authenticate with a service principal instead:

from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = OneLakeDatastore(
    name="onelake_example_sp",
    description="Datastore pointing to a Microsoft Fabric artifact.",
    one_lake_workspace_name="bbbbbbbb-7777-8888-9999-cccccccccccc",  # {your_one_lake_workspace_guid}
    endpoint="msit-onelake.dfs.fabric.microsoft.com",  # {your_one_lake_endpoint}
    artifact=OneLakeArtifact(
        name="cccccccc-8888-9999-0000-dddddddddddd/Files",  # {your_one_lake_artifact_guid}/Files
        type="lake_house"
    ),
    credentials=ServicePrincipalCredentials(
        tenant_id="bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id="44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret="Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)

ml_client.create_or_update(store)
To use the Azure CLI instead, create the following YAML file for a credential-less datastore (update the values):
# my_onelake_datastore.yml
$schema: http://azureml/sdk-2-0/OneLakeDatastore.json
name: onelake_example_id
type: one_lake
description: Credential-less datastore pointing to a OneLake lakehouse.
one_lake_workspace_name: "eeeeffff-4444-aaaa-5555-bbbb6666cccc"
endpoint: "msit-onelake.dfs.fabric.microsoft.com"
artifact:
  type: lake_house
  name: "1111bbbb-22cc-dddd-ee33-ffffff444444/Files"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_onelake_datastore.yml
For service principal authentication, create the following YAML file (update the values):
# my_onelakesp_datastore.yml
$schema: http://azureml/sdk-2-0/OneLakeDatastore.json
name: onelake_example_sp
type: one_lake
description: Datastore pointing to a OneLake lakehouse using a service principal.
one_lake_workspace_name: "eeeeffff-4444-aaaa-5555-bbbb6666cccc"
endpoint: "msit-onelake.dfs.fabric.microsoft.com"
artifact:
  type: lake_house
  name: "1111bbbb-22cc-dddd-ee33-ffffff444444/Files"
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_onelakesp_datastore.yml
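As with the other datastore types, you can then reference lakehouse data through the datastore with an azureml:// URI; a sketch, assuming a raw folder exists under Files:

from azure.ai.ml import Input

# Reference a folder inside the OneLake lakehouse via the registered datastore.
onelake_folder = Input(
    type="uri_folder",
    path="azureml://datastores/onelake_example_id/paths/raw/",
)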
Next steps