Hi @Ajay Jeevan Jose,
Thanks for posting your query!
You can integrate Azure Data Factory (ADF) with an Azure Machine Learning (AML) Designer pipeline and pass parameters dynamically by parameterizing the pipeline and then reading those parameters inside the "Execute Python Script" component. The steps below walk through the approach.
Defining Parameters in ML Designer and Passing Them from ADF
In Azure Machine Learning Designer, you can create parameters by going to the Parameters section in your pipeline. Give each parameter a name (like container_name or connection_string) and choose the correct data type. These parameters can then be used in components like the Execute Python Script.
To pass these parameters from Azure Data Factory (ADF) to Azure ML, use the Machine Learning Execute Pipeline activity (activity type AzureMLExecutePipeline) in ADF. In the activity settings, map your ADF pipeline parameters to the ML pipeline parameters.
For example, you can pass container_name and connection_string from ADF to the ML pipeline through the activity's mlPipelineParameters property:

```json
"mlPipelineParameters": {
    "container_name": "@pipeline().parameters.container_name",
    "connection_string": "@pipeline().parameters.connection_string"
}
```
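For context, a minimal sketch of the full activity definition might look like the following. The pipeline ID, linked service name, and experiment name are placeholders you would replace with your own values:

```json
{
    "name": "InvokeMLPipeline",
    "type": "AzureMLExecutePipeline",
    "linkedServiceName": {
        "referenceName": "AzureMLService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "mlPipelineId": "<your-published-pipeline-id>",
        "experimentName": "adf-triggered-run",
        "mlPipelineParameters": {
            "container_name": "@pipeline().parameters.container_name",
            "connection_string": "@pipeline().parameters.connection_string"
        }
    }
}
```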
Accessing the parameters inside the Python script in the Execute Python Script component
Note that Run.get_context() does not expose pipeline parameters directly. For script-based steps, the usual pattern is that pipeline parameters are delivered to the script as command-line arguments, which you can read with argparse (the azureml.core run context remains available for logging and workspace access):

```python
import argparse

# Pipeline parameters arrive as command-line arguments
parser = argparse.ArgumentParser()
parser.add_argument('--container_name')
parser.add_argument('--connection_string')
# parse_known_args ignores any extra arguments Azure ML injects
args, _ = parser.parse_known_args()

container_name = args.container_name
connection_string = args.connection_string
```
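If you want to sanity-check the argument handling locally before submitting the pipeline, you can feed argparse a simulated argument list. The values below are placeholders, not real credentials:

```python
import argparse

def parse_pipeline_args(argv=None):
    # Mirror the parameters the ADF pipeline passes to the script
    parser = argparse.ArgumentParser()
    parser.add_argument('--container_name', default=None)
    parser.add_argument('--connection_string', default=None)
    # parse_known_args tolerates extra arguments the service may inject
    args, _ = parser.parse_known_args(argv)
    return args

# Local check with placeholder values
args = parse_pipeline_args(['--container_name', 'mycontainer'])
print(args.container_name)
```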
Best practices for securely passing connection strings and credentials
To securely pass connection strings and credentials, store them in Azure Key Vault. This keeps sensitive information safe. In Azure Data Factory (ADF), you can create a Linked Service to Azure Key Vault, which lets you retrieve these secrets during pipeline execution, avoiding hardcoding sensitive data.
In the Azure ML pipeline, you can read these secrets at run time through the workspace's default Key Vault (via the Azure ML SDK) or through environment variables. Either way, credentials are resolved only at execution time and never appear in scripts or pipeline settings.
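As a sketch of the SDK route, assuming the script runs inside a submitted Azure ML run with access to the workspace's default Key Vault (the secret name below is hypothetical):

```python
from azureml.core import Run

def get_secret(secret_name):
    # Inside a submitted run, the workspace's default Key Vault
    # can be reached without extra credentials.
    run = Run.get_context()
    keyvault = run.experiment.workspace.get_default_keyvault()
    return keyvault.get_secret(secret_name)

# Hypothetical secret name stored in Key Vault:
# connection_string = get_secret('storage-connection-string')
```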
Ref Docs: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-machine-learning-service?source=recommendations
https://learn.microsoft.com/en-us/azure/data-factory/how-to-expression-language-functions
I hope this information helps. Please do let us know if you have any further queries.
Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.
Thank you.