In today's data-driven world, predictive analytics improves decision-making and operational efficiency.
This article describes an example scenario and a generalized architecture that illustrate how to perform predictive data analysis with Microsoft Dataverse, Microsoft Fabric, and Azure AI services. You can adapt the example architecture for many different scenarios and industries.
Architecture diagram
Workflow
The following steps describe the workflow that's shown in the example architecture diagram:
Data ingestion: Use dataflows to collect and transform raw data from multiple sources. Store cleaned and prepared data in Dataverse.
Data engineering and model training: Sync data from Dataverse to Fabric by using a Fabric shortcut. Use Fabric's OneLake and Synapse environment to train machine learning models (see the notebook sketch after this list).
Prediction storage: Save model predictions back to Dataverse or to a Delta Lake table in Fabric.
Visualization: Build real-time dashboards in Power BI to visualize predictions and insights.
Actionable insights: Develop a Power Apps canvas app or model-driven app to provide frontline teams with predictive insights.
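The following notebook sketch shows what the data engineering, model training, and prediction storage steps might look like in a Fabric notebook. The lakehouse table names (`dataverse_customer`, `churn_predictions`) and the feature columns are assumptions for illustration; a real implementation reads whichever tables your Dataverse shortcut exposes and replaces the placeholder scoring step with a trained model.

```python
# Minimal Fabric notebook sketch (PySpark). Assumes the lakehouse exposes a
# hypothetical 'dataverse_customer' table through a Dataverse shortcut.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already available in Fabric notebooks

# Read the data that the Fabric shortcut surfaces from Dataverse.
customers = spark.read.table("dataverse_customer")

# Light feature preparation; the columns are illustrative.
features = customers.select(
    "customerid",
    F.col("total_transactions").cast("double"),
    F.col("open_complaints").cast("double"),
    F.col("engagement_score").cast("double"),
)

# A trained model would score the rows here; rand() is only a placeholder.
predictions = features.withColumn("churn_probability", F.rand())

# Persist predictions to a Delta table so that Power BI and Power Apps
# (or a write-back to Dataverse) can consume them.
predictions.write.format("delta").mode("overwrite").saveAsTable("churn_predictions")
```

Writing the predictions as a Delta table keeps the heavy computation in Fabric; if frontline apps need the scores inside Dataverse, a dataflow can copy them back.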
Components
AI Builder: Extracts key data from documents by using prebuilt or custom models.
Microsoft Dataverse: Serves as the central data store for prepared source data and for the predictions that the models generate.
Power Platform: Dataflows collect and transform raw data from multiple sources.
Link Dataverse to Microsoft Fabric: Syncs data from Dataverse to Fabric by using a Fabric shortcut.
Azure Machine Learning: Trains machine learning models (see the job-submission sketch after this list).
Power Apps: Facilitates human review and data corrections.
Power BI: Delivers dashboards and reports that visualize predictions and insights.
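As an illustration of the Azure Machine Learning component, the following sketch submits a training job with the Azure Machine Learning Python SDK (v2). The workspace identifiers, compute cluster, environment, and the ./src/train.py script are placeholders; this is one possible way to run training, not a prescribed part of the architecture.

```python
# Minimal sketch: submit a training job to Azure Machine Learning (SDK v2).
# Workspace identifiers, compute name, environment, and ./src/train.py are
# placeholders that you replace with your own resources.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

job = command(
    code="./src",                      # folder that contains train.py
    command="python train.py",
    environment="<environment-name>@latest",
    compute="<compute-cluster-name>",
    display_name="churn-model-training",
)

returned_job = ml_client.jobs.create_or_update(job)
print("Submitted job:", returned_job.name)
```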
Alternatives
Azure Data Factory: Use Azure Data Factory instead of Power Platform dataflows for collecting and transforming raw data from multiple sources.
Scenario details
The scenario: A company wants to predict customer churn so that it can intervene before dissatisfied customers leave.
Potential use case: Predicting customer churn
In this scenario, the specific steps include:
Data collection: Use dataflows to aggregate customer data such as transactions, complaints, and engagement scores into Dataverse.
Model development: Sync Dataverse data to Fabric. Use historical data in Fabric's Spark pool to train a churn prediction model, or use Azure Machine Learning to train and deploy predictive models (see the training sketch after this list).
Prediction deployment: Save predictions such as churn probability into Dataverse.
Visualization: Build Power BI dashboards that show churn risk distribution by region or product category.
User action: Create a canvas or model-driven app to view and act on high-risk accounts.
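As a sketch of the model development step, the following example trains a simple churn classifier with scikit-learn, the kind of code that could run in a Fabric notebook. The tiny synthetic DataFrame and the feature names (transactions, complaints, engagement score) are illustrative stand-ins for the historical data synced from Dataverse.

```python
# Hypothetical churn-model sketch with scikit-learn. The synthetic DataFrame
# stands in for customer history synced from Dataverse.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.DataFrame({
    "total_transactions": [42, 3, 57, 8, 21, 2, 65, 5],
    "open_complaints":    [0, 4, 1, 3, 0, 5, 0, 2],
    "engagement_score":   [0.9, 0.2, 0.8, 0.3, 0.6, 0.1, 0.95, 0.25],
    "churned":            [0, 1, 0, 1, 0, 1, 0, 1],
})

feature_cols = ["total_transactions", "open_complaints", "engagement_score"]
X_train, X_test, y_train, y_test = train_test_split(
    df[feature_cols], df["churned"],
    test_size=0.25, random_state=42, stratify=df["churned"],
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("Test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score all customers; 'churn_probability' is the value written back to
# Dataverse (or a Delta table) for dashboards and apps to consume.
df["churn_probability"] = model.predict_proba(df[feature_cols])[:, 1]
```

The churn_probability column is the value that the prediction deployment step saves back into Dataverse.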
Considerations
These considerations implement the pillars of Power Platform Well-Architected, a set of guiding tenets that improve the quality of a workload. Learn more in Microsoft Power Platform Well-Architected.
Performance Efficiency
Dataflows for efficient data ingestion: Optimize Power Platform dataflows for extract, transform, load (ETL) processes by applying incremental refresh where applicable to minimize data processing times.
Link to Microsoft Fabric for compute: Use Azure Synapse Link for Dataverse to offload heavy data computation and analytics tasks to Microsoft Fabric to ensure minimal performance impact on operational Dataverse environments. Use OneLake in Fabric to manage large datasets with efficient query capabilities.
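As a small illustration of offloading analytics to Fabric, the following sketch runs an aggregation with Spark SQL against a lakehouse Delta table instead of querying the operational Dataverse environment. The table name churn_predictions and the region column are assumptions carried over from the earlier sketches.

```python
# Hypothetical analytical query in a Fabric notebook. The scan runs against
# OneLake (Delta) rather than the operational Dataverse environment.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

high_risk_by_region = spark.sql("""
    SELECT region,
           COUNT(*) AS customers,
           AVG(churn_probability) AS avg_churn_probability,
           SUM(CASE WHEN churn_probability > 0.8 THEN 1 ELSE 0 END) AS high_risk_customers
    FROM churn_predictions
    GROUP BY region
    ORDER BY avg_churn_probability DESC
""")

high_risk_by_region.show()
```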
Security
Data source security integration: Secure access to semi-structured, relational, and nonrelational data by using Microsoft Entra ID for authentication and role-based access controls.
Governance of data in Fabric and Dataverse: Enforce data classification, encryption at rest, and data loss prevention policies. Implement row-level security in Power BI for role-specific insights while maintaining secure data access.
Operational Excellence
Continuous integration and continuous delivery for Power Platform solutions: Use Azure DevOps or GitHub Actions to manage the life cycle of Dataverse, Power BI, and AI Builder solutions.
Versioning of data models: Track and document changes to machine learning models and transformations in Fabric and Dataverse. Use Microsoft Purview for comprehensive data lineage and metadata management to ensure model explainability and traceability.
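To make model versions traceable, the following sketch logs and registers a trained model with MLflow, which Fabric notebooks support for experiment tracking. The experiment and model names are assumptions, and the placeholder model stands in for the churn model trained earlier; Microsoft Purview lineage scanning is configured separately and isn't shown.

```python
# Minimal MLflow versioning sketch; names and the placeholder model are
# illustrative. In practice, reuse the churn model trained earlier.
import mlflow
import mlflow.sklearn
from sklearn.linear_model import LogisticRegression

model = LogisticRegression().fit([[0, 1], [1, 0]], [0, 1])  # placeholder model

mlflow.set_experiment("churn-prediction")

with mlflow.start_run() as run:
    mlflow.log_param("algorithm", "LogisticRegression")
    mlflow.sklearn.log_model(model, artifact_path="model")

# Each registration creates a new, traceable version of the named model.
mlflow.register_model(f"runs:/{run.info.run_id}/model", "customer-churn-model")
```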
Contributors
Microsoft maintains this article. The following contributors wrote it.
Principal authors:
- Pujarini Mohapatra, Principal Engineering Manager