Self-service data prep with dataflows (Public Preview)
These release notes describe functionality that may not have been released yet. To see when this functionality is planned to release, please review Summary of what’s new. Delivery timelines and projected functionality may change or may not ship (see Microsoft policy).
Power BI introduces dataflows to help organizations unify data from disparate sources and prepare it for modeling. Analysts can easily create dataflows using familiar, self-service tools. Dataflows are used to ingest, transform, integrate, and enrich big data by defining data source connections, ETL logic, refresh schedules, and more. Data is stored as entities in Common Data Model-compliant folders in Azure Data Lake Storage Gen2. Dataflows are created and managed in app workspaces by using the Power BI service.
You can use dataflows to ingest data from a large and growing set of supported on-premises and cloud-based data sources, including Dynamics 365, Salesforce, Azure SQL Database, Excel, SharePoint, and more.
You can then map data to known Common Data Model entities, modify and extend existing entities, and create custom entities. Advanced users can create fully customized dataflows using the built-in, self-service, low-code/no-code Power Query authoring experience, similar to the Power Query experience that millions of Power BI Desktop and Excel users already know.
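To give a sense of what dataflow authoring looks like, the following is a minimal Power Query M sketch of the kind of query that could define a custom entity. The server, database, table, and column names are purely hypothetical placeholders, not part of this release:

```powerquery-m
// Illustrative sketch only: server, database, and column names are hypothetical.
let
    // Connect to an Azure SQL Database source
    Source = Sql.Database("contoso-server.database.windows.net", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Keep only the columns needed for the entity
    Selected = Table.SelectColumns(Orders, {"OrderId", "CustomerId", "OrderDate", "Amount"}),
    // Filter to recent orders and apply an explicit column type
    Filtered = Table.SelectRows(Selected, each [OrderDate] >= #date(2018, 1, 1)),
    Typed = Table.TransformColumnTypes(Filtered, {{"Amount", Currency.Type}})
in
    Typed
```

The result of a query like this would be stored as an entity in the dataflow and refreshed on the schedule you define in the workspace.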
Once you’ve created a dataflow, you can use Power BI Desktop and the Power BI service to create datasets, reports, dashboards, and apps that leverage the power of the Common Data Model to drive deep insights into your business activities.
Dataflow refresh scheduling is managed directly from the workspace in which your dataflow was created, just like your datasets.
The preview includes more than 20 connectors to common data sources such as Excel, SQL Server, Oracle, Azure SQL Data Warehouse, Dynamics 365, and Salesforce.