Using dataflows with on-premises data sources

With dataflows, you can create a collection of data from various sources, clean the data, transform it, and then load it to Power BI storage. When creating a dataflow, you may want to use on-premises data sources. This article clarifies the requirements associated with creating dataflows and how your enterprise gateway needs to be configured to enable those connections.

Dataflows and gateways

Configuring an enterprise gateway for use with dataflows

To create a dataflow using a gateway, the user must be the enterprise gateway's administrator, or the administrator must have shared the data source they plan to use with the user.

Note

Dataflows are only supported using enterprise gateways.
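If you want to check this requirement programmatically, the Power BI REST API exposes the gateways visible to you and the data sources defined on them. The following Python sketch is illustrative only: the access token placeholder and the printed fields are assumptions, and in practice you would acquire the token through a library such as MSAL.

```python
# A minimal sketch: list the enterprise gateways you can see and their data
# sources via the Power BI REST API. Assumes you already hold an Azure AD
# access token with the appropriate scopes; token acquisition is omitted.
import requests

API = "https://api.powerbi.com/v1.0/myorg"
TOKEN = "<your-azure-ad-access-token>"  # placeholder, obtain via MSAL or similar
headers = {"Authorization": f"Bearer {TOKEN}"}

# Gateways returned here are the ones you administer or can use.
gateways = requests.get(f"{API}/gateways", headers=headers).json()["value"]

for gw in gateways:
    print(f"Gateway: {gw['name']} ({gw['id']})")
    # Data sources defined on the gateway; to build a dataflow against one of
    # them you must be a gateway admin or have the data source shared with you.
    sources = requests.get(
        f"{API}/gateways/{gw['id']}/datasources", headers=headers
    ).json()["value"]
    for d in sources:
        print(f"  {d['datasourceType']}: {d.get('connectionDetails')}")
```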

Using an on-premises data source in a dataflow

When creating a dataflow, select an on-premises data source from the data sources list, as shown in the following image.

Select an on-premises data source

Once you make your selection, you're prompted to provide the connection details for the enterprise gateway that will be used to access the on-premises data. You must select the gateway itself, and provide credentials for the selected gateway.

Provide connection details
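Optionally, before building the dataflow you can confirm that the credentials stored for a gateway data source still work by calling the data source status endpoint of the Power BI REST API. The sketch below is a minimal example under that assumption; the gateway and data source IDs are hypothetical placeholders and the token is assumed to be obtained elsewhere.

```python
# A minimal sketch: check that a gateway data source's stored credentials are
# valid before referencing it in a dataflow. IDs and the token are placeholders.
import requests

API = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <your-azure-ad-access-token>"}

gateway_id = "<gateway-id>"        # hypothetical ID from GET /gateways
datasource_id = "<datasource-id>"  # hypothetical ID from GET /gateways/{id}/datasources

resp = requests.get(
    f"{API}/gateways/{gateway_id}/datasources/{datasource_id}/status",
    headers=headers,
)
# A 200 response typically means the gateway reached the source with the
# stored credentials; an error body usually explains credential or
# connectivity problems.
print(resp.status_code, resp.text)
```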

Monitoring your gateway

You can monitor your enterprise gateway for a dataflow in the same way you monitor gateways for a dataset.

In the dataflow's settings screen in Power BI, you can monitor a dataflow's gateway status and assign a gateway to the dataflow, as shown in the following image.

Monitor the gateway
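If you prefer to monitor refreshes outside the settings screen, the Power BI REST API can also return a dataflow's recent refresh transactions. The following Python sketch assumes a workspace (group) ID, a dataflow ID, and an access token that are all placeholders, and simply prints the status of each recorded refresh.

```python
# A minimal sketch: review a dataflow's recent refresh history (transactions)
# as a complement to the gateway status shown in the settings screen.
import requests

API = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <your-azure-ad-access-token>"}

group_id = "<workspace-id>"    # hypothetical workspace (group) ID
dataflow_id = "<dataflow-id>"  # hypothetical dataflow ID

transactions = requests.get(
    f"{API}/groups/{group_id}/dataflows/{dataflow_id}/transactions",
    headers=headers,
).json()["value"]

for t in transactions:
    # Each transaction describes one refresh: its status and start/end times.
    print(t.get("status"), t.get("startTime"), t.get("endTime"))
```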

Changing a gateway

You can change the enterprise gateway used for a given dataflow in two ways:

  1. From the authoring tool - you can change the gateway assigned to all of your queries using the dataflow authoring tool.

    Note

    The dataflow will try to find or create the required data sources using the new gateway. If it cannot do so, you will not be able to change the gateway until all needed data sources are available from the selected gateway.

  2. From the settings screen - you can change the assigned gateway using the settings screen for the dataflow in the Power BI service.

To learn more about enterprise gateways, see On-premises data gateway.

Considerations and limitations

There are a few known limitations to using enterprise gateways and dataflows:

  • Each dataflow may use only one gateway. As such, all queries should be configured using the same gateway (see the sketch after this list).
  • Changing the gateway impacts the entire dataflow.
  • If several gateways are needed, the best practice is to build several dataflows (one for each gateway) and use the compute or entity reference capabilities to unify the data.
  • Dataflows are only supported using enterprise gateways. Personal gateways will not be available for selection in the drop-down lists and settings screens.
  • On-premises data sources configured with the Use SSO via Kerberos for DirectQuery and Import queries option are not supported in dataflows.
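Because each dataflow may use only one gateway, it can be useful to verify which gateway the dataflow's data sources actually resolve to. The sketch below queries the dataflow's data sources through the Power BI REST API; it assumes the response items include a gatewayId field (as the equivalent dataset API does), and the IDs and token are placeholders.

```python
# A minimal sketch: list the data sources behind a dataflow and check whether
# they all resolve to a single gateway. Assumes each returned item carries a
# gatewayId field; IDs and the token are placeholders.
import requests

API = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <your-azure-ad-access-token>"}

group_id = "<workspace-id>"    # hypothetical workspace (group) ID
dataflow_id = "<dataflow-id>"  # hypothetical dataflow ID

datasources = requests.get(
    f"{API}/groups/{group_id}/dataflows/{dataflow_id}/datasources",
    headers=headers,
).json()["value"]

gateway_ids = {d.get("gatewayId") for d in datasources}
if len(gateway_ids) > 1:
    print("Warning: queries resolve to more than one gateway:", gateway_ids)
else:
    print("All data sources use gateway:", gateway_ids)
```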

Next steps

This article provided information about using on-premises data sources for dataflows, and how to use and configure gateways to access such data. The following articles may also be helpful.

For more information about Power Query and scheduled refresh, you can read these articles:

For more information about the Common Data Model, you can read its overview article: