Configure export to Azure Data Lake
The Export to Azure Data Lake feature is in limited preview and may not be available in all regions and environments supported by Finance and Operations apps. If you are unable to find the Export to Azure Data Lake functionality in Lifecycle Services (LCS) or your Finance and Operations apps, this feature is not currently available in your environment.
Currently, previews are closed. In the coming months, we will enable additional environments in several regions. We are accepting requests from customers who would like to join the preview. If you would like to join a future preview, complete the survey, and we will contact you when we are ready to include you. Completing the survey also lets you join a Yammer group, where you can stay in contact and ask questions that will help you understand the feature.
Until the feature is enabled in your environment, you can prototype and plan your implementation by using tools available on GitHub. These tools let you export data from your sandbox environment into a storage account in the same format that the feature uses.
At this time, the Export to Azure Data Lake feature is not available in Tier-1 (developer) environments. You need a cloud-based Tier-2 or higher environment to enable this feature.
To make aggregate measurements available in a data lake, continue to use the feature in the manner that is described in Make entity store available as a Data Lake.
Create a service principal for Microsoft Dynamics ERP Microservices
The Export to Azure Data Lake feature is built using a microservice that exports Finance and Operations app data to Azure Data Lake and keeps the data fresh. The microservice uses an Azure service principal, Microsoft Dynamics ERP Microservices, to securely connect to your Azure resources. Before you configure the Export to Data Lake feature, add the Microsoft Dynamics ERP Microservices service principal to your Azure Active Directory (Azure AD). This step enables Azure AD to authenticate the microservice.
You will need Azure Active Directory tenant administrator rights to perform these steps.
To add the service principal, complete the following steps.
- Open the Azure portal, and go to Azure Active Directory.
- On the left menu, select Manage > Enterprise applications, and search for the following application.

| Application name | Application ID |
|---|---|
| Microsoft Dynamics ERP Microservices | 0cdb527f-a8d1-4bf8-9436-b352c68682b2 |

If you can't find the application, complete the following steps.
- On your local machine, open the Start menu, and search for PowerShell.
- Right-click Windows PowerShell, and then select Run as administrator.
- Run the following command to install the AzureAD module:

```powershell
Install-Module -Name AzureAD
```
- If the NuGet provider is required to continue, select Y to install it.
- If an Untrusted repository message appears, select Y to continue.
Run the following command to add the application to your Azure Active Directory.

```powershell
New-AzureADServicePrincipal -AppId '0cdb527f-a8d1-4bf8-9436-b352c68682b2'
```
Sign in as the Azure Active Directory administrator when prompted.
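If you prefer the Azure CLI to PowerShell, the same service principal can be added with a short sketch like the following. The tenant ID is a placeholder you must replace; the application ID is the well-known ID shown earlier.

```shell
# Sketch: add the Microsoft Dynamics ERP Microservices service principal
# by using the Azure CLI instead of PowerShell.
# Sign in with an Azure AD tenant administrator account first.
az login --tenant <your-tenant-id>   # <your-tenant-id> is a placeholder

# Create the service principal from the first-party application ID.
az ad sp create --id 0cdb527f-a8d1-4bf8-9436-b352c68682b2
```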
Configure Azure Resources
To configure the export to Data Lake, create a storage account in your own Azure subscription. This storage account is used to store the exported data. Next, create an Azure AD application that grants access to the root of your storage account. Your Finance and Operations apps will use this Azure AD application to gain access to storage, create the folder structure, and write data. Then create a key vault in your subscription and store the storage account name, the application ID, and the application secret in it. If you don't have permission to create resources in the Azure portal, you will need assistance from someone in your organization who has the required permissions.
As you work in the Azure portal, you will be instructed to save several values for use in later steps. You will also provide some of these values to your Finance and Operations apps by using Lifecycle Services (LCS); you will need administrator access to LCS to do this.

The steps, which take place in the Azure portal, are as follows:
- Create an application in Azure Active Directory
- Create a Data Lake Storage (Gen2 account) in your subscription
- Grant access control roles to applications
- Create a key vault
- Add secrets to the key vault
- Authorize the application to read secrets in the key vault
- Power Platform integration
- Install the Export to Data Lake add-in in LCS
Create an application in Azure Active Directory
In the Azure portal, select Azure Active Directory, and then select App registrations.
Select New registration, and enter the following information:
- Name: Enter a name for the app.
- Supported Account types: Choose the appropriate option.
On the left navigation pane, select API permissions.
Select Add a permission, and in the Request API permissions dialog box, select Azure Key Vault.
Select Delegated permissions, select user_impersonation, and then select Add permissions.
On the left navigation pane, select Certificates & secrets, and then select New client secret.
In the Description field, enter a name.
In the Expires field, select an option, and then select Add. The system generates a client secret and displays it under the grid. Copy and save the secret value now, because it won't be shown again after you leave this page.
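As a rough Azure CLI equivalent of the portal steps above, you could register the application and generate a client secret as sketched below. The display name is a hypothetical example; save the appId and password values that the commands print.

```shell
# Sketch: register an Azure AD application with the Azure CLI.
# "ContosoD365LakeApp" is a hypothetical display name.
az ad app create --display-name "ContosoD365LakeApp"

# Generate a client secret for the application. Note the "password" value
# in the output, because it can't be retrieved later.
az ad app credential reset --id <application-id>   # <application-id> is a placeholder
```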
Create a Data Lake Storage (Gen2) account in your subscription
The Data Lake Storage account will be used to store data from your Finance and Operations apps. To manually create a storage account, you must have administrative rights to your organization's Azure subscription. To create a storage account, complete the following steps.
In the Azure portal, select Create a resource, and then search for and select Storage account – blob, file, table, queue.
In the Create storage account dialog box, provide values for the following parameter fields:
- Location: Select the data center where your Finance and Operations environment is located. If you select a different Azure region, you may incur additional data movement costs. If your Microsoft Power BI or data warehouse is in a different region, you can use replication to move storage between regions.
- Performance: We recommend you select Standard.
- Account kind: Select StorageV2 (general-purpose v2). This selection makes the Data Lake Storage Gen2 option available on the Advanced tab.
On the Advanced tab, under Data Lake Storage Gen2, set Hierarchical namespace to Enabled. If you disable this option, you may not be able to consume the data that Finance and Operations apps write by using services such as Power BI dataflows and AI Builder.
Select Review and create. When the deployment is complete, the new resource will be shown in the Azure portal.
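The same storage account can be created from the Azure CLI, as in this sketch. The account and resource-group names and the location are hypothetical examples; the --enable-hierarchical-namespace flag corresponds to the Hierarchical namespace option on the Advanced tab.

```shell
# Sketch: create a Data Lake Storage Gen2 account with the Azure CLI.
# Account name, resource group, and location are hypothetical examples.
az storage account create \
  --name contosod365datalake \
  --resource-group contoso-d365-rg \
  --location westus2 \
  --sku Standard_LRS \
  --kind StorageV2 \
  --enable-hierarchical-namespace true
```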
Grant access control roles to applications
You must grant your application permission to read and write to the storage account. These permissions are granted by assigning Azure roles on the storage account.
- In the Azure portal, open the storage account that you created earlier.
- Select Access Control (IAM) in the left navigation.
- On the Access control page, select the Role assignments tab.
- Select Add at the top of the page, and then select Add role assignment.
- In the Add role assignment dialog box, select the Role field, and then select Storage Blob Data Contributor.
- In the Select field, select the application that you registered earlier.
Leave the Assign access to and Azure AD user, group, or service principal fields unchanged.
Repeat the previous steps to add the Storage Blob Data Reader role.

Validate the role assignments on the storage account for the application that you created earlier.

| Application | Role |
|---|---|
| The application you created earlier | Storage Blob Data Contributor |
| The application you created earlier | Storage Blob Data Reader |
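If you script your setup, the same role assignments can be granted from the Azure CLI, as in this sketch. The storage account name, resource group, and application ID are placeholders.

```shell
# Sketch: assign the two storage roles to your application at the
# scope of the storage account. Names and IDs are placeholders.
STORAGE_ID=$(az storage account show \
  --name contosod365datalake \
  --resource-group contoso-d365-rg \
  --query id --output tsv)

az role assignment create --assignee <application-id> \
  --role "Storage Blob Data Contributor" --scope "$STORAGE_ID"

az role assignment create --assignee <application-id> \
  --role "Storage Blob Data Reader" --scope "$STORAGE_ID"
```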
Create a key vault
A key vault provides a secure way to share details, such as the storage account name, with your Finance and Operations apps. Complete the following steps to create a key vault.
- In the Azure portal, select Create a resource, and then search for and select Key Vault.
- In the Create key vault dialog box, in the Location field, select the datacenter where your environment is located.
- After the key vault is created, select it from the list, and on the left navigation pane, select Overview.
- Save the value in the DNS name field. You will need this value later.
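A key vault can also be created from the Azure CLI. In the sketch below, the vault and resource-group names are hypothetical; the second command prints the vault URI, which is the DNS name to save.

```shell
# Sketch: create a key vault and print its DNS name (vault URI).
# Vault and resource-group names are hypothetical examples.
az keyvault create \
  --name contosod365datafeedpoc \
  --resource-group contoso-d365-rg \
  --location westus2

# The vaultUri property is the DNS name to save for later steps.
az keyvault show --name contosod365datafeedpoc \
  --query properties.vaultUri --output tsv
```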
Add secrets to the key vault
You will create three secrets in the key vault, one for each of the values that you saved in the previous steps. For each secret, provide a secret name and the corresponding value.
| Suggested secret name | Secret value that you saved earlier | Example secret value |
|---|---|---|
| app-id | The ID of the application created earlier. | 8936e905-197b-xxx-xxxx-xxxxxxxxx |
| app-secret | The client secret specified earlier. | NaeIxxxxxxx---xxxx7eixxx~1g- |
| storage-account-name | The name of the storage account created earlier. | contosod365datalake |
You will need to complete the following steps three times, once for each secret.
- In the Azure portal, go to the key vault you created earlier and on the left navigation pane, select Secrets.
- Select Generate/Import, and in the Create a secret dialog box, in the Upload options field, select Manual.
- Enter a name for the secret. See the table in the introduction of this section for suggested names.
- Copy and paste the corresponding secret value in the Value field.
- Select Enabled, and then select Create.
The new secret appears in the list of secrets.
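The three secrets can also be added from the Azure CLI, as in this sketch. The vault name and the values in angle brackets are placeholders you must replace.

```shell
# Sketch: store the three secrets in the key vault with the Azure CLI.
# The vault name and the angle-bracket values are placeholders.
az keyvault secret set --vault-name contosod365datafeedpoc \
  --name app-id --value "<application-id>"
az keyvault secret set --vault-name contosod365datafeedpoc \
  --name app-secret --value "<client-secret>"
az keyvault secret set --vault-name contosod365datafeedpoc \
  --name storage-account-name --value "contosod365datalake"
```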
Authorize the application to read secrets in the key vault
- In the Azure portal, open the key vault that you created earlier.
- On the left navigation pane, select Access policies, and then select Add Access Policy to create a new policy.
- In the Add access policy dialog box, in the Select principal field, locate and select the application Microsoft Dynamics ERP Microservices, and then click Select.
If you can't find Microsoft Dynamics ERP Microservices, see the section about creating the service principal earlier in this document.
- In the Secret permissions field, select Get and List.
- In the Add access policy dialog box, select Add.
- Select Save.

The application should now appear in the list of access policies for your key vault.

| Application | Secret permissions |
|---|---|
| Microsoft Dynamics ERP Microservices | Get, List |
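As an alternative to the portal, the access policy for the microservice can be granted from the Azure CLI with a sketch like the following. The vault name is hypothetical; the --spn value is the well-known application ID from earlier.

```shell
# Sketch: grant the Microsoft Dynamics ERP Microservices service principal
# Get and List permissions on secrets in the key vault.
az keyvault set-policy \
  --name contosod365datafeedpoc \
  --spn 0cdb527f-a8d1-4bf8-9436-b352c68682b2 \
  --secret-permissions get list
```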
Power Platform integration
If this is the first time you are installing add-ins in this environment, you may need to enable the Power Platform integration for this environment. There are two options to set up Power Platform integration in Finance and Operations app environments.
Option 1: Set up Power Platform integration using LCS
To set up Power Platform integration from LCS, see Add-ins overview.
Option 2: Set up Power Platform integration using the Dual-write wizard
Another way to set up Power Platform integration is to create a Power Platform environment with a database and then use the Dual-write setup. Complete the following steps to create the Power Platform environment and complete the integration.
- Create an environment with database.
- Complete the requirements and prerequisites.
- Use the dual-write wizard to link your environment.
- Validate that the Power Platform integration is set up and added in the LCS environment page.
Install the Export to Data Lake add-in in LCS
Before you can export data to your Data lake from your Finance and Operations apps, you must install the Export to Data Lake add-in in LCS. To complete this task, you must be an environment administrator in LCS for the environment that you want to use.
Gather the following information before you start, and keep it handy.
| Information you need for the Export to Data Lake add-in | Where you can find it | Example |
|---|---|---|
| Your environment's Azure AD tenant ID | Sign in to the Azure portal, open the Azure Active Directory service, open the Properties page, and copy the value in the Directory ID field. | 72f988bf-0000-0000-00000-2d7cd011db47 |
| DNS name of your key vault | You saved this value earlier. Enter the DNS name of your key vault. | https://contosod365datafeedpoc.vault.azure.net/ |
| The secret that contains the name of your storage account | If you used the suggested name, enter storage-account-name. If not, enter the secret name you defined. | storage-account-name |
| The secret that contains the application ID | If you used the suggested name, enter app-id. If not, enter the secret name you defined. | app-id |
| The secret that contains the application secret | If you used the suggested name, enter app-secret. If not, enter the secret name you defined. | app-secret |
- Sign in to LCS and navigate to your environment.
- On the Environment page, select the Environment add-ins tab. If Export to Data Lake appears in the list, the add-in is already installed, and you can skip the rest of this procedure. Otherwise, complete the remaining steps.
- Select Install a new add-in, and in the dialog box, select Export to Data Lake. If Export to Data Lake isn't listed, the feature might not be available for your environment at this time.
- In the Setup add-in dialog box, enter the required information. To answer the questions, you must already have a storage account. If you don't already have a storage account, create one, or ask your admin to create one on your behalf.
- Accept the terms of the offer by selecting the check box, and then select Install.
The system installs and configures the data lake for the environment. After installation and configuration are complete, you should see Azure Data Lake listed on the Environment page.
The error UnableToInitializeLakeDueToUserError indicates that the Export to Data Lake service can't connect to the storage account, or that the application doesn't have the required access to it. To resolve this issue, try the following steps:
- Validate that the secret values stored in the key vault are valid and correct. For more information, see the Add secrets to the key vault section.
- Validate that the Azure Active Directory (Azure AD) application has the required access to the storage account. For more information, see the Grant access control roles to applications section.