Install Export to Azure Data Lake add-in
The Export to Data Lake add-in is generally available in the United States, Canada, United Kingdom, Europe, Southeast Asia, East Asia, Australia, India, and Japan regions. If your Finance and Operations environment is in one of those regions, you can install the Export to Data Lake add-in in it. Microsoft will enable this feature in additional regions in the future. You can join the preview Yammer group to stay informed and to ask questions about the feature and upcoming improvements.
The Export to Data Lake feature isn't available in Tier-1 (developer) environments. You must have a cloud-based Tier-2 or higher sandbox environment to enable this feature. However, you can prototype the feature in a Tier-1 (developer) environment by using tools available on GitHub. These tools let you export data from your Tier-1 or sandbox environment into a data lake in the same format that the feature exports.
Before you can use the Export to Data Lake feature in Finance and Operations environments, your administrator must install the Export to Data Lake add-in and connect your environment to a data lake. The add-in is installed via Microsoft Dynamics Lifecycle Services (LCS); if you aren't an LCS administrator, contact one to perform this operation.
The Export to Data Lake add-in requires connection information for your data lake. Therefore, before you install it, you must create a storage account (that is, an Azure data lake) if you haven't already created one. To create the required Azure resources, you might have to contact an administrator who can create Azure resources on your behalf.
The following step-by-step instructions will guide you through the process.
Create Service Principal for Microsoft Dynamics ERP Microservices
The Export to Azure Data Lake feature is built on a microservice that exports Finance and Operations app data to Azure Data Lake and keeps the data fresh. The microservice uses an Azure service principal, Microsoft Dynamics ERP Microservices, to securely connect to your Azure resources. Before you configure the Export to Data Lake feature, add the Microsoft Dynamics ERP Microservices service principal to your Azure Active Directory (Azure AD) tenant. This step enables Azure AD to authenticate the microservice.
You will need Azure Active Directory global administrator rights to perform these steps.
To add the service principal, complete the following steps.
Launch the Azure portal and go to the Azure Active Directory.
On the left menu, select Manage > Enterprise applications, and search for the following application.

| Application | App ID |
|---|---|
| Microsoft Dynamics ERP Microservices | 0cdb527f-a8d1-4bf8-9436-b352c68682b2 |

If you can't find the application, complete the following steps.
On your local machine, open the Start menu, and search for PowerShell.
Right-click Windows PowerShell, and then select Run as administrator.
Run the following command to install the AzureAD module:
Install-Module -Name AzureAD
- If NuGet provider is required to continue, select Y to install it.
- If an Untrusted repository message appears, select Y to continue.
Run the following command to add the application to your Azure Active Directory tenant:
New-AzureADServicePrincipal -AppId '0cdb527f-a8d1-4bf8-9436-b352c68682b2'
Sign in as the Azure Active Directory administrator when prompted.
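If you prefer scripting this step with the Azure CLI instead of PowerShell, the following sketch shows one way to do it. It assumes the Azure CLI is installed and that you sign in with an account that has Azure AD global administrator rights; it is an illustration, not the documented procedure.

```shell
# Sign in interactively as an Azure AD global administrator.
az login

# Create a service principal in your tenant for the first-party
# application Microsoft Dynamics ERP Microservices.
az ad sp create --id 0cdb527f-a8d1-4bf8-9436-b352c68682b2

# Confirm that the service principal now exists in the tenant.
az ad sp show --id 0cdb527f-a8d1-4bf8-9436-b352c68682b2 --query displayName
```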
Configure Azure Resources
To configure the export to Data Lake, create a storage account in your own Azure subscription; this storage account will store the exported data. Next, register an Azure AD application that grants access to the root of that storage account. Your Finance and Operations app will use this Azure AD application to access the storage account, create the folder structure, and write data. Finally, create a key vault in your subscription, and store the storage account name, the application ID, and the application secret in it. If you don't have permission to create resources in the Azure portal, you will need assistance from someone in your organization who has the required permissions.
The steps, most of which take place in the Azure portal, are as follows. While working in the Azure portal, you will be instructed to save several values for subsequent steps. You will also provide some of these values to your Finance and Operations apps by using Lifecycle Services (LCS); you will need administrator access to LCS to do this.
- Create an application in Azure Active Directory
- Create a Data Lake Storage (Gen2) account in your subscription
- Grant access control roles to applications
- Create a key vault
- Add secrets to the key vault
- Authorize the application to read secrets in the key vault
- Power Platform integration
- Install the Export to Data Lake add-in in LCS
Create an application in Azure Active Directory
1. In the Azure portal, select Azure Active Directory, and then select App registrations.
2. Select New registration, and enter the following information:
   - Name: Enter a name for the app.
   - Supported account types: Choose the appropriate option.
3. On the left navigation pane, select API permissions.
4. Select Add a permission, and in the Request API permissions dialog box, select Azure Key Vault.
5. Select Delegated permissions, select user_impersonation, and then select Add permissions.
6. On the left navigation pane, select Certificates & secrets, and then select New client secret.
7. In the Description field, enter a name.
8. In the Expires field, select an option, and then select Add. The system will generate a secret and display it under the grid. Copy and save this value now, because it can't be retrieved after you leave the page.
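As a rough sketch, the app registration and client secret can also be created with the Azure CLI. The display name below is a hypothetical placeholder, and the commands assume an authenticated az session with rights to register applications; the Azure Key Vault delegated permission described above still has to be granted separately (for example, in the portal).

```shell
# Register a new Azure AD application. "d365-datalake-app" is a
# hypothetical name; use whatever your organization prefers.
az ad app create --display-name "d365-datalake-app"

# Look up the application (client) ID of the new registration.
appId=$(az ad app list --display-name "d365-datalake-app" \
  --query "[0].appId" -o tsv)

# Create a client secret. The secret value is shown only once,
# so copy it from the output and store it safely.
az ad app credential reset --id "$appId" --years 1
```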
Create a Data Lake Storage (Gen2) account in your subscription
The Data Lake Storage account will be used to store data from your Finance and Operations apps. To manually create a storage account, you must have administrative rights to your organization's Azure subscription. To create a storage account, complete the following steps.
1. In the Azure portal, select Create a resource, and then search for and select Storage account – blob, file, table, queue.
2. In the Create storage account dialog box, provide values for the following fields:
   - Location: Select the data center where your environment is located. If the data center that you select is in a different Azure region, you may incur additional data movement costs. If your Microsoft Power BI or your data warehouse is in a different region, you can use replication to move storage between regions.
   - Performance: We recommend that you select Standard.
   - Account kind: You must select StorageV2. This account kind makes the Data Lake Storage Gen2 option available on the Advanced tab.
3. On the Advanced tab, under Data Lake Storage Gen2, set Hierarchical namespace to Enabled. If you leave this option disabled, the Export to Data Lake feature will fail with an error.
4. Select Review + create. When the deployment is complete, the new resource is shown in the Azure portal.
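The portal steps above can be sketched with the Azure CLI as follows. The resource group, account name, region, and SKU are hypothetical placeholders; the commands assume an authenticated az session with permission to create resources.

```shell
# Hypothetical resource group and region; substitute your own.
az group create --name d365-datalake-rg --location westus2

# Create a general-purpose v2 (StorageV2) account with the
# hierarchical namespace enabled, which makes it a Data Lake
# Storage Gen2 account, as the feature requires.
az storage account create \
  --name contosod365datalake \
  --resource-group d365-datalake-rg \
  --location westus2 \
  --sku Standard_LRS \
  --kind StorageV2 \
  --enable-hierarchical-namespace true
```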
Grant access control roles to applications
You need to grant your application permission to read and write to the storage account. These permissions are granted by assigning Azure roles on the storage account.
1. In the Azure portal, open the storage account that you created earlier.
2. On the left navigation pane, select Access Control (IAM).
3. On the Access control page, select the Role assignments tab.
4. Select Add at the top of the page, and then select Add role assignment.
5. In the Add role assignment dialog box, in the Role field, select Storage blob data contributor.
6. In the Select field, select the application that you registered earlier.
7. Don't make any changes to the Assign access to and Azure AD user, group, or service principal fields.
8. Repeat steps 4 through 7 to add the Storage blob data reader role.
9. Validate the storage account role assignments for the application, as shown in the following table.

| Application | Role |
|---|---|
| The application you created earlier | Storage blob data contributor |
| The application you created earlier | Storage blob data reader |
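If you script your Azure setup, the two role assignments can be granted with the Azure CLI along these lines. The application ID, storage account, and resource group names are hypothetical placeholders; an authenticated az session with rights to assign roles is assumed.

```shell
# Hypothetical client ID of the application you registered earlier.
appId="00000000-0000-0000-0000-000000000000"

# Resolve the storage account's resource ID to use as the scope.
scope=$(az storage account show --name contosod365datalake \
  --resource-group d365-datalake-rg --query id -o tsv)

# Grant the application both blob data roles on the storage account.
az role assignment create --assignee "$appId" \
  --role "Storage Blob Data Contributor" --scope "$scope"
az role assignment create --assignee "$appId" \
  --role "Storage Blob Data Reader" --scope "$scope"
```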
Create a key vault
A key vault is a secure way to share details, such as the storage account name, with your Finance and Operations apps. Complete the following steps to create a key vault and its secrets. We recommend that you create a key vault dedicated to the Export to Data Lake feature, rather than using the same key vault to provide access to multiple services.
- In the Azure portal, select Create a resource, and then search for and select Key Vault.
- In the Create key vault dialog box, in the Location field, select the datacenter where your environment is located.
- After the key vault is created, select it from the list, and on the left navigation pane, select Overview.
- Save the value in the DNS name field. You will need this value later.
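Equivalently, a key vault can be created and its DNS name retrieved with the Azure CLI, as in this sketch (the vault and resource group names are hypothetical placeholders; key vault names must be globally unique):

```shell
# Create a key vault dedicated to the Export to Data Lake feature.
az keyvault create \
  --name contoso-d365-kv \
  --resource-group d365-datalake-rg \
  --location westus2

# Print the vault's DNS name (vault URI); save it for later.
az keyvault show --name contoso-d365-kv \
  --query properties.vaultUri -o tsv
```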
Add secrets to the key vault
You will create three secrets in the key vault, one for each of the values that you saved in the previous steps. For each secret, you will provide a secret name and the corresponding value.

| Suggested secret name | Secret value that you saved earlier | Example secret value |
|---|---|---|
| app-id | The ID of the application that you created earlier | 8936e905-197b-xxx-xxxx-xxxxxxxxx |
| app-secret | The client secret that you created earlier | NaeIxxxxxxx---xxxx7eixxx~1g- |
| storage-account-name | The name of the storage account that you created earlier | contosod365datalake |
You will need to complete the following steps three times, once for each secret.
- In the Azure portal, go to the key vault you created earlier and on the left navigation pane, select Secrets.
- Select Generate/Import, and in the Create a secret dialog box, in the Upload options field, select Manual.
- Enter a name for the secret. See the table in the introduction of this section for suggested names.
- Copy and paste the corresponding secret value in the Value field.
- Select Enabled, and then select Create.
The new secret appears in the list of secrets.
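The three secrets can also be added from the command line. The following Azure CLI sketch uses hypothetical placeholder values; substitute the real values you saved earlier.

```shell
# Store each value under its suggested secret name.
az keyvault secret set --vault-name contoso-d365-kv \
  --name app-id --value "00000000-0000-0000-0000-000000000000"
az keyvault secret set --vault-name contoso-d365-kv \
  --name app-secret --value "<your-client-secret>"
az keyvault secret set --vault-name contoso-d365-kv \
  --name storage-account-name --value "contosod365datalake"
```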
Authorize the application to read secrets in the key vault
1. In the Azure portal, open the key vault that you created earlier.
2. On the left navigation pane, select Access policies, and then select Add Access Policy to create a new policy.
3. In the Add access policy dialog box, in the Select principal field, locate and select the application Microsoft Dynamics ERP Microservices, and then click Select. If you can't find Microsoft Dynamics ERP Microservices, see the Create Service Principal section earlier in this document.
4. In the Secret permissions field, select Get and List.
5. Select Add to save the access policy.
You should see the application with access to your key vault, as shown in the following table.

| Application | Secret permissions |
|---|---|
| Microsoft Dynamics ERP Microservices | Get, List |
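For a scripted setup, the same access policy can be created with the Azure CLI, as in this sketch (the vault name is a hypothetical placeholder; the app ID is the Microsoft Dynamics ERP Microservices application ID from earlier in this article):

```shell
# Allow the Microsoft Dynamics ERP Microservices service principal
# to read (get) and enumerate (list) secrets in the vault.
az keyvault set-policy --name contoso-d365-kv \
  --spn 0cdb527f-a8d1-4bf8-9436-b352c68682b2 \
  --secret-permissions get list
```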
Power Platform integration
If this is the first time you are installing add-ins in this environment, you may need to enable the Power Platform integration for this environment. There are two options to set up Power Platform integration in Finance and Operations app environments.
Option 1: Set up Power Platform integration using LCS
To set up Power Platform integration from LCS, see Add-ins overview.
Option 2: Set up Power Platform integration using the Dual-write wizard
Another way to set up Power Platform integration is to create a Power Platform environment with a database and then use the Dual-write setup. Complete the following steps to create the Power Platform environment and complete the integration.
- Create an environment with a database.
- Complete the requirements and prerequisites.
- Use the dual-write wizard to link your environment.
- Validate that the Power Platform integration is set up and shown on the LCS environment page.
If you use this approach, you must select a Power Platform environment that is in the same region as your Finance and Operations environment. If you select a Power Platform environment that is in a different region, installation of the add-in might fail.
Install the Export to Data Lake add-in in LCS
Before you can export data to your data lake from your Finance and Operations apps, you must install the Export to Data Lake add-in in LCS. To complete this task, you must be an environment administrator in LCS for the environment that you want to use.
You will need the following information before you start, so keep it handy.
| Information you need for the Export to Data Lake add-in | Where you can find it | Example |
|---|---|---|
| Your environment's Azure AD tenant ID | Sign in to the Azure portal, open the Azure Active Directory service, open the Properties page, and copy the value in the Directory ID field. | 72f988bf-0000-0000-00000-2d7cd011db47 |
| The DNS name of your key vault | You saved this value earlier. | |
| The secret that contains the name of your storage account | If you used the suggested name, enter storage-account-name. Otherwise, enter the secret name you defined. | storage-account-name |
| The secret that contains the application ID | If you used the suggested name, enter app-id. Otherwise, enter the secret name you defined. | app-id |
| The secret that contains the application secret | If you used the suggested name, enter app-secret. Otherwise, enter the secret name you defined. | app-secret |
- Sign in to LCS and navigate to your environment.
- On the Environment page, select the Environment add-ins tab. If Export to Data Lake appears in the list, the add-in is already installed, and you can skip the rest of this procedure. Otherwise, complete the remaining steps.
- Select Install a new add-in, and in the dialog box, select Export to Data Lake. If Export to Data Lake isn't listed, the feature might not be available for your environment at this time.
- In the Setup add-in dialog box, enter the required information. To answer the questions, you must already have a storage account. If you don't already have a storage account, create one, or ask your admin to create one on your behalf.
- Accept the terms of the offer by selecting the check box, and then select Install.
The system installs and configures the data lake for the environment. This operation might take a few minutes. After installation and configuration are completed, Export to Data Lake should be listed on the Environment page, and the status should be Installed. If a different status is shown, see the "Troubleshooting" section that follows.
Add-in installation isn't completed within a few minutes
In some cases, add-in installation might show a status of Installing or Configuring for more than 10 minutes. The cause of the delay might be a configuration issue or a missing parameter. In this case, select the Abort option, and then follow the steps to install the add-in again.
Add-in installation fails
In some cases, add-in installation might show a status of Installation failed. When installation fails, an error code and error message are shown. The "Resolution" column in the following table provides suggestions that can help you correct the reason for the failure. To correct the issue, select the Abort option, and then follow steps to install the add-in again.
| Error code and message | Resolution |
|---|---|
| AppidUserError: Failed to find Application ID to access the data lake. Application ID provided is incorrect or can't be found. | The application ID (app-id) that is provided in the key vault can't be found in Azure AD. Validate the application ID by following the steps in Configure export to Azure Data Lake - Create Application. You might have to contact the system administrator or the administrator who configured Azure resources. |
| AppSecretUserError: Failed to access data lake with given Application ID and Application secret. | The application ID (app-id) and application secret (app-secret) that are provided can't be used to access the storage account. Validate the application ID and application secret by following the steps in Configure export to Azure Data Lake - Create Application. Next, verify that the application has the required access to the storage account. For more information, see Configure export to Azure Data Lake - Grant access. You might have to contact the system administrator or the administrator who configured Azure resources. |
| StorageNameUserError: Failed to access the storage account using the storage name provided in the key vault. | The storage account that is provided in the key vault can't be found, or it isn't valid. Verify that the correct storage account name is entered in the key vault, and that you've provided the correct secret name for the storage account, by following the steps in Configure export to Azure Data Lake - Add secrets. |
| KeyVaultUserError: Failed to access the key vault or the key vault secrets. | The service can't access the key vault or the secrets in it. Verify that your Azure subscription hasn't expired. Verify that you've created the service principal by following the steps in Configure export to Azure Data Lake - Create service principal. Verify that the key vault contains all the required secrets by following the steps in Configure export to Azure Data Lake - Add secrets. Verify that you've provided the correct key vault URI in the configuration steps in Configure export to Azure Data Lake - Install add-in. |
| TenantIdUserError: Failed to locate the Azure Tenant ID for the environment. | Verify that you've provided the correct Azure tenant ID by following the steps in Configure export to Azure Data Lake - Install add-in. |