How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. The SharePoint drive holds around 60k files, but through our existing pipeline we are getting only 600 files. We have a Web activity with the below…
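Getting 600 files out of 60k is the classic signature of un-followed paging: Microsoft Graph drive listings return one page of results plus an `@odata.nextLink` URL, and a single Web activity call only ever sees the first page. A minimal sketch of the loop that is needed, assuming the pipeline calls a Graph listing endpoint (`fetch_page` stands in for an HTTP GET returning the parsed JSON body):

```python
def list_all_items(fetch_page, url):
    """Follow @odata.nextLink until the listing is exhausted."""
    items = []
    while url:
        body = fetch_page(url)             # one HTTP GET per page
        items.extend(body.get("value", []))
        url = body.get("@odata.nextLink")  # absent on the last page
    return items
```

Inside ADF itself, the equivalent is an Until activity that re-issues the Web activity with the `@odata.nextLink` value from the previous response until that property is no longer returned.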
Azure to AWS
Hello. We need to transfer files in batches from ADLS to AWS (an S3 bucket) for a SAS application hosted by a third party. We need to ensure data security and follow best practices. My understanding is that we can use ADF to create a linked service for AWS S3, but it does…
Assert error output not writing to blob
Hello. I have built a pipeline which gathers data from a .csv file and then goes through a couple of Assert activities to check the validity of the data. Valid rows are supposed to be inserted into the table, while assert-failure rows are set up to be…
How can I find help obtaining a certification?
Indeed, I am a student in cloud computing in Morocco, and I do not have the means to pay for the AZ-900 certification; my family also does not have the financial means. Could someone help me obtain this certification, please?
Possible bug or issue in Synapse dedicated SQL pool when exporting parquet files
I'm not sure if this is really a bug, but it's definitely a frustration for me at least ;-). When trying to write data from Synapse dedicated SQL pools to data lake storage as parquet files (using a CETAS statement), it produces files with non-standard…
Delete the file from SharePoint location
Hi All, I am trying to copy files from SharePoint to ADLS, following the pipeline in the URL below to achieve the copy functionality: https://www.syntera.ch/blog/2022/10/10/copy-files-from-sharepoint-to-blob-storage-using-azure-data-factory/ I need…
How to sync Azure Data Lake Storage with a SharePoint drive?
We have copied files from multiple drives on a SharePoint site, with nested folders, into an Azure Data Lake Storage container, maintaining the same folder structure. Now I want to create a pipeline in Azure Data Factory to delete any file from the ADLS container which is not…
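The delete side of such a sync reduces to a set difference: anything present in ADLS but no longer present in SharePoint is a candidate for deletion. A minimal sketch, assuming both sides have already been listed into comparable relative paths (the listing itself is out of scope here):

```python
def files_to_delete(adls_paths, sharepoint_paths):
    """Relative paths present in ADLS but missing from SharePoint."""
    return sorted(set(adls_paths) - set(sharepoint_paths))
```

In ADF this could drive a ForEach over the result with a Delete activity per path; comparing normalized relative paths (same root stripped on both sides) avoids false mismatches.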
Changing Synapse Notebook variable values during deployment in Azure DevOps
How can I change the variable values in a Synapse Notebook, such as storage account name, container name, and file location, during deployment from dev to prod in Azure DevOps? The notebook is already being used by other notebooks in my dev environment.
Error trying to validate storage account name while creating a new Synapse workspace
I am unable to create a new Data Lake Storage (Gen2) account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. Please try again". The error message is…
How to use a user-delegation SAS token to load parquet table from ADLS gen2?
Now I have a parquet table stored in ADLS Gen2: abfss://mycontainer@mystorage.dfs.core.windows.net/folder1/table1. This is a read-only table, and I want to restrict my service principal to have read access to this table only. So I use ACLs to grant…
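For orientation, the standard ADLS Gen2 URI scheme is `abfss://<container>@<account>.dfs.core.windows.net/<path>`. The helper below is pure string handling and runs offline; the user-delegation SAS issuance itself needs `azure-storage-blob` and live Entra credentials, so it is only outlined in comments (a sketch, not the full implementation):

```python
from urllib.parse import urlparse

def parse_abfss(url):
    """Split abfss://<container>@<account>.dfs.core.windows.net/<path>
    into (account, container, path)."""
    p = urlparse(url)
    container, host = p.netloc.split("@", 1)
    account = host.split(".", 1)[0]
    return account, container, p.path.lstrip("/")

# With those parts, a user-delegation SAS (signed by an Entra identity
# rather than the account key) would be issued roughly like:
#   key = blob_service_client.get_user_delegation_key(start, expiry)
#   sas = generate_blob_sas(account, container, blob_path,
#                           user_delegation_key=key, permission="r", ...)
# and appended to the https://<account>.blob.core.windows.net/... URL.
```

Note that a SAS scoped to permission "r" and a path-level ACL are enforced independently: the effective access is the intersection of the two.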
Architecture best practices to store data from web scraping and use it in analytics
I am scraping data from websites, and I want to export the scraped data as CSV files, then store them in Azure Data Lake, then apply ETL; the final data output will be used for Power BI reports and for machine learning. Do I need to use Azure Synapse?…
How to set up a modern architecture for a small/medium business?
Currently we're using the following setup, which is slow both at processing the data and on the Power BI side: an Azure VM for third parties to upload via SFTP; a C# script to ETL data into Azure SQL Server and move files to ADLS Gen2; a Power BI report pulling…
How can I read CSV files which are in nested folders and copy them while preserving the hierarchy?
I have csv.gz files which are partitioned this way: /2024/01/01/xyz/x.csv, /2024/01/01/yza/y.csv, /2024/01/01/zab/z.csv. There are files for several years, and I want to copy all those files using ADF while maintaining the folder structure and hierarchy…
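"Preserve hierarchy" copy behaviour in the ADF Copy activity amounts to keeping each file's path relative to the wildcard root. A minimal sketch of that mapping (the sink root name is illustrative):

```python
def sink_path(source_path, source_root, sink_root):
    """Map a source file path to the sink, preserving the folder
    structure below source_root."""
    rel = source_path[len(source_root):].lstrip("/")
    return f"{sink_root}/{rel}"
```

In the Copy activity itself this corresponds to a wildcard source path with Copy behaviour set to "Preserve hierarchy"; no per-file logic is needed there.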
Azure Synapse Link for Dataverse to bring Dynamics 365 data to ADLS and Synapse for analytical purposes - facing issues and inconsistencies in the feature
We are using the Synapse Link for Dataverse feature to bring Dynamics 365 data to ADLS and Synapse Analytics. We have the below open issues, due to which we are unable to finalize the solution: in the F&O-linked Dataverse environment, we have created a Synapse…
VPN and networking config for secure upload to ADLS Gen2 storage account?
I'm looking into network architectures and security for implementing secure access from user laptops into Azure to upload files to ADLS Gen2 data lake blob containers. We have no on-prem network or AD - just individual user laptops. We do have MS Entra…
How can I create a linked service in ADF for SharePoint Online?
I want to extract files from SharePoint to ADLS using only ADF. I followed a few steps. Step 1: Azure Active Directory -> registered a new app -> created a new client secret. I have the Tenant ID, Client ID (App ID), and secret. Step 2: SharePoint Online ->…
How can I get all files inside a drive irrespective of folder structure in ADF?
I want to copy files from a SharePoint drive which has lots of nested folders; the maximum folder hierarchy is 12 levels deep. Currently I'm using the below endpoint in ADF's Web activity, as some articles mentioned that it provides every…
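Because a Graph `children` listing only returns one level of a folder, a full export of a 12-level hierarchy needs either recursion or the drive `delta` endpoint (`/drives/{drive-id}/root/delta`), which enumerates every item in a drive without recursing. A sketch of the recursive walk, with `list_children` standing in for a hypothetical wrapper around `/drives/{drive-id}/items/{item-id}/children` that yields `(name, is_folder, id)` tuples:

```python
def walk_drive(list_children, item_id, prefix=""):
    """Collect file paths at any depth below the given drive item."""
    files = []
    for name, is_folder, child_id in list_children(item_id):
        path = f"{prefix}/{name}"
        if is_folder:
            files.extend(walk_drive(list_children, child_id, path))
        else:
            files.append(path)
    return files
```

Each real `children` call would also need the paging loop (`@odata.nextLink`), so for a one-shot full export the `delta` endpoint is usually the simpler option.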
Having a problem with the Azure sandbox storage account
Hi, I am having an issue accessing a storage account in the Azure sandbox environment. I have logged in to the Azure sandbox environment, but I am not able to access the Azure storage account even though the sandbox subscription is selected. Capture.PNG
Unable to copy SharePoint files through ADF
I am facing an issue now! I followed all the steps at https://learn.microsoft.com/en-us/azure/data-factory/connector-sharepoint-online-list?tabs=data-factory and tried to read PDF files from SharePoint through ADF. Source: SharePoint (PDF files) (I selected…
Special character handling for file processing
Hello, I have some CSV files as feeds into data lake storage. These files contain data/records with some special characters, e.g. '/'. We need to process the files from one container to another, and we will need to remove some of these special…
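Stripping characters like '/' is safest done field by field through a CSV parser, so delimiters and quoting survive untouched. A minimal sketch (the character set is an assumption; adjust it to the feed's spec):

```python
import csv
import io

def scrub(text, chars="/"):
    """Remove the given characters from every field of a CSV string,
    preserving the row/column structure."""
    table = str.maketrans("", "", chars)
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in csv.reader(io.StringIO(text)):
        writer.writerow([field.translate(table) for field in row])
    return out.getvalue()
```

In an ADF context the same transformation could live in a Mapping Data Flow derived column using `replace()`, but a notebook or Azure Function gives finer control over quoting and edge cases.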