Azure Blob Storage Gen2
I need to restrict file uploads to Blob storage to a maximum of 18 MB and allow only the .txt extension
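Blob storage itself doesn't enforce per-file size or extension policies, so a common approach is to validate before uploading (client-side, or in an Azure Function that issues the SAS). A minimal sketch of that validation, assuming the 18 MB / .txt policy from the question (the function name and constants are illustrative, not an Azure API):

```python
import os

MAX_SIZE_BYTES = 18 * 1024 * 1024   # 18 MB limit from the question
ALLOWED_EXTENSIONS = {".txt"}       # only plain-text files allowed

def is_upload_allowed(path: str) -> bool:
    """Return True only for files that pass the size/extension policy."""
    _, ext = os.path.splitext(path)
    if ext.lower() not in ALLOWED_EXTENSIONS:
        return False
    return os.path.getsize(path) <= MAX_SIZE_BYTES
```

A file that passes the check would then be uploaded normally, e.g. with `BlobClient.upload_blob` from the `azure-storage-blob` package.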
Azure Logic App: extract a file from SharePoint to ADLS
Hi, I am working on extracting a file from SharePoint to ADLS; the file is 1 GB and I need to copy it to ADLS. I am currently using a Standard Logic App because the file size exceeds the default limit. After that I tried adding the file size in…
Download data from Cosmos via Visual Studio
When I download data from Cosmos from Azure Data Lake Storage in Visual Studio, it shows the error: "The stream URL is invalid"
Copy Data activity fails when copying from ADLS to ADLS
I'm currently attempting to use a Copy Data activity to copy a .json file from what we treat as a "staging area" of our data over to our data lake, all within Azure Data Lake Storage. The files start out in .json format but are copied over…
Can't deploy a Python function with the module azure-identity or azure-storage-filedatalake
I'm trying to deploy a function that I've been working on locally. Whenever I add the module "azure-storage-filedatalake" or the module "azure-identity", my function no longer appears on the Function App, but when I remove this…
ErrorCode=UserErrorUnzipInvalidFile,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file 'All Weekly 18.02.2024.zip' is not a valid Zip file with Deflate compression method.,Source=Microsoft.DataTransfer.ClientLibrary,''Type
While copying files from SFTP to ADLS using an ADF Copy activity, the pipeline fails when unzipping from SFTP to ADLS.
How to write to Data Lake Gen2 storage in Delta format using Databricks when connected with a SAS token
I converted the data from Parquet to Delta format. Now I want to write the data to Blob storage (Data Lake Gen2), but I am facing the error below while writing. I used the following command to write my data: output_path =…
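For the SAS connection itself, Databricks documents per-storage-account Spark settings for the ABFS driver. A sketch, assuming a storage account named `mystorageacct`, a container `mycontainer`, and a `sas_token` variable holding the token (all placeholders, not values from the question):

```python
# Configure ABFS to authenticate against one storage account with a
# fixed SAS token (account name, container, and token are placeholders).
spark.conf.set(
    "fs.azure.account.auth.type.mystorageacct.dfs.core.windows.net", "SAS")
spark.conf.set(
    "fs.azure.sas.token.provider.type.mystorageacct.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(
    "fs.azure.sas.fixed.token.mystorageacct.dfs.core.windows.net", sas_token)

# Then write the converted DataFrame in Delta format via abfss://
df.write.format("delta").mode("overwrite").save(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/delta/output")
```

This is a cluster configuration fragment; it needs a running Spark session with the hadoop-azure ABFS driver, and the SAS token must grant write permission on the container.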
How to fix "LeaseIdMissing" or "LeaseNotPresent" Error in Synapse pipeline
Hello, for the past couple of weeks we have been facing issues with data lake loads. As part of the process, we delete historical data from the data lake and reload it through Synapse pipelines. When trying to delete the historical data…
Proper cleanup of the Spark _temporary directory after an exception
We have a data collection app that periodically queries some data source, then writes the result to ADLS: df.write.mode('append').option('compression', 'gzip').parquet(rawDbPath) Every other week, the data source does some maintenance…
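When a job dies mid-write, Spark's file output committer can leave a `_temporary` directory behind, and a later `append` run can trip over it. One pattern is to sweep those leftovers before retrying. A local-filesystem sketch of the sweep (on ADLS from Databricks you would do the same walk with `dbutils.fs` or the Hadoop FileSystem API; the function below is illustrative):

```python
import os
import shutil

def clean_temporary_dirs(root: str) -> list:
    """Remove leftover Spark `_temporary` directories under `root`.

    Returns the removed paths so the caller can log them.
    """
    removed = []
    for dirpath, dirnames, _ in os.walk(root):
        if "_temporary" in dirnames:
            target = os.path.join(dirpath, "_temporary")
            shutil.rmtree(target)          # drop the partial commit data
            dirnames.remove("_temporary")  # don't descend into removed dir
            removed.append(target)
    return removed
```

The usual shape is to wrap the `df.write` in try/except, run the sweep on failure, and then retry the write.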
Urgent: Reuse Azure Storage Account Name After Deletion
Hello, I created and deleted an Azure Data Lake Storage Gen2 account in tenant #1 more than 14 days ago. Now I am trying to create an Azure Data Lake Storage Gen2 account in tenant #2 with the same name, but I keep getting the error message below: The storage…
Understanding the Structure of Incremental D365 FO data in Data Lake Gen2
I am a data engineer new to working with Azure and I have set up an ETL process to read incremental data out of Data Lake Gen2 storage and push to Azure SQL Database. I am using Azure Synapse Link to expose Dynamics 365 FO tables to the data lake. I'm…
I am using the Student version of Microsoft Azure. Can I create a data lake and data warehouse with this version?
I am a student at CIT, Coimbatore. I want to explore data warehouse concepts, so I chose the Microsoft Azure student version. Can I create a warehouse in this version? If not, please suggest other platforms for creating effective warehouses for free.
Migrate Storage Account (Gen2 Datalake) from GRS to ZRS
Trying to move a data lake from GRS to ZRS. The documentation says to convert to LRS and then request a migration, but when I try to request a migration the options are not present as documented. It no longer seems possible to request this. Can…
Logic app to retrieve the latest file from a blob folder
How can I create a Logic App that retrieves the latest file from a blob folder when an HTTP request is received, where there are multiple files, and sends it as an attachment? Are there any specific steps or configurations required for this process?
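The Logic App side is usually the Blob connector's "List blobs" action followed by picking the newest entry by its last-modified timestamp before attaching it to the email. The selection step itself is simple; a sketch in Python over the kind of name/timestamp pairs a listing returns (the dict shape is an assumption for illustration, not the connector's exact schema):

```python
def latest_blob(blobs):
    """Return the entry with the newest 'last_modified', or None if empty.

    `blobs` is any iterable of dicts carrying 'name' and 'last_modified'.
    """
    return max(blobs, key=lambda b: b["last_modified"], default=None)
```

In the Logic App designer the same effect is typically achieved with a Filter/Sort over the "List blobs" output, or by looping and keeping the max timestamp in a variable.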
Synapse Serverless CETAS fails with error "Fatal exception occurred: bad allocation".
Hello, I am trying to create an external table (CETAS) from a large number of fairly small JSON files so that they can be queried more efficiently. The JSON files are stored on ADLS. Previously this worked fine when I let the query run for 1 - 1.5…
In ADF, my copy activity using an HDFS linked service throws the following error
Hi, I have an issue using ADF with an HDFS linked service. I created an HDFS connection and then a copy activity from HDFS to Azure Data Lake Gen2. The source is a CSV file and the copy format is binary. When I run the pipeline I get the following error: …
Azure Data Factory out-of-memory error when reading from Salesforce
I use a Data Factory Copy activity to copy data from Salesforce to ADLS and am facing an out-of-memory error. The file is about 129k rows (800 MB). I set the block size to 100 MB and max rows per file to 100,000, but the error still exists. What can you…
ADF pipeline to read data from a UC table into an ADLS Gen2 account
Hello Team, we have a requirement to create an Azure Data Factory pipeline to read data from a UC table (access to the table has been granted to the Azure Data Factory managed identity) and copy the data into ADLS Gen2. Is there a way or an article to implement this?…
Consistent data in data lake gen2
Hi friends, I am trying to understand how data consistency works in ADLS. I have found this old…
Can I use a wildcard (*) in the middle of a file path?
Can I use a wildcard (*) in the middle of a file path when I load files from ADLS into a notebook? I got a file path like below …
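Spark's path globbing does accept `*` in the middle of a path, so a load path like `abfss://container@account.dfs.core.windows.net/2024/*/data.csv` is valid (the account and layout here are illustrative). The matching semantics can be sanity-checked locally with Python's `glob` module:

```python
import glob

def matching_files(pattern: str) -> list:
    """Expand a glob pattern (wildcards allowed mid-path) to sorted paths."""
    return sorted(glob.glob(pattern))
```

In a notebook the equivalent is simply passing the wildcard path straight to `spark.read.csv(...)` or `spark.read.load(...)`; Spark expands it the same way.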