How can I find help obtaining a certification?
I am a cloud computing student in Morocco and I do not have the means to pay for the AZ-900 certification; my family also lacks the financial means. Could someone help me obtain this certification, please?
How to call an Oracle procedure from an ADF Lookup activity
We need to call an Oracle procedure from Azure Data Factory. It seems that the Lookup activity only supports SELECT statements. How can we call the Oracle procedure without any modification on the Oracle side?
Azure Data Factory: simple file copy and rename
Hello: I have a simple requirement. I have a couple of files in an Azure Blob folder (receiving folder), and I need to copy them to another Azure Blob folder and rename them. This posting is really close to what I need to do: …
How can I check who created a Log Analytics workspace in Azure when it was created more than 90 days ago and no longer appears in the activity log?
How can I check who created a Log Analytics workspace in Azure? It was created more than 90 days ago, so we are not able to find it in the activity log. How can we check this another way? Please share how we can find this out.
how to fix Error: Spark job failed: { "text/plain": "{\"runId\":\"2325e724-f898-471d-b9b3-1f28fc560b44\",\"sessionId\":\"9c4abfad-3cc6-4429-b30a-d70b25537d29\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"java.lang.Exception:
I am getting this error while trying to use Data preview.
Unable to download files completely using the Copy activity; downloads from a website are always partial
Whenever I try to download files from a website using the Copy activity, they are only partially downloaded: 124 KB instead of 240 MB–350 MB.
Running specific pipelines from an Orchestration pipeline
How can I run individual pipelines from an Orchestration pipeline that has multiple pipelines running in parallel? I want to be able to choose which pipeline to run separately from the rest.
How to apply a wildcard/regex filename filter from SFTP using ADF?
I have a list of files on an SFTP server that I need to copy and move to a blob storage container. The filenames should match the following wildcard pattern: (FullName)[0-9](-)[0-9](-)[0-9]*(.)(CSV). Here are the steps I've taken: Created a new…
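One way to sanity-check a filename filter before wiring it into the pipeline is to test the pattern locally. The sketch below assumes the intended pattern is FullName, then digit groups separated by hyphens, ending in .CSV; the regex translation and the sample filenames are illustrative assumptions, not the asker's exact data.

```python
import re

# Hypothetical regex equivalent of the wildcard pattern
# (FullName)[0-9](-)[0-9](-)[0-9]*(.)(CSV): FullName<digits>-<digits>-<digits>.CSV
pattern = re.compile(r"^FullName\d+-\d+-\d+\.CSV$", re.IGNORECASE)

# Sample filenames (assumptions) to exercise the pattern
files = [
    "FullName12-3-456.CSV",   # should match
    "FullName-3-456.CSV",     # missing first digit group, should not match
    "Other12-3-456.CSV",      # wrong prefix, should not match
]
matches = [f for f in files if pattern.match(f)]
print(matches)
```

Note that ADF's built-in wildcard file path only supports `*` and `?`, not full regex; for regex-level filtering, a Get Metadata activity followed by a Filter activity over the child items is a common workaround.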
Azure Synapse Analytics: Failed to setup debug session
Right now I'm not able to run a data flow task in our Azure Synapse environment. When I try to start a debug session, I get the message "Failed to setup debug session". When I trigger the pipeline instead, the task fails with "Operation on target…
Understanding the costs of an Azure Databricks job
I have an Azure Data Factory pipeline that executes multiple Databricks Notebooks using job clusters. I need to track the cost of these job clusters, including both the Databricks and the underlying VM costs, specifically for this set of jobs. Currently,…
ADF Data Flows: Flatten of nested JSON array produces null values
Hi all, I am building a data transformation with an ADF data flow using a nested JSON array of objects, but after parsing and flattening the JSON node itOffer.item.LeadOfer.zdeal.item[].dealNumber I am seeing that the column values are populated as null. I…
Call API with dynamic URL and store JSON results in DataLake Storage
I am trying to implement the following scenario using Data Factory: I am making multiple API calls, relying on a dynamic URL (e.g. "url.com/api/{ID}" for a list of IDs). The resulting JSON from each of the API calls should be stored as a…
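The usual shape of this in ADF is a ForEach over the ID list, with the URL built per iteration via an expression such as @concat('https://url.com/api/', item().id). A minimal local sketch of that pattern, where the base URL, IDs, and sink paths are all illustrative assumptions:

```python
# Sketch of the ForEach-over-IDs pattern: build one request URL and one
# target file path per ID, as the Web/Copy activity would do with a
# dynamic expression. All names below are assumptions for illustration.
base_url = "https://url.com/api/{}"
ids = [101, 102, 103]

calls = [
    {"url": base_url.format(i), "sink_path": f"raw/api/{i}.json"}
    for i in ids
]
for c in calls:
    print(c["url"], "->", c["sink_path"])
```

Each iteration would then land its JSON response at the corresponding Data Lake path, e.g. via a Copy activity with a parameterized sink dataset.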
Script Activity not supporting multi-instruction with new Snowflake connector
I have a pipeline in Azure Data Factory with a Script activity that executes 3 instructions against Snowflake. The script was working fine until I updated the Snowflake connection in order to remove the legacy Snowflake connection…
Choosing an Approach for Incremental Loading with Watermark in Azure Data Factory: Efficiency and Cost Considerations
Hi all, I'm working on implementing an Azure Data Factory pipeline for incremental data loading using a watermark table approach. I have identified two different approaches but am unsure which one is considered the best practice in terms of…
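Whichever variant is chosen, the core of the watermark approach is the same: a Lookup activity reads the last watermark, a Copy activity runs a bounded query, and the watermark is advanced afterwards. A sketch of the bounded-query step, where the table and column names are assumptions:

```python
# Sketch of the query the Copy activity runs in a watermark-based
# incremental load. Table/column names are illustrative assumptions.
def build_incremental_query(table: str, watermark_col: str, last_watermark: str) -> str:
    # The Lookup activity supplies last_watermark from the watermark table;
    # only rows changed since then are copied.
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > '{last_watermark}'"
    )

q = build_incremental_query("dbo.Orders", "ModifiedDate", "2024-01-01T00:00:00Z")
print(q)
```

Cost-wise, the main lever is how many activities run per pipeline execution (each Lookup/Copy/Stored Procedure run is billed), so consolidating the watermark read and update tends to be cheaper than extra orchestration hops.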
How to compare two dates in ADF to see which is greater?
Hi there, I am using Azure Data Factory to compare two dates (one stored in a string variable and one from utcNow()) and I need to know when utcNow() is greater than the date in the stored string variable. Both are just dates, no timestamps. I looked at…
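In ADF expression language this is typically done with greater(), e.g. @greater(formatDateTime(utcNow(), 'yyyy-MM-dd'), variables('storedDate')), because 'yyyy-MM-dd' strings sort lexicographically in date order. A local sketch of that same comparison, where the stored date value is an assumption:

```python
from datetime import datetime, timezone

# Mirrors the ADF expression
#   @greater(formatDateTime(utcNow(), 'yyyy-MM-dd'), variables('storedDate'))
# 'yyyy-MM-dd' strings sort lexicographically in date order, so a plain
# string comparison is equivalent to a date comparison.
stored_date = "2000-01-01"   # assumed value of the string variable
today = datetime.now(timezone.utc).strftime("%Y-%m-%d")

utcnow_is_greater = today > stored_date
print(utcnow_is_greater)
```

This avoids parsing altogether as long as both sides use the same zero-padded date-only format.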
Data flow failing at Sink Step when pipeline is scheduled
I have a pipeline with a data flow where data is sunk into a dataset. When run manually, it works fine and the parquet file is created. However, when scheduled, the pipeline gives the following error; it runs fine manually and only fails on a schedule…
How to convert a block blob to an append blob inside ADLS Gen2? Is there any process or activity that will convert the type of my entire blob without changing its content?
I have a JSON file inside a Data Lake Gen2 storage account, and that JSON file's blob type is block blob. I want to convert that blob type to append blob through any Azure service. I need to know the process in detail.
Except Unique Assert Transformation in Dataflow
I am trying to use the Assert transformation in a data flow to check for duplicates in the source. I am not able to get the correct result when using parameters that may hold single or multiple columns as primary keys (example shared below) with the expression below. …
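For reference, the uniqueness check the Assert transformation performs amounts to flagging rows whose key (one or several columns, like the data flow parameter) appears more than once. A local sketch of that logic, with illustrative rows and column names:

```python
from collections import Counter

# Sketch of an Assert-style uniqueness check over configurable key columns.
# Rows and column names below are illustrative assumptions.
rows = [
    {"id": 1, "region": "EU", "amount": 10},
    {"id": 1, "region": "EU", "amount": 12},   # duplicate on (id, region)
    {"id": 2, "region": "US", "amount": 7},
]
key_columns = ["id", "region"]   # works for single- or multi-column keys

# Count each composite key, then flag every row whose key occurs > 1 time
counts = Counter(tuple(r[c] for c in key_columns) for r in rows)
duplicates = [r for r in rows if counts[tuple(r[c] for c in key_columns)] > 1]
print(len(duplicates))
```

The important detail is that the key must be built as a tuple over all parameterized columns at once; checking each column independently will miss (or over-report) composite-key duplicates.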
Unable to create a cluster in Databricks: getting a quota exceeded error even though everything is new
Azure Quota Exceeded Exception: Error code: QuotaExceeded, error message: Operation could not be completed as it results in exceeding approved standardDADSv5Family Cores quota. Additional details - Deployment Model: Resource Manager, Location:…
Azure HTTP linked service in ADF: username only, no password
I am trying to use the Azure HTTP linked service to connect to an API that uses just a username and no password for basic auth. The password input is mandatory in Azure. Is there anything I can do?