Script Activity not supporting multi-instruction with new Snowflake connector
I have a pipeline in Azure Data Factory with a Script activity that executes 3 instructions against Snowflake. The script was working fine until I updated the Snowflake connection in order to remove the legacy Snowflake connection…
Updating an Oracle on-premises table from a Synapse table
Hi, I need to update an Oracle on-premises table from a Synapse table by joining the Oracle table with the Synapse table, avoiding building a staging table on the Oracle platform to copy the Synapse data. It is a problem to obtain the authorization from my…
How can I monitor Azure Data Factory pipeline runtime logs in real time with no latency?
Hi team, I'm trying to retrieve Azure Data Factory pipeline execution logs in real time without any latency. I tried using Log Analytics, but there is latency before the logs become available. How can I get the real-time pipeline details including IN…
How to update and delete a row using Azure Data Factory Change Data Capture
I am exploring the Azure Data Factory CDC feature and trying to perform CDC from one SQL table to another SQL table; the source SQL table has a primary key. Whenever a new row is added to the source table, it gets added to the…
Lookup activity output into a CSV file in ADLS
I'm using a Lookup activity to read all tables from an Oracle database (browsed through SQL Developer) for a particular schema. The query is SELECT TABLE_NAME FROM all_tables WHERE owner = 'Schema_Name'. Here the schema name is dynamic, taking its value from a variable. Now, my…
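Outside ADF, the reshaping this entry asks about can be sketched in plain Python: a Lookup activity returns an array of row objects, and the goal is a CSV file with one line per table name. The row shape and `rows_to_csv` helper below are illustrative assumptions, not ADF APIs.

```python
import csv
import io

# Hypothetical shape of a Lookup activity's output: a list of row objects,
# each carrying the queried column (TABLE_NAME in this case).
lookup_rows = [
    {"TABLE_NAME": "CUSTOMERS"},
    {"TABLE_NAME": "ORDERS"},
    {"TABLE_NAME": "INVOICES"},
]

def rows_to_csv(rows):
    """Serialize Lookup-style rows to CSV text: a header line, then one line per table."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["TABLE_NAME"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(rows_to_csv(lookup_rows))
```

In a pipeline, the equivalent step would be writing this text to a sink dataset pointing at the ADLS container.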
Trying to write an expression for a filter activity in Azure Data Factory
I am currently using 3 filter activities with the condition @contains(item().name,'example') to filter out a certain group of files from the array obtained by Get Metadata. This works fine but is still bringing back 5 files with the…
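The usual fix for this kind of over-matching is to combine the inclusion test with an exclusion test in one condition. A plain-Python sketch of that logic, mirroring an ADF expression like @and(contains(item().name, 'example'), not(contains(item().name, 'archive'))) — the file names and the 'archive' exclusion term here are made up for illustration:

```python
# Sample file names, as Get Metadata might return them (hypothetical values).
files = [
    "example_2024.csv",
    "example_archive.csv",
    "other.csv",
    "example_final.csv",
]

# Keep names containing the inclusion term AND lacking the exclusion term,
# the same AND/NOT combination the ADF expression above encodes.
kept = [f for f in files if "example" in f and "archive" not in f]
print(kept)  # -> ['example_2024.csv', 'example_final.csv']
```

Collapsing the three filter activities into one such compound condition also simplifies the pipeline.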
How do I copy data from a public sharepoint URL to Microsoft Blob Storage?
Hi everyone! I am new to using Azure services. My co-author at Stanford has uploaded several files (in CSV format) to a public link. This link starts as: office365stanford-my.sharepoint.com/personal It has 40 files, ranging from a few hundred MB to 100 GB. I…
Azure Data Factory simple copy file and rename
Hello: I have a simple requirement. I have a couple of files in an Azure Blob folder (receiving folder), and I need to copy them to another Azure Blob folder and rename them. This posting is really close to what I need to do: …
Delta to Parquet mapping data flow resulting in one empty partition of 2
Hi, I've been working on something, but I can't get it to work. I seem to have found the issue, but I'm not able to fix it. I've isolated it in a single run, which I'll explain below: I have a Delta table on ADLSv2 (based on one partition) that I want…
Error using Azure Data Factory to Copy data (Using Upsert) from Azure Blob to Azure SQL database
Hi all, I keep getting this error when I perform an upsert (under the Copy activity): Failure happened on 'Sink' side. 'Type=System.NullReferenceException,Message=Object reference not set to an instance of an…
Bypass custom logic execution in Azure Data Factory copy activity
Hi, I am planning a data refresh project using Azure Data Factory and would like to disable custom logic execution. Is there any way I can do this in the ADF copy activity? Like…
Data flow: toBase64 adds break lines after 76th character in specific ADF location
Hi! Recently we've been getting unexpected outputs from ADF pipelines with no prior changes to the source code. Suddenly, the toBase64 function in data flow activities began to add line breaks (\r\n) after the 76th character of output values. During the issue…
Azure Data Factory pagination using QueryParameters and body field
Hello everyone, I am currently working with a pagination system and facing a challenge with the URL and parameter concatenation. I am trying to append a specific string to the URL retrieved from the first page of an API response. Here's the setup: API…
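The concatenation problem described here is typically choosing the right separator: append with & when the next-page URL already carries a query string, and with ? when it does not. A minimal Python sketch under that assumption — `build_next_url` and the example URLs are hypothetical stand-ins for the real API:

```python
from urllib.parse import urlparse

def build_next_url(next_url, extra_query):
    """Append extra query parameters to a next-page URL,
    using '&' if the URL already has a query string and '?' otherwise."""
    sep = "&" if urlparse(next_url).query else "?"
    return next_url + sep + extra_query

print(build_next_url("https://api.example.com/items?page=2", "pageSize=100"))
# -> https://api.example.com/items?page=2&pageSize=100
```

In ADF's REST pagination rules, the same idea appears as composing the next-request URL from the previous response plus fixed query parameters.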
How to ignore records by applying an auditing field column condition using ADF Data Flows
Hi all, I am building a data transformation using mapping data flows. I have a timestamp field, like TimeStampUpdated, in the target table. I want to look up historical data against the incremental data transformation and ignore the records coming in the…
All ADF dataflow stages disappear after publish
Hello, I added a source to my Dataflow and added several stages to the existing one. I published, and the Dataflow showed an error with the new source. I tried to delete it, but could not. I published again and all my stages disappeared. It is not…
Unable to create a cluster in Databricks getting quota exceeded error while everything is new
Azure Quota Exceeded Exception: Error code: QuotaExceeded, error message: Operation could not be completed as it results in exceeding approved standardDADSv5Family Cores quota. Additional details - Deployment Model: Resource Manager, Location:…
Mapping to polymorphic field in dynamics 365
Hello Team - I am loading data into the Incident table of a Dynamics 365 environment from an Azure SQL DB using Azure Data Factory. The Incident table has lookups to the Account table on AccountId and to Contacts via ContactId. I have populated…
Failed to get access token from your token endpoint. Error message: No MediaTypeFormatter is available to read an object of type 'OAuth2AccessToken' from content with media type 'text/html'
In Azure Data Factory I am trying to connect to a REST API that uses OAuth 2.0 authentication. When supplying Data Factory with all the necessary credentials and testing the connection, I get the error: Failed to get access token from your token…
Issue with Copy Dataverse data into Azure SQL using Synapse Link Template
I am experiencing issues using ADF. I am trying to use the template "Copy Dataverse data into Azure SQL using Synapse Link" to move F&O data from the data lake to Azure…