Linked service issue when connecting to on-premises DB2
I am trying to connect a linked service to an on-premises DB2 database using a VM as a middle layer. I have checked telnet connectivity and everything is reachable, but when I use the self-hosted integration runtime and try to connect, I get the attached error. Can anyone help to…
How to parameterize pipeline concurrency (number of parallel runs) in Azure Data Factory
I have two pipelines: a parent pipeline that reads a configuration file (Lookup activity), filters it with a Filter activity, and passes the resulting array (e.g. 50 items) to a ForEach activity that runs a child pipeline to perform per-item work in a predefined number of…
Automating Subscription Creation with Azure Functions
In modern cloud environments, automating routine tasks is key to efficiency and scalability. Azure Functions, a serverless compute service, provides a powerful platform for automating tasks within the Azure ecosystem. One common use case for Azure…
Lookup activity output into a CSV file in ADLS
I'm using a Lookup activity to read all tables from an Oracle SQL Developer application for a particular schema. The query is SELECT TABLE_NAME FROM all_tables WHERE owner = 'Schema_Name'. Here the schema name is dynamic, taking its value from a variable. Now, my…
How to read nested xlsx files in Azure Data Factory
How can we read these types of nested xlsx files in ADF?
Dynamic ADF Data Transformation Challenge
I'm working on an ADF data transformation task using the .NET SDK, with Data Flow for the transformation process. In a CSV file containing three columns (First Name, Last Name, and Age), I aim to merge First Name and Last Name into a…
ADF pipeline with upsert throwing error "The data type text cannot be used as an operand to the UNION, INTERSECT or EXCEPT operators because it is not comparable"
An ADF pipeline writes data from Azure SQL to an on-premises SQL Server with an upsert operation. TEXT fields are part of the target system; even though the TEXT fields are not selected in the select query and are not mandatory fields, the pipeline still throws this error. I cannot change…
Can the Azure Data Factory SAP ODP connector read from cluster tables like PCL2 & PCL4?
Does the Azure Data Factory "SAP CDC" connector use SAP's ODP framework? Can the SAP CDC connector read from SAP cluster tables in an ECC system, specifically the PCL2 & PCL4 cluster tables?
Data Factory pipeline: I want to publish a trigger that I created through an HTTP request. Is this at all possible?
Hello, I have created a Data Factory pipeline using an HTTP REST API call, with help from the link below. https://learn.microsoft.com/en-us/rest/api/datafactory/pipelines/create-or-update?view=rest-datafactory-2018-06-01&tabs=HTTP#code-try-0 …
Azure Data Factory: old Security -> Credential not cleared after deploy
Hi, I was testing with the UserAssignedManagedIdentity credential and created two credentials with the same user-assigned managed identity but different names (UserAssignedManagedIdentity & credentialUAMI). I ran my pipeline to deploy using the dev…
Will the ADF HubSpot dataset be updated to use the HubSpot v3 Owners API?
Is there any plan to update the ADF HubSpot dataset to use the HubSpot v3 Owners API before the v2 Owners API is sunset on August 30, 2024? Currently, the ADF HubSpot dataset relies on the v2 Owners API, but HubSpot has announced the end of life for…
How to store a Set Variable JSON array output into a CSV file in Data Factory
I used a Lookup activity inside a ForEach loop to call a stored procedure, which returns some records as output if the condition fails. I stored that output with a Set Variable activity, and now I need to write those values to a CSV file whose file…
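Outside ADF, the shape of this task is straightforward: each object in the JSON array becomes one CSV row, with the object keys as the header. A minimal Python sketch (the record payload below is a hypothetical example of what such a variable might hold; in ADF itself this is usually done by handing the array to a Copy activity or Data Flow):

```python
import csv
import io

# Hypothetical payload: the kind of JSON array a Set Variable activity
# might hold after collecting stored-procedure output in a ForEach loop.
records = [
    {"id": 1, "status": "failed", "reason": "missing key"},
    {"id": 2, "status": "failed", "reason": "bad date"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=records[0].keys())
writer.writeheader()       # object keys become the CSV header row
writer.writerows(records)  # one CSV row per JSON object

print(buf.getvalue())
```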
How to translate database content with Azure Translator via ADF or a Synapse notebook?
There is an Azure database table. Some of the columns need to be translated from one language to another into additional columns, such as from English to Spanish or Portuguese to English. I am exploring how I can use ADF or a Synapse notebook to…
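From a Synapse notebook, the Translator v3 REST API takes a POST to /translate with api-version=3.0, a target-language parameter, and a JSON body of texts. A minimal sketch of building that request (no network call; the endpoint constant mirrors the public Translator endpoint, and key/region headers are omitted as deployment-specific):

```python
import json

# Public Translator v3 endpoint; authentication headers
# (Ocp-Apim-Subscription-Key, region) are deployment-specific and omitted here.
ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_request(texts, to_lang):
    """Build the URL and JSON body for a Translator v3 /translate call."""
    url = f"{ENDPOINT}?api-version=3.0&to={to_lang}"
    body = json.dumps([{"Text": t} for t in texts])
    return url, body

url, body = build_request(["Hello", "Goodbye"], "es")
print(url)
print(body)
```

The texts would come from the source column values, and the translated strings in the response map back onto the additional target columns.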
How to maintain the same folder structure as the source when sinking the processed file
I have a requirement to process JSON to Parquet on a daily basis. I have folders A, B, and C, and I need to sink the files to another container with the same structure as A, B, C. For example, if I'm processing a file from folder A, it should sink to the output container's folder…
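The core of this requirement is a path mapping: take the file's path relative to the source container root and re-root it under the sink container, swapping the extension. A minimal Python sketch of that mapping (the "input"/"output" root names are hypothetical; in ADF the same idea is usually expressed with dataset parameters fed from the source folder path):

```python
from pathlib import PurePosixPath

# Hypothetical container roots; folders A, B, C live under the source root.
source_root = PurePosixPath("input")
sink_root = PurePosixPath("output")

def sink_path(source_file: str) -> str:
    """Mirror the source folder structure under the sink root,
    swapping the .json extension for .parquet."""
    rel = PurePosixPath(source_file).relative_to(source_root)
    return str(sink_root / rel.with_suffix(".parquet"))

print(sink_path("input/A/2024/04/file1.json"))  # output/A/2024/04/file1.parquet
```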
Error "The DeveloperEmail field is required." when setting up Graph Data Connect
Hello, I'm trying to set up Graph Data Connect using this tutorial: https://github.com/microsoftgraph/dataconnect-solutions/tree/main/solutions/ona/PreRequisites Unfortunately, I cannot finalize the process, because when I click on "Create",…
How to split rows and columns by using ADF?
How can I retrieve data from the following table by using the ADF Lookup activity and get the result shown below? Thanks
Azure Data Factory - Copy Data action from SQL Server to Oracle gives an error like "Character, decimal, and binary parameters cannot have a precision of zero"
I am using Copy Data to move data from SQL Server to Oracle. I have cases where, on the SQL Server side, columns have the data type nvarchar and can hold a null value or an empty string ''. In the cases where I have an empty string, it can't be inserted…
ADF Lookup activity not returning correct datetime value
A pipeline uses a Lookup activity that runs a script against Snowflake, returns the first row of data, and we use it in an update statement. The lookup gets MAX(load_date) from a Snowflake table. In Snowflake the value is "2024-04-25 17:26:01.548…
Converting UTC to EST with daylight saving
I am trying to convert UTC to EST with the following formula, but it does not take daylight saving time into consideration: toString(fromUTC(currentUTC(),'EST'),'yyyyMMddHHmmss') It shows a time one hour behind actual Eastern time. Is there a workaround?
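This behavior is expected: 'EST' names a fixed UTC-5 offset with no daylight-saving rules, while a DST-aware zone such as the IANA ID 'America/New_York' (or, in ADF pipeline expressions, the Windows ID 'Eastern Standard Time' passed to convertFromUtc) shifts to UTC-4 in summer. A minimal Python sketch of the one-hour difference (assuming Python 3.9+ for zoneinfo; the ADF fix is analogous, passing a DST-aware zone name instead of 'EST'):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# A summer instant, when US Eastern time observes daylight saving (UTC-4).
utc = datetime(2024, 7, 1, 12, 0, tzinfo=timezone.utc)

fixed_est = utc.astimezone(timezone(timedelta(hours=-5)))  # what 'EST' means: always UTC-5
eastern = utc.astimezone(ZoneInfo("America/New_York"))     # DST-aware: UTC-4 in July

print(fixed_est.strftime("%Y%m%d%H%M%S"))  # 20240701070000 (one hour behind)
print(eastern.strftime("%Y%m%d%H%M%S"))    # 20240701080000
```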
In ADF, I am not able to merge data from multiple streams that have different columns and data types
Hello, I am not able to merge data from multiple streams in ADF Data Flows. Stream 1) flowlet - only 1 column. Stream 2) flowlet - only 1 column. Stream 3) source columns (54 columns). Stream 4) regular…