Error: 11402 - The value of the property '' is invalid: 'The remote name could not be resolved:
I am setting up a linked service for OData in ADF. For the integration runtime I am selecting "AutoResolveIntegrationRuntime", but the connection to the OData resource is failing: The value of the property '' is invalid: 'The remote name could not be…
Failed to save PrivateLinkServiceMSSQL. Error: Invalid resource request. Resource type: 'ManagedPrivateEndpoint', Resource name: 'PrivateLinkServiceMSSQL' 'Error: Invalid payload'.
Hello Team, I have created the below-mentioned Azure resources: standard load balancer: Test_MS_SQL; health probe: MSQL; load balancer rule: OnPremisesSQL. While creating a Private Link service, I am getting an error…
V2 Snowflake connector: How to copy data into non-uppercase database objects?
I just switched to the new Snowflake V2 connector in ADF but ran into problems when trying to copy data into a table that has a non-uppercase name. To give some context: I'm using a single dataset for my connection to Snowflake and a parameterized schema…
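For context, Snowflake folds unquoted identifiers to uppercase, so mixed-case object names only resolve when they are passed with their exact case (and quoted on the Snowflake side). A minimal sketch of a parameterized Snowflake V2 dataset where the caller supplies the exact-case names — the dataset name, parameter names, and linked service name here are illustrative, and whether the connector quotes the identifiers itself is worth verifying against the connector documentation:

```json
{
    "name": "SnowflakeDynamicTable",
    "properties": {
        "type": "SnowflakeV2Table",
        "linkedServiceName": { "referenceName": "SnowflakeLS", "type": "LinkedServiceReference" },
        "parameters": {
            "SchemaName": { "type": "string" },
            "TableName": { "type": "string" }
        },
        "typeProperties": {
            "schema": { "value": "@dataset().SchemaName", "type": "Expression" },
            "table": { "value": "@dataset().TableName", "type": "Expression" }
        }
    }
}
```

Passing `"MySchema"` / `"MyTable"` (exact case) rather than relying on Snowflake's default uppercase folding is usually the first thing to check.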
How do I use the Script activity in ADF so that it uses an Azure Databricks SQL warehouse?
I want to be able to use ADF Script activity to execute SQL statements on the Azure Databricks SQL warehouses (including the serverless kind). https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-script Azure Databricks SQL…
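The Script activity only supports a fixed list of linked service types, which may not include Databricks SQL warehouses; one commonly suggested workaround is a Web activity calling the Databricks SQL Statement Execution API instead. A hedged sketch, where the workspace host, token, and warehouse ID are placeholders:

```json
{
    "name": "RunSqlOnWarehouse",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://<workspace-host>/api/2.0/sql/statements",
        "method": "POST",
        "headers": { "Authorization": "Bearer <databricks-token>" },
        "body": {
            "warehouse_id": "<warehouse-id>",
            "statement": "SELECT current_date()",
            "wait_timeout": "30s"
        }
    }
}
```

For statements that outlast `wait_timeout`, the API returns a statement ID that can be polled from a follow-up Web activity.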
Slow Data Pipeline Performance - ADF Data Flow to Azure SQL Database
I'm having some performance issues with an Azure Data Factory (ADF) data flow pipeline. The pipeline is designed to move data from a Parquet file and insert/update it into an Azure SQL database table. The data volume is moderate, with batches of 50,000…
How to pass error details from Snowflake stored procedure to ADF script activity using the new Snowflake connector?
Since switching to the new Snowflake connector in ADF, error messages from Snowflake stored procedures no longer seem to be passed back to the calling Script activity as before. Is this intended behaviour, e.g. is there a recommended process for…
How to send parallel REST API requests through the Copy data activity in ADF with the AbsoluteUrl pagination rule
The pipeline flow is to send request to rest-api and load the data to a json file in ADLS2. Due to huge data requests, there is a delay in the Copy task completion. To enhance the performance of the ADF pipeline, I wanted to send 4 requests in parallel…
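AbsoluteUrl pagination is inherently sequential within a single Copy activity (each page's URL comes from the previous response), so parallelism generally has to come from partitioning the requests and running several Copy activities at once via a non-sequential ForEach. A sketch with illustrative dataset and parameter names — the assumption is that the API accepts some partitioning parameter (date range, offset, etc.) that can be passed as a relative URL:

```json
{
    "name": "CopyPartitionsInParallel",
    "type": "ForEach",
    "typeProperties": {
        "isSequential": false,
        "batchCount": 4,
        "items": { "value": "@pipeline().parameters.partitions", "type": "Expression" },
        "activities": [
            {
                "name": "CopyOnePartition",
                "type": "Copy",
                "inputs": [ {
                    "referenceName": "RestSourceDataset",
                    "type": "DatasetReference",
                    "parameters": { "relativeUrl": "@item().relativeUrl" }
                } ],
                "outputs": [ { "referenceName": "AdlsJsonSink", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "RestSource" },
                    "sink": { "type": "JsonSink" }
                }
            }
        ]
    }
}
```

Each partition can still use AbsoluteUrl pagination internally; the parallelism is across partitions, not across pages of one request.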
Oracle Service Cloud error when setting up ADF linked service
We have already set up a VPN tunnel and all the necessary network configuration to Oracle Service Cloud. I created a linked service to Oracle Service Cloud, but when I test it I get this error: ERROR [HY000] [Microsoft][OSvC] (20) Error while…
ADF Lookup activity not returning correct datetime value
A pipeline uses a Lookup activity to run a script against Snowflake, returns the first row of data, and we use it in an update statement. The lookup gets MAX(load_date) from a Snowflake table. In Snowflake it is "2024-04-25 17:26:01.548…
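One commonly suggested workaround when a Lookup reformats or truncates datetime values is to cast the value to a string inside the query, so ADF only ever sees text. A sketch using Snowflake's `TO_VARCHAR` with an explicit format (the dataset name, table, and source type are illustrative):

```json
{
    "name": "LookupMaxLoadDate",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "SnowflakeV2Source",
            "query": "SELECT TO_VARCHAR(MAX(load_date), 'YYYY-MM-DD HH24:MI:SS.FF3') AS max_load_date FROM my_table"
        },
        "dataset": { "referenceName": "SnowflakeDataset", "type": "DatasetReference" },
        "firstRowOnly": true
    }
}
```

Downstream activities then reference `@activity('LookupMaxLoadDate').output.firstRow.MAX_LOAD_DATE` as a plain string, with millisecond precision preserved.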
Data Factory auto create table in Copy activity doesn't seem to work, or isn't very useful
Hi there, I'm trying to create copy activities where the source table is replicated into the sink database, and the table is created according to what is in the source. I know there is the "Auto create table" option when making the copy…
ADF | ADB Activity Execution Time on Job Clusters
Has anyone noticed ADB notebooks (on job clusters) running faster in ADF? We have sequential notebook activities and are seeing cluster start-up times as low as 2 minutes.
Invalid property name error in ADF Copy Data Activity
We have created pipelines for copying data from SharePoint. We have configured our Copy data activity to store the file along with some metadata, like author name, title, link, last modified, etc. Now for some files I'm getting the below…
How to fix ErrorCode=UserErrorInvalidPluginType,'Type=Microsoft.DataTransfer.Common.Shared.PluginNotRegisteredException,Message=Invalid type 'GoogleBigQueryV2' in Azure Data Factory
I am trying to copy data from Google Cloud BigQuery datasets to Azure Blob Storage as parquet files. I have followed this documentation to set up the Linked Service to Google Cloud. The connection to BQ is successful. Then, I created a Dataset in ADF…
SAP CDC and SAP Tables connectors regarding SAP Note 3255746?
Dear Team, has anyone gone through SAP Note 3255746? What impact should we expect regarding the SAP CDC and SAP Tables connectors already implemented for our customers? Thanks for your input. Tarik
'Failed to execute script. Exception: Odbc Operation Failed.' error in Script activity in ADF
Hi Team, we have a few pipelines in ADF where we fetch data from multiple API endpoints using the Copy activity. We connect to the APIs using a 'REST API' linked service with 'AutoResolveIntegrationRuntime' as the integration runtime (IR). We are…
How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. The SharePoint drive has around 60k files, but through our existing pipeline we are getting only 600 files. We have a web activity with the below…
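A likely cause (assuming the web activity calls the Microsoft Graph `children` endpoint) is that only the first page of results is being read: Graph paginates large listings and returns the next page's URL in `@odata.nextLink`. A hedged sketch of an Until loop that follows the next link until it is empty — the variable and activity names are illustrative, and `nextLink` would be initialized to the first request URL:

```json
{
    "name": "PageThroughDrive",
    "type": "Until",
    "typeProperties": {
        "expression": { "value": "@empty(variables('nextLink'))", "type": "Expression" },
        "activities": [
            {
                "name": "GetPage",
                "type": "WebActivity",
                "typeProperties": {
                    "url": { "value": "@variables('nextLink')", "type": "Expression" },
                    "method": "GET"
                }
            },
            {
                "name": "SetNextLink",
                "type": "SetVariable",
                "dependsOn": [ { "activity": "GetPage", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "variableName": "nextLink",
                    "value": { "value": "@coalesce(activity('GetPage').output['@odata.nextLink'], '')", "type": "Expression" }
                }
            }
        ]
    }
}
```

Each iteration's `value` array would be appended to an accumulator variable (or processed directly) before the loop moves on.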
How to store an array in a JSON file in ADF
I have a web activity with the URL drive/drive-id/items/folder-id/children, then a Filter activity that keeps only the folders, and then I append them to an array. Now I want to store all the folder IDs in a JSON file, and later I will load that…
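Since the Copy activity cannot take a pipeline variable as its source, one commonly used workaround is a Web activity that PUTs the serialized array straight to blob storage via the storage REST API. A sketch assuming the data factory's managed identity has a blob-contributor role on the account; the account, container, and file names are placeholders:

```json
{
    "name": "WriteArrayToBlob",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://<account>.blob.core.windows.net/<container>/folder-ids.json",
        "method": "PUT",
        "headers": {
            "x-ms-blob-type": "BlockBlob",
            "x-ms-version": "2021-08-06",
            "Content-Type": "application/json"
        },
        "body": { "value": "@string(variables('folderIds'))", "type": "Expression" },
        "authentication": { "type": "MSI", "resource": "https://storage.azure.com/" }
    }
}
```

`@string()` serializes the array variable to JSON text, so the written file can later be read back with a JSON dataset.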
Need help in understanding times in ADF's tumbling window trigger
Hi Team, I'm confused by the many timestamps involved in the tumbling window trigger. There are at least four: @trigger().outputs.windowStartTime, @trigger().outputs.windowEndTime, @trigger().scheduledTime, @trigger().startTime. Besides, there…
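As a rough guide: `windowStartTime` and `windowEndTime` bound the data interval the run is responsible for; `scheduledTime` is when the window was due to fire; and `startTime` is when the run actually started, which can lag well behind the window during catch-up or backfill. The window bounds are the ones normally wired into the pipeline, as in this sketch (trigger and pipeline names are illustrative):

```json
{
    "name": "DailyTumblingTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Hour",
            "interval": 24,
            "startTime": "2024-01-01T00:00:00Z"
        },
        "pipeline": {
            "pipelineReference": { "referenceName": "LoadPipeline", "type": "PipelineReference" },
            "parameters": {
                "windowStart": "@trigger().outputs.windowStartTime",
                "windowEnd": "@trigger().outputs.windowEndTime"
            }
        }
    }
}
```

Using the window bounds (rather than the actual start time) keeps reruns and backfills deterministic, since they depend only on the window, not on when it executed.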
How to resolve invalid property name error in ADF copy activity?
ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'BadRequest'. Account: 'rmsdatalakestracc1mqa'. FileSystem:…
Can use a dataset in a pipeline but not in a data flow in Azure Data Factory
I created a dataset for Azure SQL and it works fine in any pipeline in Azure Data Factory. However, it gives a connection error from data flows - see below. The Azure SQL server is configured to be accessible from any Azure service. Spark…