Dynamic dataset not working in Data flows but does work in Pipeline - Help!

Scott Stauffer 11 Reputation points
2022-07-06T18:22:45.567+00:00

I have an Azure Data Factory pipeline that uses linked services with parameterized properties to pass the table schema name for an Azure SQL DB.
It runs on a Managed Virtual Network integration runtime with interactive authoring disabled.
And it works... But...
It does not seem to work with Mapping Data flows.
I believe the connection string uses SQL authentication.
IP whitelisting is in place, which is what necessitates the Managed Virtual Network.
Is there a workaround or configuration where this would work?
I'm experimenting now, but I'd welcome some insight on what simply isn't supported, or workarounds that make things work.

I should also mention that it's using Azure Key Vault as well. The secret name is dynamic.
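
For context, the linked service definition looks roughly like this (the names and references are placeholders, not my exact setup); the Key Vault secret name comes in as a linked service parameter, and the schema name is passed through dataset parameters:

    {
      "name": "LS_AzureSqlDb_Dynamic",
      "properties": {
        "type": "AzureSqlDatabase",
        "connectVia": {
          "referenceName": "ManagedVnetIR",
          "type": "IntegrationRuntimeReference"
        },
        "parameters": {
          "secretName": { "type": "String" }
        },
        "typeProperties": {
          "connectionString": {
            "type": "AzureKeyVaultSecret",
            "store": {
              "referenceName": "LS_KeyVault",
              "type": "LinkedServiceReference"
            },
            "secretName": "@{linkedService().secretName}"
          }
        }
      }
    }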

This is the error message I get when I test the connection:

Spark job failed: { "text/plain": "{\"runId\":\"[SomeGUID]\",\"sessionId\":\"[SomeGUID]\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"com.microsoft.dataflow.broker.InvalidOperationException: Only one valid authentication should be used for [Some Name]. SQLAuthentication is invalid. One of user/password is missing.\",\"detailedMessage\":\"Failure 2022-07-06 18:55:43.499 failed DebugManager.processJob, run=[SomeGUID], errorMessage=com.microsoft.dataflow.broker.InvalidOperationException: Only one valid authentication should be used for [Some Name]. SQLAuthentication is invalid. One of user/password is missing.\"}}\n" } - RunId:[Some GUID]


1 answer

  1. Scott Stauffer 11 Reputation points
    2022-07-06T21:36:53.917+00:00

    Although the old connection string works for pipelines, I changed some of the parameters and their order, and now it works for Data flows too:
    - changed User= to User ID=
    - added Trusted_Connection=False;
    - moved ApplicationIntent=ReadOnly;Persist Security Info=True to the end
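
    Roughly, the before and after looked like this (server, database, and user values are placeholders; the password actually comes from Key Vault):

        Old (worked in pipelines only):
        Server=myserver.database.windows.net;ApplicationIntent=ReadOnly;Persist Security Info=True;Database=mydb;User=myuser;Password=...

        New (works in Data flows as well):
        Server=myserver.database.windows.net;Database=mydb;User ID=myuser;Password=...;Trusted_Connection=False;ApplicationIntent=ReadOnly;Persist Security Info=True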

    2 people found this answer helpful.