JADB-6958 asked:

Pipeline run failed although the connections tested during creation worked

I built a pipeline with Data Factory, reading data from a MongoDB and storing the data in a Data Lake Storage Gen2. When checking the connection to source and target, both connections worked without any problems. But when checking the "File Path" connection for the sink, I am receiving an authorization error. The connection is set up with a Managed Identity. Moreover, I added the Data Factory resource to the storage account.

When executing the pipeline, the run fails with the following error:

Operation on target Copy_36v failed: Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'Forbidden'. Account: 'xxxxx'. FileSystem: 'mongodbleads'. Path: 'output/data_62........txt'. ErrorCode: 'AuthorizationPermissionMismatch'. Message: 'This request is not authorized to perform this operation using this permission.'.
...
'Forbidden',Source=,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'Forbidden',Source=Microsoft.DataTransfer.ClientLibrary,'

Any help would be appreciated. It seems the Data Factory still cannot fully access the storage account.

Do I manually have to add a Blob Container at my target Data Lake Gen2?


Thanks!

Tags: azure-data-factory, azure-data-lake-storage

Hello @JADB-6958 and thank you for your question.

I have a question. You said:

Do I manually have to add a Blob Container at my target Data Lake Gen2?

Are you trying to create new containers via Data Factory? Are you pointing it at containers that do not exist yet?
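If the container does not exist yet, it can be created up front with the Azure CLI. A sketch using the account and container names from the error message above as placeholders (adjust to your own); this assumes you are logged in with rights on the storage account:

```shell
# Create the sink container ("file system" in ADLS Gen2 terms)
# before the pipeline runs. --auth-mode login uses your Azure AD
# identity rather than an account key.
az storage container create \
  --name mongodbleads \
  --account-name xxxxx \
  --auth-mode login
```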

Also, please let us know whether VaibhavChaudhari's answer solved your issue, or if you need more assistance.




If you are still looking for assistance on this query, please let us know, @JADB-6958

VaibhavChaudhari answered:

Ensure the Storage Blob Data Contributor role is granted to the ADF managed identity. See the second point in the document below:

connector-azure-data-lake-storage
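For reference, granting that role from the command line might look like the following sketch. The factory name `mydatafactory`, resource group `myrg`, and the subscription ID in the scope are placeholders; this assumes the Azure CLI `datafactory` extension is installed and the factory uses a system-assigned identity:

```shell
# Look up the object ID of the factory's system-assigned managed identity.
PRINCIPAL_ID=$(az datafactory show \
  --factory-name mydatafactory --resource-group myrg \
  --query identity.principalId -o tsv)

# Grant Storage Blob Data Contributor on the target storage account.
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/myrg/providers/Microsoft.Storage/storageAccounts/xxxxx"
```

Note that role assignments can take a few minutes to propagate before the pipeline stops returning 403.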


===============================================
If the response helped, do "Accept Answer" and upvote it -- Vaibhav


GurvinderKandhola-1509 answered:

Can anyone please help with this? We are stuck, and this is in production.

We are getting this error message in ADF while writing to the Blob (ADLS Gen2) storage:

"Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'Forbidden'. Account "

This happens while using the "selected networks" option under Networking in the Storage Account. We have granted the Storage Blob Data Contributor role to ADF on the Storage Account. It works fine with "All networks", but not with the private endpoint. We created a private endpoint and approved the request from ADF, which is generated while creating the endpoint in Data Factory.


I have the same problem. Did you resolve the issue? It could help me, thanks.


Hi Alan,

You can refer to this link:

https://docs.microsoft.com/en-us/answers/questions/442909/adf-unable-to-write-to-blob-storage-using-private.html?childToView=458554#comment-458554

While using a private endpoint, we need to use managed identity as the authentication method.
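A sketch of what that linked service might look like, with hypothetical names throughout (`mydatafactory`, `myrg`, `AdlsGen2Sink`, and the storage URL are placeholders; assumes the Azure CLI `datafactory` extension and a managed-VNet integration runtime). With managed identity authentication, the ADLS Gen2 linked service carries no account key or service-principal fields:

```shell
# Linked service definition: only the dfs endpoint URL is given,
# so the factory falls back to its managed identity for auth.
cat > adls_ls.json <<'EOF'
{
  "type": "AzureBlobFS",
  "typeProperties": {
    "url": "https://xxxxx.dfs.core.windows.net"
  },
  "connectVia": {
    "referenceName": "AutoResolveIntegrationRuntime",
    "type": "IntegrationRuntimeReference"
  }
}
EOF

az datafactory linked-service create \
  --factory-name mydatafactory --resource-group myrg \
  --linked-service-name AdlsGen2Sink \
  --properties @adls_ls.json
```

Traffic then flows over the approved managed private endpoint, which is what lets the "selected networks" firewall setting coexist with the copy activity.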
