question

EvanWellens-2911 asked

Wild card parameters in ADF not working

I have an ADF pipeline copying JSON data in blob storage to MongoDB. It works perfectly if I specify the exact filename in the input parameters, but if I use a wildcard pattern (e.g. `*` or `*.json`) it fails 100% of the time, saying the blob doesn't exist. Is this not supported? From the documentation it seems it is, but I am possibly not using it as designed. The file name changes every day, so I need to pick up the file by some pattern. Any suggestions would be great.

azure-data-factory

More info: here is the error. Operation on target Source_Copy_Copy failed: ErrorCode=UserErrorSourceBlobNotExist,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The required Blob is missing. ContainerName: https://xxxxxxxxxx/eng-blob-source, path: DependencyFiles/Source/*.,Source=Microsoft.DataTransfer.ClientLibrary,'

triggerstar.png (11.1 KiB)
triggerfile.png (18.5 KiB)

1 Answer

KranthiPakala-MSFT answered

Hi @EvanWellens-2911,

Welcome to Microsoft Q&A forum and thanks for posting your query.

Wildcard notation is not supported in pipeline parameters in ADF, which is why you see the error when you try to use a wildcard there. Wildcard naming is supported in the Copy activity's source dataset settings (the wildcard file filter), not through arbitrary parameter values.
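To illustrate, a Copy activity source configured with the wildcard file filter might look roughly like the snippet below. This is a minimal sketch, not the poster's actual pipeline; the folder path is taken from the error message in the thread, and the rest is a plausible shape for a blob-storage JSON source:

```json
{
  "source": {
    "type": "JsonSource",
    "storeSettings": {
      "type": "AzureBlobStorageReadSettings",
      "recursive": true,
      "wildcardFolderPath": "DependencyFiles/Source",
      "wildcardFileName": "*.json"
    }
  }
}
```

The key point is that `wildcardFileName` lives in the source store settings of the Copy activity, so the pattern is evaluated at copy time rather than being passed in as a dataset filename parameter.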

As per my understanding, your source file name changes every day, so you want your pipeline to dynamically process the latest renamed file. Please correct me if I misunderstood your requirement.

If that is the case, then you can utilize the Storage Event trigger feature available in ADF. With event triggers you can run your pipeline when a file is renamed or a new file is uploaded to the source folder. You can pass the source folder path and file name dynamically from the event trigger properties into your pipeline dataset. This way you can avoid the above issue and meet your requirement.
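As a rough sketch (trigger, pipeline, and parameter names here are illustrative placeholders, not from the thread), a blob event trigger can filter on the blob path and map `@triggerBody().folderPath` and `@triggerBody().fileName` to pipeline parameters:

```json
{
  "name": "SourceFileTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/eng-blob-source/blobs/DependencyFiles/Source/",
      "blobPathEndsWith": ".json",
      "events": [ "Microsoft.Storage.BlobCreated" ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyJsonToMongo",
          "type": "PipelineReference"
        },
        "parameters": {
          "sourceFolder": "@triggerBody().folderPath",
          "sourceFile": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

The dataset can then build its file path from the `sourceFolder` and `sourceFile` pipeline parameters, so no wildcard is needed at all: the trigger fires on the newly arrived file and hands its exact name to the pipeline.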

The doc below has a detailed description of Storage Event triggers with a sample:

Hope this info helps. Do let us know if you have any further queries.



Please don’t forget to Accept Answer and Up-Vote wherever the information provided helps you, this can be beneficial to other community members.



Hi @EvanWellens-2911,

Just checking in to see if the above suggestion was helpful. If it answers your query, please do click “Accept Answer” and/or Up-Vote, as it might be beneficial to other community members reading this thread. And, if you have any further query do let us know.


Hi @EvanWellens-2911,

We still have not heard back from you. Just wanted to check if the above suggestion was helpful? If it answers your query, please do click “Accept Answer” and/or Up-Vote, as it might be beneficial to other community members reading this thread. And, if you have any further query do let us know.


Sorry, this must have gotten picked up by spam filters. I don't get it: why does the documentation state wildcards are supported? https://azure.microsoft.com/en-us/updates/data-factory-supports-wildcard-file-filter-for-copy-activity/
