Hello,
At the moment I am developing an ETL pipeline in Azure Data Factory. The process runs well, except in the steps where there are no files to be processed: there it throws errors. To solve this, I am checking whether there are any files inside the source folders. In most cases I am able to check whether a folder contains files, because the files are inside a blob container of a Storage Account.
But in one case the source files are in a 'Shared folder' of a blob storage, because it is mounted as SFTP. In that case, the only way I found to access the files was by using an "Azure File Storage" linked service.
In the tests I have made, it does not work like "Azure Data Lake Storage Gen2", where using '*.csv' in the file path finds any file inside the folder. Instead it returns this error:
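A workaround I have seen suggested (a sketch only, not tested against this exact setup) is to avoid the wildcard entirely: point the Get Metadata activity at the folder itself and request the `childItems` field, then use a Filter activity to keep only the CSV files. The activity names `GetFileList` and `FilterCsvFiles` below are hypothetical placeholders:

```json
{
  "name": "FilterCsvFiles",
  "type": "Filter",
  "dependsOn": [
    { "activity": "GetFileList", "dependencyConditions": ["Succeeded"] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileList').output.childItems",
      "type": "Expression"
    },
    "condition": {
      "value": "@endswith(item().name, '.csv')",
      "type": "Expression"
    }
  }
}
```

A following If Condition can then check `@greater(length(activity('FilterCsvFiles').output.value), 0)` instead of relying on `exists` with a wildcard path.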
Field 'exists' failed with error: 'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Azure File operation Failed. Path: originfolder/*.csv. ErrorMessage: Error Message: The specifed resource name contains invalid characters. (ErrorCode: 400, Detail: The specifed resource name contains invalid characters., RequestId: ac72b70d-301a-0001-4757-0d6913000000).,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.StorageException,Message=The specifed resource name contains invalid characters.,Source=Microsoft.Azure.Storage.Common,'.
I hope I have made myself clear. Thanks in advance.
EDIT: This is how I get the info:
In the next step I just access the previous step's output.exists.
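For anyone who wants to reason about the folder-listing approach outside of Data Factory, here is a small Python sketch of the same check. It assumes the `childItems` shape that Get Metadata returns (a list of `{"name": ..., "type": ...}` objects); the function name is my own, not part of any Azure SDK:

```python
from fnmatch import fnmatch

def has_matching_files(child_items, pattern="*.csv"):
    """Return True if any listed item is a file matching the pattern.

    child_items mimics the Get Metadata 'childItems' output:
    a list of {"name": ..., "type": "File" | "Folder"} dicts.
    """
    return any(
        item["type"] == "File" and fnmatch(item["name"], pattern)
        for item in child_items
    )

# Example: hypothetical output of a Get Metadata activity
items = [
    {"name": "sales_2023.csv", "type": "File"},
    {"name": "archive", "type": "Folder"},
]
print(has_matching_files(items))  # True
print(has_matching_files([]))     # False
```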




