Hi,
I have a question about triggers in Azure Data Factory.
My scenario is: I have 4k+ databases (one per client, distributed across many servers and elastic pools), and each database has a log table. I want to copy the data from this table to Blob Storage, creating one blob per database.
I have an Azure Function that gets the database, server, and elastic pool names and inserts them into a storage queue in JSON format. My plan was to use that data in the Data Factory pipeline.
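For context, this is roughly the message the function produces for each database, a minimal sketch in Python using only the standard library (the field names match the JSON file shown further down; the function name is my own):

```python
import json

def build_queue_message(database: str, server: str, pool: str) -> str:
    """Serialize one database's metadata the way the function enqueues it."""
    return json.dumps({"Name": database, "Pool": pool, "Server": server})

# One message per database, e.g.:
message = build_queue_message("DatabaseName", "ServerName", "ElasticPoolName")
```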
However, Data Factory does not have a queue trigger, only a storage (blob event) trigger. So instead of inserting the data into a queue, I created a JSON blob file to fire the trigger. But I can't find the right way to make Data Factory read the contents of the file that fired the trigger.
Is there any way to achieve this? That is, to make Data Factory read the contents of the blob and pass it as an object parameter to a pipeline?
Trigger:
Activity:
JSON file:
{
  "Name": "DatabaseName",
  "Pool": "ElasticPoolName",
  "Server": "ServerName"
}
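As far as I can tell, the blob event trigger only exposes the file name and folder path to the pipeline, not the file contents, so the most I can map into pipeline parameters today is something like this (the parameter names are my own):

```json
{
  "parameters": {
    "fileName": "@triggerBody().fileName",
    "folderPath": "@triggerBody().folderPath"
  }
}
```

What I'm missing is how to get from those to the JSON object inside the file.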
Thank you.