Hello,
I have an Azure ML Batch Endpoint that submits jobs successfully, and I have parameterized the input dataset. Ideally, I'd now like to do the following:
- Execute the endpoint from Azure Data Factory, passing in a dynamic dataset that lives on Blob Storage.
- Receive the output dataset and consume it in downstream activities in Azure Data Factory.
So far, I've had no luck finding clear documentation that covers this:
- I followed this tutorial, but it seems the Activity in Azure Data Factory can only be used to trigger an experiment on Azure ML, and it doesn't mention passing in datasets from Data Factory.
- I set up my batch endpoint using this tutorial, but the only consumption methods it mentions are invoking the endpoint manually through Azure ML or through the REST API.
Overall, I'd like to know whether my desired solution is feasible. The worst case seems to be writing a Python script that creates the dataset on Azure ML, triggers the batch endpoint pipeline via the REST API, and re-uploads the model output to the desired location in Blob Storage, then running that script in an Execute Batch Activity. Roughly, I'm imagining something like the sketch below.
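This is only a rough sketch of that worst-case script, assuming the azureml-core SDK and a registered blob datastore; the datastore name, paths, and the invoke_batch_endpoint helper are all placeholders:

```python
# Worst-case plan: register blob data as a dataset, trigger the batch
# endpoint, then copy the scored output back to Blob Storage.
from azureml.core import Workspace, Dataset, Datastore

ws = Workspace.from_config()  # reads config.json with subscription/workspace details

# 1. Register the dynamic blob data as an Azure ML dataset
#    ("workspaceblobstore" and the input path are placeholders)
datastore = Datastore.get(ws, "workspaceblobstore")
input_ds = Dataset.File.from_files(path=(datastore, "inputs/2022-01-01/"))
input_ds = input_ds.register(workspace=ws, name="batch-input", create_new_version=True)

# 2. Trigger the batch endpoint via the REST API
#    (invoke_batch_endpoint is a hypothetical helper; see the next snippet)
job = invoke_batch_endpoint(input_ds)

# 3. Once the job finishes, push the downloaded output to the desired blob path
datastore.upload(src_dir="./downloaded_outputs", target_path="outputs/2022-01-01/")
```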
Additionally, I was wondering whether it's possible to get sample code for consuming the batch endpoint through the REST API, like the sample I used for my real-time endpoint. My best guess so far is below.
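This is my best guess at the batch-endpoint REST call, extrapolated from the real-time pattern; the endpoint URI and the request body schema are assumptions I haven't been able to confirm:

```python
# Assumed pattern for invoking a batch endpoint over REST.
import requests
from azure.identity import DefaultAzureCredential

# Batch endpoints use Azure AD auth rather than endpoint keys
token = DefaultAzureCredential().get_token("https://ml.azure.com/.default").token

# Placeholder URI; taken from the endpoint's details page
scoring_uri = "https://<endpoint-name>.<region>.inference.ml.azure.com/jobs"
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

# Assumed body shape: point the endpoint at the registered dataset by name/version
body = {
    "properties": {
        "dataset": {
            "dataInputType": "DatasetVersion",
            "datasetName": "batch-input",
            "datasetVersion": "1",
        }
    }
}

resp = requests.post(scoring_uri, headers=headers, json=body)
resp.raise_for_status()
print(resp.json())  # should return a job/run id that can be polled for completion
```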
Thanks,
Varun.