question

FranciscoDominguez-7929 asked MartinJaffer-MSFT commented

Query by pipeline run returns an empty response body

Hi,

I'm trying to run an HTTP request to retrieve the data for the activities run in a certain pipeline. The documentation I've been following is this one: https://docs.microsoft.com/en-us/rest/api/datafactory/activity-runs/query-by-pipeline-run

At the top of that document there is a "Try it" button. I ran a certain pipeline in my ADF and filled in the fields as needed. I also set the request body as follows:

 {
  "lastUpdatedAfter": "2021-07-27T10:35:15.805",
  "lastUpdatedBefore": "2021-07-27T10:39:16.805"
 }

That window covers the execution time of that pipeline with a certain margin. But I'm receiving an empty body when executing that request, whether via ADF (using an Azure Function) or via the mentioned button:

 {
   "value": []
 }

The response code was 200 every time, so I don't really know what I'm doing wrong.
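For reference, the request above can be reproduced with a short script. This is only a sketch: the subscription, resource group, factory name, and run ID below are placeholders, and the function just builds the URL and body documented for Activity Runs - Query By Pipeline Run (it does not authenticate or send anything).

```python
import json

# Placeholder values -- substitute your own.
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
FACTORY = "my-data-factory"
RUN_ID = "11111111-1111-1111-1111-111111111111"


def build_query_request(run_id):
    """Build the URL and JSON body for Activity Runs - Query By Pipeline Run."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{FACTORY}"
        f"/pipelineruns/{run_id}/queryActivityruns"
        "?api-version=2018-06-01"
    )
    # The window is optional; an empty body ({}) returns all activity runs.
    body = json.dumps({
        "lastUpdatedAfter": "2021-07-27T10:35:15.805",
        "lastUpdatedBefore": "2021-07-27T10:39:16.805",
    })
    return url, body
```

The resulting URL would then be POSTed with an `Authorization: Bearer <token>` header, which is what the "Try it" button does behind the scenes.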

Any help is appreciated.

Thanks.

azure-data-factory

Hello @FranciscoDominguez-7929 and welcome to Microsoft Q&A.

Is it possible the datetimes you gave were too narrow? When I specify a window but place it too early or too late, I get the same response: empty, with a 200 code.
The datetime window is not strictly required. When I omit the window (an empty body), I get all the activities for that pipeline run.

I find that in practice, since a pipeline run is usually a single instance, the datetime window is mainly useful with a very long-running pipeline.
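One way to avoid the too-narrow-window pitfall is to derive the optional filter from the run's own timestamps plus a safety margin. A minimal sketch; `window_with_margin` is a hypothetical helper, not part of any SDK:

```python
from datetime import datetime, timedelta


def window_with_margin(run_start, run_end, margin_minutes=5):
    """Build a lastUpdated* filter that comfortably brackets a pipeline run."""
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
    return {
        "lastUpdatedAfter": (run_start - timedelta(minutes=margin_minutes)).strftime(fmt),
        "lastUpdatedBefore": (run_end + timedelta(minutes=margin_minutes)).strftime(fmt),
    }
```

Since the window is optional anyway, omitting it entirely (sending `{}`) is the simplest way to rule it out while debugging.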


Hi @MartinJaffer-MSFT, thanks for your answer.

I've tried both. As I said above, I set a generous margin in the body fields and got the same response. If I leave the body empty, I still get the 200 code but an empty body.

Could you please confirm that using the 'Try it' tool there with one of your pipelines outputs the results for all the activities of that pipeline?

One last thing: are there any requirements for the pipeline being queried? My data factory is not published yet. Does that matter when performing the query?

Thank you for your kind answer!

Regards.



1 Answer

MartinJaffer-MSFT answered MartinJaffer-MSFT commented

Yes, the publishing DOES matter, @FranciscoDominguez-7929 .

Only published pipeline runs are "real" to this API. Debug runs are considered separate from triggered runs. If your pipeline has not been published, then I expect you have been using debug runs.

This is equivalent to opening the visual authoring tool, and going to the monitor section. There are separate sections for "pipeline runs" and "debug runs".

Does this make sense?
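For context, a triggered (non-debug) run is started against the published pipeline via the Pipelines - Create Run endpoint, and the `runId` in its response is what the activity-runs query expects. A minimal sketch with placeholder names (it only builds the documented URL, it does not call Azure):

```python
# Placeholder values -- substitute your own.
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
FACTORY = "my-data-factory"


def create_run_url(pipeline_name):
    """URL for POST Pipelines - Create Run; the response carries {"runId": "..."}."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{FACTORY}"
        f"/pipelines/{pipeline_name}/createRun"
        "?api-version=2018-06-01"
    )
```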


Yes, this definitely makes sense @MartinJaffer-MSFT

I actually managed to figure this out before you replied, but it took me a good number of hours. In my opinion the documentation should be a little more specific on matters like this.

After publishing and triggering a certain pipeline, I could use that pipeline's run ID to run the query, and it worked.

Thanks a lot for your help. This is solved.

Regards.


Thank you for letting me know. I think you might be the third person with this confusion. I'll let the document team know this is a trend now.

I think the development team assumed that debug runs were just for testing and development, so nobody would care to look at debug runs historically. As far as I know, debug runs are only triggered by the visual authoring tool, which shows you the output right there, so it didn't make sense to expose debug runs via the public API.

That's my hypothesis anyway. Is there something I am missing, like a use case or another way to start debug runs, @FranciscoDominguez-7929 ?


No, I agree @MartinJaffer-MSFT, there isn't actually any other use case. I'm a beginner with Azure and I was just using it that way. I think it would be enough if it were specified somewhere that the run ID for that (and probably other) API calls MUST come from a published pipeline.

I'm aware that my issue was quite specific and natural for someone who doesn't know Azure very well, but a simple sentence alongside the current one in the docs won't hurt the more experienced developers and will definitely help us noobs ;)

Thank you.
