I have a pipeline containing a series of Data Flows. These data flows read data from a Cosmos DB database and write it to a SQL DW (Synapse) instance. However, I am required to configure a staging folder location for these. So I created an ADLS Gen2 storage account, created a container, and gave the data factory's managed identity both the Contributor and Storage Blob Data Owner roles on the storage account to enable control/read/write.
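For reference, this is roughly how the role assignments were created (az CLI sketch; the resource group, storage account, and factory names below are placeholders, not my real ones):

```shell
# Hypothetical names -- substitute your own resource group, storage account, and factory.
RG="my-rg"
STORAGE="mystagingadls"
ADF="my-data-factory"

# Object ID of the data factory's system-assigned managed identity
# (managed identities show up as service principals in Azure AD)
PRINCIPAL_ID=$(az ad sp list --display-name "$ADF" --query "[0].id" -o tsv)

# Resource ID of the storage account, used as the role-assignment scope
SCOPE=$(az storage account show -n "$STORAGE" -g "$RG" --query id -o tsv)

# Grant both roles on the storage account
az role assignment create --assignee "$PRINCIPAL_ID" --role "Contributor" --scope "$SCOPE"
az role assignment create --assignee "$PRINCIPAL_ID" --role "Storage Blob Data Owner" --scope "$SCOPE"
```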
I then assigned the staging folder in the settings of the pipeline's Data Flow activity and tested it, which seems to work:
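The staging configuration on the Execute Data Flow activity looks roughly like this in the pipeline JSON (a sketch; the data flow, linked service, and container names are placeholders):

```json
"typeProperties": {
    "dataFlow": {
        "referenceName": "CosmosToDW",
        "type": "DataFlowReference"
    },
    "staging": {
        "linkedService": {
            "referenceName": "ADLSGen2Staging",
            "type": "LinkedServiceReference"
        },
        "folderPath": "staging-container/stage"
    }
}
```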

However, when I run the pipeline via a schedule trigger, the executions fail. They fail on the write-to-sink task, claiming there is a missing credential:
What am I doing wrong here? This looks like it should work. Everything is located in Australia East.
Edit: to clarify, this is trying to write to a Synapse DW. These are the role assignments on the storage account: