Hi Team,
For one of my requirements I need to copy data from Snowflake to Cosmos DB. Later I want to expose the data through a REST API on top of Cosmos DB (a built-in feature), but that's for later. Right now my problem is copying the data.
Since the data is simple and does not require much transformation, I thought it should be straightforward to do with ADF. So I plan to use an ADF pipeline, and inside the pipeline a Copy Data activity.
The data in Snowflake (the source) looks like this:

And the data in Cosmos DB should look like this:
```json
{
  "id": "123",
  "String Col": "Some String",
  "Some Ints": [
    123,
    234
  ]
}
```
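For context, here is a rough Python sketch of the reshaping the target document implies, assuming the source table has one row per integer value (the column names `ID`, `STRING_COL`, and `INT_COL` are my guesses, since the source sample didn't come through):

```python
# Hypothetical reshaping from flat Snowflake rows into the Cosmos DB
# document shape shown above. Row and column names are assumptions.
rows = [
    {"ID": "123", "STRING_COL": "Some String", "INT_COL": 123},
    {"ID": "123", "STRING_COL": "Some String", "INT_COL": 234},
]

docs = {}
for r in rows:
    # One document per ID; integers collected into the "Some Ints" array.
    doc = docs.setdefault(r["ID"], {
        "id": r["ID"],
        "String Col": r["STRING_COL"],
        "Some Ints": [],
    })
    doc["Some Ints"].append(r["INT_COL"])

print(list(docs.values()))
# -> [{'id': '123', 'String Col': 'Some String', 'Some Ints': [123, 234]}]
```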
After configuring all the basic steps (creating the linked services, mapping the columns, etc.), when I try to publish the pipeline it gives this error:
> Direct copying data from Snowflake is only supported when sink dataset is DelimitedText, Parquet or JSON with Azure Blob Storage linked service, for other dataset or linked service, please enable staging

Question 1: Is copying directly from Snowflake to Cosmos DB not supported?
As I understand it, the answer to the above question is "not supported directly", and to copy data between these two we need to enable the "staging" option, which means a blob storage account is used as temporary staging storage.
Question 2: Is my above understanding correct?
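For what it's worth, my understanding is that enabling staging means adding something like the following to the Copy activity definition (the linked service name and container path here are placeholders I made up):

```json
{
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "SnowflakeSource" },
    "sink": { "type": "CosmosDbSqlApiSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingBlobStorage",
        "type": "LinkedServiceReference"
      },
      "path": "staging-container"
    }
  }
}
```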
Assuming the answer to question #2 is yes, I went ahead and tried to use an existing blob storage account. That gives me another error:

Question 3: I really do not understand what we should do now. There's no concrete example of how to retrieve the SAS URL for the blob storage. Does this URL need to point to an existing file location?
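From what I can tell from the staged-copy documentation, the SAS URL does not point to an existing file; it points to the blob service (or a container), with the SAS token appended as the query string. A minimal sketch of the URL shape, where the account name, container, and token are placeholders rather than real values:

```python
# Sketch of the SAS URL shape the staging Blob Storage linked service expects.
# All values below are placeholders, not real credentials.
account = "mystorageaccount"      # assumed storage account name
container = "staging-container"   # assumed staging container name
sas_token = "sv=2020-08-04&ss=b&srt=sco&sp=rwdl&se=2024-01-01T00:00:00Z&sig=REDACTED"

# Service-level SAS URL (points at the blob service, not a file):
service_sas_url = f"https://{account}.blob.core.windows.net/?{sas_token}"

# Container-level SAS URL (points at the staging container):
container_sas_url = f"https://{account}.blob.core.windows.net/{container}?{sas_token}"

print(service_sas_url)
print(container_sas_url)
```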
Question 4: Is it worth doing all of this just to copy one simple table? Is this a good solution?
Help is highly appreciated.


[Staged copy from Snowflake](https://docs.microsoft.com/en-us/azure/data-factory/connector-snowflake#staged-copy-from-snowflake)

![110422-image.png](/answers/storage/attachments/110422-image.png)



