question

GiampaoloSpagoni-3120 asked AnuragSingh-MSFT commented

compress and write a file in another container with azure blob storage trigger in nodejs

I have to make an API call that takes a compressed file as input. I have a working on-premises example, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Function trigger. The code below works with local files, but I don't know how to do the same with Azure Blob Storage and an Azure Function in Node.js:

const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();

const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');

input.pipe(def).pipe(output);

This code reads a file as a stream, compresses it, and writes another file as a stream.

What I would like to do is read the file every time I upload it to a container in Azure Blob Storage, compress it, and save it to a different container with a different name, then make an API call passing the compressed file saved in the other container as input.

I tried this code for compressing the incoming file:

const fs = require("fs");
const zlib = require('zlib');
const { Readable, Writable } = require('stream');

module.exports = async function (context, myBlob) {
    context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
    // const fin = fs.createReadStream(context.bindingData.blobTrigger);
    const def = zlib.createDeflate();
    const s = Readable.from(myBlob.toString());
    context.log(myBlob);
    context.bindings.outputBlob = s.pipe(def);
};
The problem with this approach is that after the last line of the code,

context.bindings.outputBlob = s.pipe(def)

I don't have the compressed file, whereas if I use this

s.pipe(def).pipe(process.stdout)
I can read the compressed output.

As you can see above, I also tried to use fs.createReadStream(context.bindingData.blobTrigger), which contains the name of the uploaded file together with the container name, but it doesn't work.

Any ideas? Thank you.

azure-functions azure-blob-storage

1 Answer

AnuragSingh-MSFT answered AnuragSingh-MSFT commented

@GiampaoloSpagoni-3120

The JavaScript and Java functions load the entire blob into memory, which can be accessed using context.bindings.&lt;name&gt; (where &lt;name&gt; is the binding name specified in the function.json file).

For more details, please check Azure Blob storage trigger for Azure Functions

Therefore, it does not give you an input stream as you get with fs.createReadStream, or an output stream as you get with fs.createWriteStream.
A simple example that reads the content of a blob and writes it to a different blob would look like the one below (you may include the logic to process the input content so that the processed data gets written to the new blob in the destination container):

 module.exports = async function (context, myBlob) {
     context.log("Function Triggered. ", context.bindingData.blobTrigger);

     var input = context.bindings.myBlob;
     /*
     Process the whole content stored in "input" here (encode/compress etc.).
     In my example, I am uploading a .txt file with some data, therefore input
     has the data contained in that .txt file.
     */

     // This simply writes the content of input to the output blob.
     context.bindings.myOutputBlob = input;
 };

The corresponding bindings in the function.json file would look like below:

 {
   "bindings": [
     {
       "name": "myBlob",
       "type": "blobTrigger",
       "direction": "in",
       "path": "source/{name}",
       "connection": "storageaccountazfuna20e_STORAGE"
     },
     {
       "name": "myOutputBlob",
       "type": "blob",
       "path": "dest/{name}-processed.txt",
       "connection": "storageaccountazfuna20e_STORAGE",
       "direction": "out"
     }
   ]
 }

Here, {name} is the name of the blob that triggered this function.
Please 'Accept as answer' and ‘Upvote’ if it helped so that it can help others in the community looking for help on similar topics.




Thank you AnuragSingh-MSFT for your answer.
The problem is that the zlib library expects a readable stream and a writable stream, so I need to convert the input into a readable stream, pass it to zlib, write the output to a writable stream, and then convert it into a string to write a file in the other container.
It seems that you have to implement your own writable stream by implementing the stream.Writable interface.
Do you have an example of how to convert a string into a readable stream, and how to write to a writable stream as well?
Thank you
GS


@GiampaoloSpagoni-3120
Thank you for the update. Since the string/content is already in memory, a stream is not necessary to work with zlib. The code snippet below uses the deflateSync method from zlib to perform the compression. More details are available here.

 var input = context.bindings.myBlob;

 var inputBuffer = Buffer.from(input);
 var deflatedOutput = zlib.deflateSync(inputBuffer);

 context.bindings.myOutputBlob = deflatedOutput;

I hope it helps.



Thank you AnuragSingh-MSFT,
I'll try your solution and let you know.

GS


Thank you so much for your help,
it works :)

GS
