VictorGarciaExposito-5884 asked · SumanthMarigowda-MSFT commented

Problem with azure blob storage encoding when uploading a file

Hi! I'm uploading files to Azure Blob Storage with the .NET package, specifying the encoding ISO-8859-1. The stream looks fine in memory, but once uploaded to Blob Storage the file ends up with corrupted characters, as if they could not be converted to that encoding. It seems the file gets stored in a corrupted state: when I download it again and inspect it, the characters are all messed up. Here is the code I'm using:
public static async Task&lt;bool&gt; UploadFileFromStream(this CloudStorageAccount account, string containerName, string destBlobPath, string fileName, Stream stream, Encoding encoding)
{
    if (account is null) throw new ArgumentNullException(nameof(account));
    if (string.IsNullOrEmpty(containerName)) throw new ArgumentException("Container name is required", nameof(containerName));
    if (string.IsNullOrEmpty(destBlobPath)) throw new ArgumentException("Destination blob path is required", nameof(destBlobPath));
    if (stream is null) throw new ArgumentNullException(nameof(stream));

    stream.Position = 0;
    CloudBlockBlob blob = GetBlob(account, containerName, $"{destBlobPath}/{fileName}");
    blob.Properties.ContentType = FileUtils.GetFileContentType(fileName);

    using var reader = new StreamReader(stream, encoding);
    var ct = await reader.ReadToEndAsync();
    await blob.UploadTextAsync(ct, encoding ?? Encoding.UTF8, AccessCondition.GenerateEmptyCondition(), new BlobRequestOptions(), new OperationContext());
    return true;
}
This is the file just before uploading it (see attached screenshot).

And this is the file after uploading (see attached screenshot).


Any help is highly appreciated. Thanks in advance!


@VictorGarciaExposito-5884 Firstly, apologies for the delay in responding here! I am checking on this thread!
In the meanwhile, I see you have posted a similar query on the Stack Overflow forum. Have you referred to the suggestion mentioned in that thread?


1 Answer

SumanthMarigowda-MSFT answered · SumanthMarigowda-MSFT commented

@VictorGarciaExposito-5884 Are you using the .NET SDK? If so, is it the current SDK (Azure.Storage.Blobs) or the legacy SDK (WindowsAzure.Storage), and which version?
Looking at the code snippet, it appears you are on Microsoft.WindowsAzure.Storage.Blob (&lt;= v9.x) or Microsoft.Azure.Storage.Blob (v10.x/v11.x). We highly recommend upgrading to Azure.Storage.Blobs (v12.x) if possible, but this is not a simple upgrade, as the API is entirely different.

There's a lot of encoding code in the snippet you gave, and the corrupted character in your example is one where ISO-8859-1 does not round-trip cleanly through encodings like UTF-8 (which is what the code uploads as if the Encoding is not provided). You also do not show how the file is downloaded to compare the data. There's a lot of room for the characters above 0x7F to get corrupted before the data even reaches the storage SDK.
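To illustrate the failure mode (independent of the storage SDK), here is a small sketch showing how a character above 0x7F, such as 'ñ', gets mangled when ISO-8859-1 bytes are decoded as UTF-8 or vice versa; the sample string is mine, not from your file:

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        // "año" contains 'ñ' (0xF1 in ISO-8859-1), a character above 0x7F.
        string original = "año";
        Encoding latin1 = Encoding.GetEncoding("ISO-8859-1");

        byte[] latin1Bytes = latin1.GetBytes(original);      // 3 bytes: 61 F1 6F
        byte[] utf8Bytes = Encoding.UTF8.GetBytes(original); // 4 bytes: 'ñ' becomes C3 B1

        // Decoding ISO-8859-1 bytes as UTF-8 corrupts the character
        // (0xF1 is not a valid UTF-8 sequence here, so it is replaced):
        Console.WriteLine(Encoding.UTF8.GetString(latin1Bytes));

        // Decoding UTF-8 bytes as ISO-8859-1 produces mojibake ("año"):
        Console.WriteLine(latin1.GetString(utf8Bytes));
    }
}
```

Any one of these mismatches between the reader, the upload, and the download step would produce the kind of corruption shown in your screenshots.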

I'm not sure why the code reads the file into a string first. You can upload the raw file-stream bytes directly and not have to worry about any encoding transformations. Doing so can sidestep the debugging work entirely if the string conversion is unnecessary.
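A minimal sketch of that approach, assuming the legacy SDK and the same GetBlob/FileUtils helpers from your snippet (not shown in the question, so their behavior is assumed):

```csharp
public static async Task<bool> UploadFileFromStream(this CloudStorageAccount account,
    string containerName, string destBlobPath, string fileName, Stream stream)
{
    if (account is null) throw new ArgumentNullException(nameof(account));
    if (stream is null) throw new ArgumentNullException(nameof(stream));

    stream.Position = 0;
    // GetBlob and FileUtils.GetFileContentType are the helpers from the question.
    CloudBlockBlob blob = GetBlob(account, containerName, $"{destBlobPath}/{fileName}");
    blob.Properties.ContentType = FileUtils.GetFileContentType(fileName);

    // Upload the bytes exactly as they are in the stream. No text decoding or
    // re-encoding occurs, so ISO-8859-1 content is stored byte-for-byte.
    await blob.UploadFromStreamAsync(stream);
    return true;
}
```

Because no StreamReader or string conversion is involved, the bytes you download are the bytes you uploaded, regardless of the file's encoding.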

Please let us know if you have any further queries. I’m happy to assist you further.

Please do not forget to "Accept Answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.
