RajendranPrakash-2765 asked RajendranPrakash-2765 commented

How to add an escape character in a derived column, and keep just one escape character when reading/writing JSON?

This is my expression in derived column:
concat("\\/Date(",toString(toLong(toLong(toTimestamp(concat(toString({SOWING FEMALE}, 'yyyy-MM-dd'), " 00:00:00") ,'yyyy-MM-dd HH:mm:ss')-toTimestamp('1970-01-01 00:00:00.000','yyyy-MM-dd HH:mm:ss.SSS'))/1000l)), '000+0200)\\/')

My expected output is "\/Date(1631577600000+0200)\/"

but when I store the data as JSON, it gets stored as "\\/Date(1631577600000+0200)\\/". How can I keep just one escape character?
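For reference, here is a rough Python sketch of what that derived-column expression computes, assuming {SOWING FEMALE} holds a date such as 2021-09-14 and the fixed +0200 offset (the function name is mine):

```python
import json
from datetime import datetime, timezone

def to_wcf_date(day: str) -> str:
    """Build the \\/Date(<epoch-ms><offset>)\\/ value the expression produces."""
    midnight = datetime.strptime(day, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    seconds = int(midnight.timestamp())                 # whole seconds since 1970-01-01
    return "\\/Date(" + str(seconds) + "000+0200)\\/"   # '000' pads seconds to milliseconds

value = to_wcf_date("2021-09-14")
print(value)               # \/Date(1631577600000+0200)\/  (the desired in-memory string)
print(json.dumps(value))   # "\\/Date(1631577600000+0200)\\/"  (how it lands in a JSON file)
```

The second print shows exactly the doubled form seen in the stored file: the backslash is doubled by the JSON writer, not by the expression.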

azure-data-factory


cooldadtx answered RajendranPrakash-2765 commented

The data is correct. JSON uses '\' as an escape character, so if you want a single '\' in the final data, you have to put '\\' in the JSON text. Otherwise, when the data is read back out of the JSON, it comes out wrong. This is identical to how C++ and C# handle their string literals.

So, "\\/Date" will be read out of the JSON as \/Date. But "\/Date" would either produce an error or potentially just be Date, depending upon the parser.
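This round-trip can be verified with any JSON library; a minimal sketch in Python:

```python
import json

# The JSON text as it sits on disk, with the doubled backslashes.
stored = r'"\\/Date(1631577600000+0200)\\/"'

value = json.loads(stored)
print(value)                        # \/Date(1631577600000+0200)\/  (single backslashes)
assert json.dumps(value) == stored  # writing it back doubles them again
```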


@cooldadtx yeah, I understand your point. I am out of ideas on how to put '\\' in the JSON. I read the stored JSON in the Lookup activity, and I am not sure how to add one more escape character there.

cooldadtx:

To be clear, it isn't your responsibility to put the escape character in the string in most cases. Presumably you're using some function or writer to generate the JSON, and it will handle the escaping for you. So if you pass \blah to the JSON writer, it should automatically convert it to \\blah, and it will do the inverse on a read. The only time you would need to do this manually is if you were writing JSON by hand, which isn't generally recommended.

As an example, if you're using the json expression json('\/Date'), it should return the JSON-formatted string \\/Date.


@cooldadtx Just curious: do you think reading the JSON file which has this input "/Date(1631577600000+0200)/" through a custom activity (Python) might help to pass the input value to the API as-is, i.e. "/Date(1631577600000+0200)/", instead of it being converted to UTC format? I am just checking whether this would work before requesting a batch pool from the client.
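(For what it's worth, Python's standard json module does no date detection at all, so a sketch like the following, with a hypothetical Completed_date field, would hand the value to the API verbatim:)

```python
import json

# Hypothetical file content with the unescaped input value.
raw = '{"Completed_date": "/Date(1631577600000+0200)/"}'

record = json.loads(raw)
# json.loads treats the value as an opaque string: no date parsing,
# no UTC conversion, so it can be forwarded to the API as-is.
print(record["Completed_date"])   # /Date(1631577600000+0200)/
```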

cooldadtx:

The conversion is happening as part of the JSON serializer, so it depends on the serializer. Do you need the offset in the date? If not, set it to +0000 to treat it as UTC (hence no conversion). If you do need it, could you show the logic (or UX) you're using to convert the string you showed initially to JSON?

I'm also wondering about the title of the post. It says reading JSON, but I believe your ADF is writing the JSON to a third-party API, correct?

RajendranPrakash-2765 answered cooldadtx commented

Yes, actually I do not want to add an escape character, but what happened was that my initial input looked like this: "/Date(1631577600000+0200)/". I then pass this to the API. The JSON reader automatically converts it into UTC format before calling the API. That's when I thought to use an escape character, but no luck.


I see. Unfortunately escape characters aren't going to work here. The JSON reader is seeing Date(...) and converting it to UTC and escape characters aren't going to change that.

First, I'd ask why it matters. APIs that support date(time) values should handle the value in any format; they can tell the difference between UTC and timezone-offset values, and it is up to them to determine how to handle it. In JSON the value is actually just a string: JSON doesn't directly support "date"; it is just one of the many possible string values that can be sent.

The workaround is going to depend upon what your API is actually expecting. If it is an older API, it might not understand timezones or UTC at all; in that case, consider getting rid of the Date() wrapper and just passing the raw date value, such as yyyy-MM-dd. If it requires that you use Date(), then drop the offset: the reader won't be able to tell whether it is local or UTC and should just treat it as UTC. Alternatively, you might need to convert to UTC on your side. Again, it depends on what the API expects. In the long term, though, the API should be updated to support dates in any format (UTC or timezone offset), as this is the standard.
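To make those options concrete, here is an illustrative sketch (Python; the variable names are mine) of the candidate formats for the same date:

```python
from datetime import datetime, timezone

d = datetime(2021, 9, 14, tzinfo=timezone.utc)
epoch_ms = int(d.timestamp() * 1000)

plain_date = d.strftime("%Y-%m-%d")        # "2021-09-14": raw date, no Date() wrapper
utc_only = f"/Date({epoch_ms})/"           # no offset: readers should treat it as UTC
with_offset = f"/Date({epoch_ms}+0200)/"   # explicit offset, only if the API requires it
```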


BTW here's an interesting article where someone seems to have been running into a similar issue with ADF and how they solved it.

RajendranPrakash-2765 answered

I agree with all your points. I have already raised those questions with my team. Just as mentioned in the article, the API here is out of my control since it is maintained by the client, so I have to check with them. I will also see if the article gives me a workaround. Thank you very much @cooldadtx. I will post an update.


RajendranPrakash-2765 answered cooldadtx commented

Yes, like I mentioned, I thought adding an escape character would solve this issue, but unfortunately it did not, hence the title of this post. But it's true that the JSON is being written to a third-party API. My original logic is below:

concat("/Date(",toString(toLong(toLong(toTimestamp(concat(toString({date_column}, 'yyyy-MM-dd'), " 00:00:00") ,'yyyy-MM-dd HH:mm:ss')-toTimestamp('1970-01-01 00:00:00.000','yyyy-MM-dd HH:mm:ss.SSS'))/1000l)), '000+0200)/')


Right, I understand the expression you're using to generate the string value, but I'm trying to understand how you're then converting that to JSON as part of passing it to your API. Is this string being set as the requestBody in your ADF Copy activity? Are you using the REST-as-source or REST-as-sink approach here? I'm assuming this is a POST/PUT request and not a GET, as GET requests cannot have a body. Are you using a mapping step to map the data to the JSON body before you make the request? If so, for the specific date field, did you set the type to DateTime or String? If you're currently using DateTime, consider changing it to String and see if the JSON serializer skips the conversion (it should).

RajendranPrakash-2765 answered RajendranPrakash-2765 commented

The screenshots below show the flow. I use a Lookup to read the JSON files and send each row into a ForEach loop which contains a Web activity.


[Attachments: web-activity.png, api-call.png]

Please use the comment button to respond to answers so we can more easily track the discussion.

So you're not even using the REST task, so I believe there should be no JSON conversion happening. But given your screenshot, it looks like you have already created the JSON in an earlier step, and your web request is literally just taking the string value stored in @item() and passing it along. Can you confirm that @item() already contains the mangled JSON? If so, the issue is in the earlier step where the Lookup is getting the JSON, so we need to take a step back and look at how that data is being populated.


Yeah, I had already created the JSON in the data flow before calling the Lookup. I am posting the data flow screenshots that store the JSON file here. If you look at the third screenshot, that's the output from the Lookup activity during debug. The date is not read as a string. When I open my JSON file, I can see the data stored as "Completed_date":"/Date(1620000000000+0200)/", but the output from the Lookup says otherwise.


[Attachments: pipeline.png, dataflow.png, lookup-output.png]
cooldadtx:

Can you post what the ConvertDateTo... task is doing? Is that where the expression you initially posted comes from? I believe there is an option in the transformation task to preview the data; is the data correct in the preview?

Also, to clarify, I assume you confirmed that the actual JSON sent to the API matches what is shown in the Output window you posted. I just want to make sure the issue here isn't just the output view converting the data while the actual API gets the correctly formatted data.

RajendranPrakash-2765 answered

I also tried using the REST API as a sink in the data flow to POST directly to the API, but I could not succeed. Somehow REST API connectivity does not work as a sink in the data flow, though it works in the Web activity. Maybe that's something I need to check with the team.


RajendranPrakash-2765 answered

I also tried the External Call transformation and (Alter row + REST API as sink) in the data flow, but no luck. Either I am doing something wrong or the REST API connectivity does not work.
