question

CloudTexavie-1120 asked:

Real-time streaming to Web App

Is there a recommended way to stream real-time processed data (e.g., from Azure Databricks) to a custom web app deployed with App Service (not Power BI dashboards), with minimum latency? The data streams at 50 Hz.

azure-webapps · azure-databricks


1 Answer

PRADEEPCHEEKATLA-MSFT answered:

Hello @CloudTexavie-1120,

Welcome to the Microsoft Q&A platform.

After researching this, I unfortunately could not find a recommended way to stream real-time data from Azure Databricks to a custom web app deployed with App Service.

Here are a couple of alternatives:

If you are using Spark Structured Streaming (micro-batching), it is probably best to write the processed data to a low-latency key-value/document store such as Cosmos DB; the web app can then read from there. I have not seen other architectures in production.
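As a minimal sketch of that option, assuming the Azure Cosmos DB Spark 3 connector (format `cosmos.oltp`) is attached to the Databricks cluster — the account endpoint, key, database, and container names below are placeholders to replace with your own:

```python
# Sketch: stream processed records from Spark Structured Streaming into
# Cosmos DB. Assumes the azure-cosmos-spark (Spark 3 / "cosmos.oltp")
# connector is installed on the cluster; all account values are placeholders.

# Connector options for the Cosmos DB sink (placeholder values).
cosmos_cfg = {
    "spark.cosmos.accountEndpoint": "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<account-key>",
    "spark.cosmos.database": "telemetry",
    "spark.cosmos.container": "readings",
}

def write_stream_to_cosmos(df, cfg=cosmos_cfg,
                           checkpoint="/tmp/checkpoints/cosmos-sink"):
    """Start a streaming write of `df` (a streaming DataFrame) to Cosmos DB."""
    return (
        df.writeStream
          .format("cosmos.oltp")
          .options(**cfg)
          .option("checkpointLocation", checkpoint)
          .outputMode("append")
          .start()
    )
```

The web app would then poll the container (or subscribe to its change feed) for the latest documents instead of talking to Spark directly.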

If you are using Kafka/Confluent for ingestion, and don't really require a Spark streaming component behind it, the Confluent ecosystem suggests using the HTTP sink connector: https://www.confluent.io/blog/webify-event-streams-using-kafka-connect-http-sink/
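For illustration, an HTTP sink instance submitted to the Kafka Connect REST API looks roughly like the sketch below. The connector class and `http.api.url` property come from the Confluent HTTP Sink Connector; the connector name, topic, broker address, and App Service endpoint are placeholders.

```json
{
  "name": "http-sink-to-webapp",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "tasks.max": "1",
    "topics": "processed-telemetry",
    "http.api.url": "https://<your-app>.azurewebsites.net/api/ingest",
    "confluent.topic.bootstrap.servers": "<broker>:9092"
  }
}
```

Each record on the topic is POSTed to the given endpoint; the web app can then fan it out to browsers over WebSockets or Server-Sent Events.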

One can also use Kafka + Spark streaming, where Spark writes the processed data back to another Kafka topic and the HTTP sink connector picks it up from there.
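That second hop, from Spark back into Kafka, uses Spark's built-in Kafka sink. A minimal sketch — the broker address and topic name are placeholders, and `df` must expose a string or binary `value` column (and optionally `key`):

```python
# Sketch: write a processed streaming DataFrame back to a second Kafka topic,
# from which the HTTP sink connector can forward records to the web app.
# Broker and topic below are placeholders.

kafka_sink_options = {
    "kafka.bootstrap.servers": "<broker>:9092",
    "topic": "processed-telemetry",
}

def write_stream_to_kafka(df, options=kafka_sink_options,
                          checkpoint="/tmp/checkpoints/kafka-sink"):
    """Start a streaming write of `df` to Kafka; `df` needs a `value` column."""
    return (
        df.writeStream
          .format("kafka")
          .options(**options)
          .option("checkpointLocation", checkpoint)
          .start()
    )
```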

That said, all of these options depend on many factors: SLAs, use cases, cost considerations, performance/scalability, ease of operations, etc.

Structured streaming with Azure Databricks into Power BI & Cosmos DB

Stream processing with Azure Databricks

Hope this helps. Do let us know if you have any further queries.


Please don’t forget to Accept Answer and Up-Vote wherever the information provided helps you, this can be beneficial to other community members.



Thank you very much for this answer! There are a few points I would like to clarify:

The web app can then read from there. I have not seen other architectures in production.

Do you mean that it is not very common to use low-latency key-value stores for real-time streaming in production?

If you are using Kafka/Confluent for ingestion, and don't really require a Spark streaming component behind it, the Confluent ecosystem suggests using the HTTP sink connector

Would there be any major downside to processing within Kafka Streams instead of Spark?

One can also use Kafka + Spark streaming, where Spark writes the processed data back to another Kafka topic and the HTTP sink connector picks it up from there.

Our current architecture uses IoT Hub instead of Kafka for message ingestion. Would IoT Hub --> Spark --> Kafka be a viable architecture (since IoT Hub cannot be used as a sink for Databricks)?


Hello @CloudTexavie-1120,

Thanks for the follow-up questions.

We are reaching out to the internal team to get more help on this; I will update you once we hear back from them.

CloudTexavie-1120 replied to PRADEEPCHEEKATLA-MSFT:

Great, thank you !
