Do we incur extra charges for using Event Hubs Capture to stream data into Cosmos DB?

Rishabh Tiwari
2023-07-28T09:21:50.9566667+00:00

Do we incur extra charges for using Event Hubs Capture to stream data into Cosmos DB? I am using a Standard tier Azure Event Hubs namespace with 2 throughput units in the US East region. I am also capturing data in Parquet, JSON, CSV, and Delta Lake formats in ADLS Gen2. In the Standard tier, Capture is charged at $73 per month per throughput unit. Will streaming to Cosmos DB cost extra on top of the $73 per month per throughput unit?

1 answer

  1. Q&A Assist
    2023-07-28T09:21:50.96+00:00
    This answer was created with AI from Q&A Assist.

    Using Azure Event Hubs Capture to stream data into Cosmos DB does not incur extra charges. Note, however, that in the Standard tier of Azure Event Hubs, traffic is controlled by throughput units, and Event Hubs Capture is charged at $73 per month per throughput unit. In the Premium tier, traffic is controlled by processing units, and the achievable throughput depends on factors such as payload size, partition count, egress request rate, and whether Event Hubs Capture is in use. The Premium tier is recommended for high-end streaming scenarios that require elastic, superior performance with predictable latency.
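
    As a quick sanity check on the numbers in the question, here is a minimal sketch of the monthly Capture charge, assuming the $73 per throughput unit US East rate quoted above (verify against the current Azure pricing page):

    ```python
    # Back-of-the-envelope Event Hubs Capture cost for the setup in the
    # question: Standard tier, 2 throughput units, US East.
    CAPTURE_USD_PER_TU_PER_MONTH = 73.0  # rate quoted above; confirm on the pricing page
    throughput_units = 2

    capture_cost = CAPTURE_USD_PER_TU_PER_MONTH * throughput_units
    print(f"Monthly Capture charge: ${capture_cost:.2f}")  # -> $146.00
    ```

    Streaming the same events into Cosmos DB is billed by Cosmos DB itself (request units and storage), not as an additional Event Hubs charge.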

    Additionally, Event Hubs Capture copies data directly from internal Event Hubs storage, bypassing throughput unit or processing unit egress quotas, so your egress is saved for other processing readers such as Stream Analytics or Spark. Once configured, Event Hubs Capture runs automatically when you send your first event and keeps running. Capture works with any storage account that supports block blobs; the destination account (Azure Storage or Azure Data Lake Storage) must be in the same subscription as the event hub.
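
    Capture itself writes only to Azure Storage or ADLS Gen2; getting events into Cosmos DB is done by a separate reader such as Stream Analytics or your own consumer. Below is a minimal sketch of such a consumer using the azure-eventhub and azure-cosmos Python SDKs; the connection string, key, and database/container names are placeholders you would replace with your own:

    ```python
    import json
    import uuid

    from azure.cosmos import CosmosClient
    from azure.eventhub import EventHubConsumerClient

    # Placeholder resource names and secrets -- substitute your own.
    EVENTHUB_CONN_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "<event-hub-name>"
    COSMOS_URL = "https://<account>.documents.azure.com:443/"
    COSMOS_KEY = "<cosmos-primary-key>"

    cosmos = CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
    container = cosmos.get_database_client("<database>").get_container_client("<container>")

    def on_event(partition_context, event):
        # Parse the event body (assumed JSON here) and upsert it into Cosmos DB.
        doc = json.loads(event.body_as_str())
        doc.setdefault("id", str(uuid.uuid4()))  # Cosmos DB items require an "id"
        container.upsert_item(doc)
        partition_context.update_checkpoint(event)

    consumer = EventHubConsumerClient.from_connection_string(
        EVENTHUB_CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
    )
    with consumer:
        # This reader consumes against the namespace egress quota; Capture,
        # by contrast, bypasses that quota as described above.
        consumer.receive(on_event=on_event, starting_position="-1")
    ```

    Reading with such a consumer draws on the egress capacity of the throughput units you already pay for; the writes themselves are billed by Cosmos DB as request units.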
