In-editor Debugging Telemetry Reference Architecture
This reference architecture focuses on the development phase and a small number of users, gathering data from gameplay sessions and displaying it directly within the game engine, Unreal Engine in this case. It provides the fastest response time, so your development and QA teams don't have to wait for results from testing sessions.
This article describes the architecture used in this sample on GitHub. Keep in mind that the code from this reference architecture is only an example for guidance, and there may be places to optimize it before using it in a production environment.
- Azure Event Hubs - Selected because it's a service tailored for analytics pipelines and is simple to use, with little configuration or management overhead. It can receive and process events in real time.
- Azure Functions - Selected because we want a simple authentication mechanism, as well as the ability to customize how data is processed and queried.
- Azure Cosmos DB - Selected for its ability to scale and store data at a high ingestion rate.
Step by step
- Invoke the Azure Function from the device client, sending the telemetry data.
- Validate and forward that data to the Azure Event Hub.
- The Azure Event Hub triggers a second Azure Function that transforms the data into individual Azure Cosmos DB documents.
- From the Azure Function target, add a new document to the Azure Cosmos DB database with the telemetry data.
- Within the game engine, a query is generated and sent to an Azure Function that converts it into an Azure Cosmos DB query.
- The data is then pulled from Azure Cosmos DB and returned to the game engine for visualization.
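The first step above can be sketched from the client side as follows. This is a minimal Python sketch, not the Unreal Engine client from the sample, and the endpoint URL, route, and payload field names (`clientId`, `eventName`, `properties`) are assumptions to be replaced with the values from your deployment:

```python
import json
import time
import urllib.request

# Hypothetical ingestion endpoint; replace the placeholders with your
# deployed Azure Function App URL and function key.
INGEST_URL = "https://<your-function-app>.azurewebsites.net/api/ingest?code=<function-key>"

def build_telemetry_event(client_id, event_name, **properties):
    """Build a single telemetry event in the shape the ingestion function expects."""
    return {
        "clientId": client_id,
        "eventName": event_name,
        "timestamp": time.time(),
        "properties": properties,
    }

def send_batch(events, url=INGEST_URL):
    """POST a batch of events; the ingestion function returns 202 on acceptance."""
    body = json.dumps({"events": events}).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```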
If you want to visualize the data in a dashboard, connect the Azure Cosmos DB database to Power BI.
Click the following button to deploy the project to your Azure subscription:
This operation will trigger a template deployment of the telemetry_server.deployment.json ARM template file to your Azure subscription, which will create the necessary Azure resources. This may incur charges to your Azure account.
Have a look at the general guidelines documentation that includes a section summarizing the naming rules and restrictions for Azure services.
If you're interested in how the ARM template works, review the Azure Resource Manager template documentation from each of the different services leveraged in this reference architecture:
To run the Azure Functions locally, update the local.settings.json file with these same app settings.
- Validates the incoming telemetry payload
- Transforms the data into the expected format for the next stage of the data pipeline
- Sends the data on to the Azure Event Hub
- Returns 202 if the data was accepted by the Azure Event Hub
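The validation and transformation steps above can be sketched as follows; this is a minimal sketch only, and the payload shape and required field names are assumptions rather than the sample's actual contract:

```python
def validate_and_transform(payload):
    """Validate an incoming telemetry batch and shape it for Event Hubs.

    Returns (events, error): on success, error is None and events is the
    list to forward; on failure, events is None and error describes the problem.
    """
    required = {"clientId", "eventName", "timestamp"}
    if not isinstance(payload, dict) or "events" not in payload:
        return None, "payload must be an object with an 'events' array"
    events = payload["events"]
    if not isinstance(events, list) or not events:
        return None, "'events' must be a non-empty array"
    for i, event in enumerate(events):
        missing = required - set(event)
        if missing:
            return None, f"event {i} is missing fields: {sorted(missing)}"
    return events, None
```

In the real function, a `None` error maps to forwarding the events and returning 202, while any error string maps to a 400 response.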
Event Hub Trigger Function
- Reads the event data payload
- Creates individual Azure Cosmos DB documents for each event
- Uploads the documents to Azure Cosmos DB
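The fan-out into individual documents can be sketched as follows. The `id` field is genuinely required by Azure Cosmos DB, but the rest of the document shape is an assumption:

```python
import uuid

def events_to_documents(event_batch):
    """Fan an Event Hubs batch out into one Cosmos DB document per event.

    Each document gets a unique 'id' (required by Cosmos DB); the original
    event fields, including the client id, are copied through unchanged.
    """
    documents = []
    for event in event_batch:
        doc = dict(event)  # copy so the source event is not mutated
        doc["id"] = str(uuid.uuid4())
        documents.append(doc)
    return documents
```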
- Parses the client generated query
- Generates an Azure Cosmos DB SQL-formatted query
- Wraps the results in a JSON object and returns them to the client
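A sketch of translating a simple client query (field/value equality filters) into a parameterized Azure Cosmos DB SQL query; the sample's actual query language may be richer than this, and the field-name check here stands in for whatever validation the real function performs:

```python
def to_cosmos_sql(client_query):
    """Translate a dict of field/value equality filters into a parameterized
    Cosmos DB SQL query. Values are passed as parameters; field names are
    checked so untrusted input cannot inject SQL into the query text."""
    clauses = []
    parameters = []
    for i, (field, value) in enumerate(sorted(client_query.items())):
        if not field.isidentifier():
            raise ValueError(f"unsupported field name: {field}")
        name = f"@p{i}"
        clauses.append(f"c.{field} = {name}")
        parameters.append({"name": name, "value": value})
    sql = "SELECT * FROM c"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return {"query": sql, "parameters": parameters}
```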
Choosing the right pricing plan for your needs will depend on how much the telemetry service is used and what else is running in the same Azure Function App.
You can automatically expire old data stored in Azure Cosmos DB using its Time To Live (TTL) feature, setting a time horizon after which stored documents are purged.
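With TTL enabled on the container, a per-item `ttl` value (in seconds) controls when each document expires. As a minimal sketch, documents could be stamped before upload; the thirty-day horizon here is only an example value:

```python
THIRTY_DAYS = 30 * 24 * 60 * 60  # Cosmos DB TTL values are expressed in seconds

def with_ttl(document, ttl_seconds=THIRTY_DAYS):
    """Attach a per-item TTL. Cosmos DB purges the item ttl_seconds after its
    last write, provided TTL is enabled on the container."""
    doc = dict(document)  # copy so the input document is not mutated
    doc["ttl"] = ttl_seconds
    return doc
```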
Events sent to the Ingestion Azure Function should be batched on the client to reduce HTTP overhead. Consider using client-side compression if the batches are large.
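A sketch of batching and gzip-compressing events before transmission; the ingestion function would need to decompress payloads with a matching `Content-Encoding` header before validating them:

```python
import gzip
import json

def compress_batch(events):
    """Serialize a batch of events to JSON and gzip it for transmission.
    Send with 'Content-Encoding: gzip' so the receiver knows to decompress."""
    raw = json.dumps({"events": events}).encode("utf-8")
    return gzip.compress(raw)

def decompress_batch(data):
    """Reverse compress_batch on the receiving side."""
    return json.loads(gzip.decompress(data))
```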
Compressing the batches server-side prior to transmission to Event Hubs can help reduce Throughput Unit costs. This is especially helpful if multiple services consume the events, or if there are other event hubs in the same namespace.
Additional resources and samples
- Big data reference architecture and implementation for an online multiplayer game
- Processing 100,000 Events Per Second on Azure Functions
- Reliable Event Processing in Azure Functions (how to avoid losing a message)
- In-order event processing with Azure Functions
Advanced streaming aggregation support
If you are looking for windowing support out-of-the-box, meaning you want to perform set-based computation (aggregation) or other operations over subsets of events that fall within some period of time, then you should consider replacing the Azure Function that connects the Azure Event Hub to Azure Cosmos DB with Azure Stream Analytics.
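To illustrate what such windowing computes, here is a small Python sketch of a tumbling-window count, the kind of fixed, non-overlapping aggregation Azure Stream Analytics provides natively (for example via its `TumblingWindow` function):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per fixed, non-overlapping time window.

    Each event is assigned to the window containing its timestamp; the
    result maps each window's start time to the number of events in it.
    """
    windows = defaultdict(int)
    for event in events:
        start = int(event["timestamp"] // window_seconds) * window_seconds
        windows[start] += 1
    return dict(windows)
```

Stream Analytics expresses the same computation declaratively, and also supports hopping, sliding, and session windows out-of-the-box.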
If you don't have an Azure subscription, create a free account to get started with 12 months of free services. You're not charged for services included for free with the Azure free account unless you exceed the limits of those services. Learn how to check usage through the Azure portal or through the usage file.
You are responsible for the cost of the Azure services used while running these reference architectures. The total amount will vary based on usage. See the pricing webpages for each of the services that were used in the reference architecture:
- Event Hubs pricing
- Azure Functions pricing
- Azure Cosmos DB pricing
- Azure Stream Analytics pricing
- Azure Virtual Machines pricing
You can also use the Azure pricing calculator to configure and estimate the costs for the Azure services that you are planning to use. Prices are estimates and are not intended as actual price quotes. Actual prices may vary depending upon the date of purchase, currency of payment, and type of agreement you enter with Microsoft. Contact a Microsoft sales representative for additional information on pricing.