Reporting Services Load Testing


A common question from consultants and architects working with SQL Server Reporting Services concerns capacity planning: how do you validate that a proposed environment will meet the estimated workload? It is a simple question with a very complex answer, because there are plenty of variables in an enterprise SQL Server Reporting Services deployment.

One of the tools we use in the development team is synthetic load generation with Visual Studio Load Tests. Simulating every detail of an interaction with the server is a daunting task, so we use synthetic workloads: we favor the flexibility of defining different settings in the workload over an accurate 1:1 simulation of what happens in the browser and in the native clients. The load generation is an abstraction that allows us to create workloads on the server in an agile way.

Nothing stops you from creating your own set of load tests using the different tools that Visual Studio and other products offer. However, you might find it challenging due to the complexity of the different APIs and security features you will need to use in order to drive the workload.

Ladies and gentlemen, don’t hold your breath: let me introduce the Reporting Services Load Test project, hosted on GitHub. We welcome you to try it out and contribute to it. This project takes care of many of these challenges, although it is not for the faint of heart, as it has its own set of complexities. We have done our best to document how to use it, all written in the world-acclaimed style of engineering prose, in the project readme.

The project contains two sample workloads that we use in our development cycle: PaginatedLoad, which has a mix of paginated reports only, and MixedLoad, which combines paginated, mobile, and portal tests. You can also use the tutorials How to onboard a new Paginated Reports Scenario and How to onboard a new Mobile Reports Scenario to create your own workloads with your own reports.

In case you need to validate your topology but you don’t have any hardware, the project also contains an Azure ARM template that we use in our development cycle to bootstrap a minimal enterprise environment in Azure with the following machines:

  • Domain Controller
  • SQL Server Engine to host the Reporting Services Catalog
  • SQL Server Engine and AS Tabular to host Data Sources
  • SQL Server installation with public DNS to configure Reporting Services

This project is very close to my heart, as it is something we have had in the team for a long time but never had the chance to share with the community. It was also a challenging project, as the code required a “little bit” of cleanup and changes to make it simpler (it is still complex and has its own personality and quirks, as any good old seasoned project does).

You might be thinking: well, all that is good and fancy, but what can I do with it besides overwhelming my dear Reporting Services environment with requests?

One of the scenarios is validating that the environment you are setting up will be able to support your estimated demand. Remember that the workload is synthetic, so it is not a 1:1 mapping between users and test cases but an approximation. Basically, you can take your existing reports and the usage pattern you expect (from exports and rendering) and create a Load Test with them (see the tutorials and the existing load test samples in the solution).
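To make the “approximation” concrete, here is a rough back-of-envelope sketch of how you might turn an expected usage pattern into a target test rate for a synthetic workload. This is not part of the project; all numbers and operation names are hypothetical placeholders for your own estimates.

```python
# Back-of-envelope sizing: translate an expected usage pattern into a
# target throughput for a synthetic load test.
# All numbers below are hypothetical; substitute your own estimates.

PEAK_USERS = 200                # concurrent users at peak
REPORTS_PER_USER_PER_HOUR = 12  # renders + exports each user triggers

# Expected mix of operations (fractions must sum to 1.0).
mix = {"render_html": 0.70, "export_pdf": 0.20, "export_excel": 0.10}

total_per_hour = PEAK_USERS * REPORTS_PER_USER_PER_HOUR
target_tests_per_sec = total_per_hour / 3600.0

print(f"Target: {target_tests_per_sec:.2f} tests/sec overall")
for op, share in mix.items():
    print(f"  {op:<12} {target_tests_per_sec * share:.2f} tests/sec")
```

A split like this maps naturally onto the test-mix percentages you assign when building the Load Test scenario.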

Then you can run the Load Test against your environment and monitor CPU, memory, and Windows performance counters to figure out where your bottlenecks are. You can combine that with the metrics that Visual Studio Load Test collects for every report and every scenario, such as:

  • Passed Tests / Sec
  • Failed Tests / Sec
  • Avg. Response Time

Among many others. Use them to figure out when your environment can no longer support the load consistently. For example, the image below shows a view of the suite reports: the tests/sec and passed tests increase nicely along with the user load until a bottleneck is reached, and then the avg. response time increases and the number of tests/sec drops dramatically.
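If you export those per-step metrics to a table, spotting that knee can be automated. The following is a minimal sketch, not part of the suite, with made-up sample data and arbitrary thresholds: it flags the first load step where tests/sec stops scaling while average response time jumps.

```python
# Detect the bottleneck "knee" in load-test results: the first step where
# throughput stops growing while average response time climbs sharply.
# The sample data and thresholds are made up for illustration.

steps = [
    # (user_load, tests_per_sec, avg_response_time_sec)
    (25,   8.0, 0.9),
    (50,  16.1, 1.0),
    (75,  23.8, 1.2),
    (100, 24.2, 2.6),   # throughput flat, latency jumping: bottleneck
    (125, 19.5, 5.8),   # throughput now dropping
]

def find_knee(steps, tps_growth_min=1.05, rt_growth_max=1.5):
    """Return the user load of the first step where tests/sec grew by less
    than tps_growth_min x while response time grew by more than rt_growth_max x."""
    for (u0, tps0, rt0), (u1, tps1, rt1) in zip(steps, steps[1:]):
        if tps1 < tps0 * tps_growth_min and rt1 > rt0 * rt_growth_max:
            return u1
    return None

knee = find_knee(steps)
print(f"Bottleneck first visible at user load: {knee}")
```

With the sample data above, the knee is reported at the 100-user step, which matches the pattern described: throughput plateaus while latency climbs, just before tests/sec starts dropping outright.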



This is just one example of the validation you can do with the suite. You can experiment with different workloads, step patterns, and combinations of tests (actions in the portal), and after you set one up, you only have to change the config file to point it at a different server and run the same workload again (either from Visual Studio Load Test on premises or in the cloud with Visual Studio Team Services).


This posting is provided "AS IS" with no warranties, and confers no rights