How to Run Grid-based Web Applications and Databases at Scale in the Cloud
Cloud-hosted, grid-based applications
Running grid-based web applications in the cloud is a growing trend. The ability to host grid-based applications and scale them as needed is a very common scenario, and one that I plan to cover in this post.
DevExpress offers one of the most powerful grids in existence. I've used it for many, many years - across web and thick client applications. They now support many other platforms as well.
This post will cover the whole spectrum: downloading data from a government site, loading it into an on-premise SQL Server database, and migrating that on-premise database to SQL Database in Windows Azure. Next, we will add the powerful DevExpress ASP.NET Data Grid, which will read the data from the SQL database in the cloud. As a final exercise, we will migrate the entire web application into the cloud as well, so that we can provide the needed scale.
I've helped a couple of developers get over the hump, and I want to share what I learned with you here today.
DevExpress and Windows Azure can be used to create solutions that support an almost unlimited number of users.
Build or Buy?
As a developer, it is crucial to be able to provide a highly functional grid-based user interface to end-users. Ideally, this grid would behave like Excel, since spreadsheets are one of the best understood user interfaces on the planet. Minimal training is required and it can be very productive to use.
Not trivial to build
Providing grid-based functionality in a browser is very challenging. Writing your own just doesn't make any sense; it can become very complex extremely quickly. To start, there's the whole technical challenge of supporting multiple browsers. Even for a single browser, there are many different versions to support. Second, supporting basic CRUD operations isn't trivial either, as the grid must support the ability to insert, update, and delete data. Third, there is the challenge of good performance and fast scrolling. Imagine a data store that has hundreds of thousands of records that need to be scrolled efficiently. How do you plan to implement paging efficiently? And fourth, it's important that the grid be able to support touch on mobile devices.
- How do you plan to support the moving of columns?
- How do you plan to support sorting and grouping?
- Will your grid be able to create SQL queries, on the fly, depending on what the end user is doing with the grid?
- What about caching - how do you minimize roundtrips between server and client?
- Do you plan to use sprites - placing all images on a single PNG file and having the code pull the appropriate image from that file - to minimize chatter on the wire?
- What about custom HTTP handlers to handle callback errors?
- Do you plan to support themes, such as the Office 2013 theme?
- Are you going to support both Web Forms and MVC?
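Efficient paging alone is a good example of what a grid vendor solves for you. If you were rolling your own against SQL Server 2012 or later, a minimal server-side paging query might look like the sketch below (the table and column names are the ones used later in this post):

```sql
-- Fetch page 3 (rows 101-150) without pulling the whole table to the client
SELECT Title, Release_Year, Locations
FROM dbo.Movies
ORDER BY Release_Year, Title
OFFSET 100 ROWS FETCH NEXT 50 ROWS ONLY;
```

A hand-rolled grid has to generate queries like this dynamically as the user scrolls, sorts, and groups - which is exactly the work you are buying when you buy a grid.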
Let's walk through a simple end to end example.
Starting with data
As the example, I've downloaded movie location data from the City of San Francisco. Choose the CSV format; the data is in comma-separated format.
Before we can import this into a table in SQL Server, we will need to add a column that represents the primary key for the clustered index; the bulk insert command will fail unless we do so. We will insert a column to the left of column A: simply right mouse click on column A within Microsoft Excel.
In cell A1, I simply put in some text to indicate that this will be the fake ID. We can leave the rest of the rows in that column blank.
We will just save over the original file. Take note of the folder since it will be needed in the bulk insert command.
The assumption that I am making is that you have downloaded and installed the latest DevExpress products.
Getting to a database
Since we are going to run this in the cloud, it makes sense to get this data into SQL Database, which is Microsoft's solution for running a relational database in one of its many global data centers.
Start with a local database
One of the easiest ways to get started is to create a database and the corresponding tables using a local, on-premise version of SQL Server. From there we can use some of the tooling inside of SQL Server Management Studio to migrate our database up into the cloud. So start SQL Server Management Studio.
Start by creating a new database. Right mouse click on databases.
Let us call this database SFMovies.
Bring up a query window so that we can issue a create table statement.
Here is a query to create a table called Movies.
```sql
CREATE TABLE [dbo].[Movies](
    MovieRowId [int] IDENTITY(1,1) NOT NULL,
    Title [char](255) NULL,
    Release_Year [char](255) NULL,
    Locations [char](255) NULL,
    Fun_Facts [char](255) NULL,
    Production_Company [char](255) NULL,
    Distributor [char](255) NULL,
    Director [char](255) NULL,
    Writer [char](255) NULL,
    Actor_1 [char](255) NULL,
    Actor_2 [char](255) NULL,
    Actor_3 [char](255) NULL,
    CONSTRAINT [PK_Movies] PRIMARY KEY CLUSTERED (MovieRowId ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
```
There are a few things to notice about the bulk insert command. First, we can issue it inside of a query window in SQL Server Management Studio. Also notice that the FIRSTROW property is set to 2 because we want to skip over the column headings. Since this is a comma-delimited file, you need to specify the field terminator as well, in addition to the newline that acts as the terminator for each row.
```sql
BULK INSERT SFMovies.dbo.Movies
FROM 'C:\Users\Bruno\Downloads\Film_Locations_in_San_Francisco.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```
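After the bulk insert completes, a quick sanity check is worthwhile before moving on; something like the following (using the table created above) confirms the row count and lets you eyeball a few records:

```sql
-- How many rows made it in?
SELECT COUNT(*) AS ImportedRows FROM SFMovies.dbo.Movies;

-- Spot-check the first few rows for column misalignment
SELECT TOP 10 MovieRowId, Title, Release_Year, Locations
FROM SFMovies.dbo.Movies
ORDER BY MovieRowId;
```

Misaligned columns here usually mean a quoted field in the CSV contained an embedded comma, which the simple FIELDTERMINATOR above does not handle.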
One thing to note is that the data is not structured well. The data actually does contain one-to-many relationships, but is stored in SQL Server as one big table. Normalizing the data to multiple tables is left as an exercise to the reader. Because the data is not properly normalized, it will prevent optimal behavior of the underlying data grid.
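As a sketch of what that exercise might look like - the table and column names below are illustrative, not part of the downloaded data set - a movie with many filming locations could be split into two tables:

```sql
-- One possible normalization: one movie row, many location rows
CREATE TABLE [dbo].[Movie](
    MovieId INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Title CHAR(255) NULL,
    Release_Year CHAR(255) NULL
);

CREATE TABLE [dbo].[MovieLocation](
    LocationId INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    MovieId INT NOT NULL REFERENCES [dbo].[Movie](MovieId),
    Location CHAR(255) NULL,
    Fun_Fact CHAR(255) NULL
);
```

A structure like this would let the grid display master-detail views rather than one wide, repetitive table.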
Provisioning a database in the cloud
Before we can export the on premise database to the cloud we need to establish a server. A single server can service many databases. You can provision more servers as needed.
The Windows Azure portal allows you to provision a server. Notice that in the figure below, we can list all the databases. But what we want is to list the servers, because we will migrate a database to a server. We do not need to provision a separate database through the portal you see below; instead, the tooling will provision it for us using the on-premise database.
You can see the name of the server below. You will need the server name as you begin to migrate the database into the cloud. You will also need the login ID and the password that you indicated when you provisioned the server.
You can provision new servers as seen at the bottom of the portal. Simply click the add button below.
Let's turn our attention back to SQL Server Management Studio. Within Object Explorer, we can right mouse click on the database that we wish to migrate, and select Tasks, then Deploy Database to SQL Azure. The product is no longer called SQL Azure and has been rebranded to "SQL Database," so the menu caption below is out of date, but it does what we need it to do.
We will now go through a wizard within SQL Server Management Studio to do the deployment. You will be asked to connect to the server we noted in the portal in an earlier step.
As you can see in the figure below you do need to have your login information to be able to connect to your server. The same information will be needed later, when the web application needs to create a connection to the database.
Once the deployment is complete, you should see the confirmation screen below.
One of the things you need to consider with the database server is that you need to have allowed IP addresses configured. You probably would not want to do what you see below, because it allows any connection to reach the server in the Microsoft data center.
In this sample, the best thing to do is to not allow any IP addresses to connect directly to SQL Server. That is because we will deploy our web app to the cloud and only allow that cloud-based web app to access the database. That's the point of the "allowed services" section in the figure below. But if you wish to connect to your database in the cloud using SQL Server Management Studio, you will need to specify the IP address of the computer that is running SQL Server Management Studio.
Selecting “yes” allows that web application to have access to the server and the underlying database.
You can obtain the needed connection information to the cloud-hosted database directly from the Windows Azure portal. From the dashboard section, you will see a selection called "Show connection strings."
If you click on the SFMovies database from the dashboard section of the portal, you will be able to get a needed connection string.
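The ADO.NET connection string you get back will look something like the sketch below; the server name, login, and password here are placeholders for the values from your own portal:

```
Server=tcp:yourserver.database.windows.net,1433;Database=SFMovies;
User ID=yourlogin@yourserver;Password={your_password};
Trusted_Connection=False;Encrypt=True;
```

Note the tcp: prefix on the server name and the Encrypt=True setting - connections to SQL Database in the data center are made over an encrypted channel.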
Creating the Web Application
We are now ready to create the web application. The assumption is that you have already installed the DevExpress tooling, which embeds itself into the Visual Studio New Project dialog boxes. So the task remaining at this point is to create a web application that leverages a DevExpress grid, which will connect to the database we just created in a Microsoft Data center. The final step in this post will be to migrate this web application into the cloud. At that point, both the database and the web application will be hosted in the Microsoft data center. In addition, a great deal more scale can be leveraged should that be needed.
From Visual Studio, create a DevExpress Template Library.
We selected "web forms" as the template. You could choose MVC as well, if desired. In the lower right corner of the dialog box below you will click on “Create Project.”
The project will get created with all the necessary references specified. What we need to do next is add a blank web form and then use the toolbox to drag a grid control onto it. From there we will connect it to the database and start to set some properties for the grid to meet our functionality requirements.
Name the new web form that we just added.
Within Solution Explorer, you can see the new web form. It is currently empty, and we will now use the toolbox to add a DevExpress grid to it.
When you do this for the first time and select the toolbox, you may have to wait a few moments while the toolbox gets updated; you will see the dialog box below.
There is a very rich set of controls that you can select from. We are interested in the grid view control. Drag the grid view control onto your form.
Now that the grid view control is on the form, we will need to connect it to the database in the cloud, in addition to setting other features that we wish to include.
The “choose data source” option will allow us to connect to the database. Inside the combo box, select “New.”
The data source type is a database, and we will specify an ID for it: DataSourceMovies.
Since this is a new connection click on the button, “New Connection.”
You will need the connection information for the server running in the Microsoft data center. You will need four things:
- The server name, which is preceded by tcp:
- The database on that server
- The login ID
- The password
Click next to “Save the connection.”
For the sake of simplicity, we will set up a grid with the minimum of requirements. There are a vast number of options and capabilities that we will not leverage in this example. What we will do is specify the columns that we are interested in, as well as an ordering sequence for the data itself. We will sort the data by release year, as well as by movie title.
You can see that the wizard will build the SQL statement for you.
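The generated statement will look roughly like the sketch below; the exact column list depends on what you checked in the wizard:

```sql
SELECT [Title], [Release_Year], [Locations], [Production_Company], [Director]
FROM [Movies]
ORDER BY [Release_Year], [Title]
```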
Click “Next,” then “Finish.”
Now you are ready to run the application. But this web application is running locally on the development computer. Ultimately, we want to migrate it to a Microsoft data center. First, let’s validate that it runs locally.
Specify the start page.
Compile the application.
Start to debug it.
Validate that it works. Notice that it is running on localhost, on port 37553.
It's important to note that there are a huge number of capabilities that we are not leveraging in this example. This post is already quite long, so for additional capabilities, visit the DevExpress website.
This dialog box below provides some indication of all these capabilities.
Migrating the app to the cloud
We are now ready to take this locally deployed grid application and migrate it to Windows Azure. There are numerous data centers throughout the world where you can deploy this web application. It makes a lot of sense to deploy it to the same data center as the database, which will minimize latency and maximize performance.
To enable this application to be deployed, you will need to add a Windows Azure cloud service project to the existing grid view project. This can be achieved with a simple right mouse click.
As you can see, there are now two projects within the solution. The first project is the one we built that contains the grid view control. You can think of the second project as the configuration and deployment project.
Although we will not change the instance count, this is the way you can scale your application. If you were to type the number 10 into the instance count field, 10 copies of this web application would run on 10 separate virtual machines running IIS. All 10 of these instances would be load balanced. In addition, they would be spread across separate fault domains and update domains. This means that they could be updated by the cloud service without bringing more than one of them down at a time. It also means that if any one of the 10 crashes, the others sit on separate racks, behind separate routers, minimizing the potential for downtime.
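Behind the scenes, the instance count is just a setting in the cloud service project's ServiceConfiguration.cscfg file. A sketch is below; the service and role names are placeholders, and yours will match your project:

```xml
<ServiceConfiguration serviceName="GridCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- 10 load-balanced IIS instances, spread across fault/update domains -->
    <Instances count="10" />
  </Role>
</ServiceConfiguration>
```

Editing the instance count in the project's properties dialog and editing this file directly accomplish the same thing.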
Before we actually publish this to a Microsoft data center, you can run the application in the Windows Azure emulator. By compiling and running the application you can see that the address contains 127.0.0.1 on port 81. The beauty of the emulator is that you can test your cloud application locally before going to the trouble of deploying it to the cloud and testing it there.
One final step to accomplish before publishing is to set the Copy Local property of each of the DevExpress references to true. This means that the actual DevExpress assemblies will be copied along with the rest of our project when we deploy.
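Setting Copy Local in the Properties window simply writes a Private element into the web project's .csproj file. Each DevExpress reference ends up looking something like the fragment below; the assembly name and version are illustrative, and yours will match your installed release:

```xml
<Reference Include="DevExpress.Web.v13.1">
  <!-- Copy Local = true: ship this assembly with the deployment package -->
  <Private>True</Private>
</Reference>
```

Without this, the cloud virtual machines would have no copy of the DevExpress assemblies, since they are installed in the GAC only on your development machine.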
We are now ready to deploy the application. Right mouse click on the deployment project and select publish. It is very easy to set up the ability to publish.
You can read more about it here:
A publish wizard will now appear allowing you to walk through the publishing process.
You can deploy or publish to an existing virtual machine, or you can create a new one. There are other options here as well. For example, you could choose to enable remote desktop, which gives you the ability to remote into the virtual machine and view the exact configuration and setup.
The Windows Azure cloud service offering is meant to be a platform as a service offering. This means that you are not bound to individual virtual machine names. Rather, you are bound to an entire service, and the concept of individual virtual machines is abstracted away by the service.
I’ve written about platform as a service here:
So this is the end game right here. Both the database and the web application itself are running inside of one of Microsoft's huge mega data centers. Our SQL Server data is now triply replicated, and we also have the ability to scale the web application to meet virtually any demand. There is also a host of other capabilities within the Windows Azure platform that we can leverage but did not cover in this post, and a tremendous amount of customization that can be done with the DevExpress tooling.
With a little more work you can make your application look like this:
So this post covered some powerful concepts. First, we talked about leveraging powerful grid view technologies, and why it makes sense to buy instead of build. We talked about starting with on-premise databases and web applications and then migrating both pieces to the cloud. We talked about the ability to scale the data as well as the web apps. Ultimately, this all boils down to one simple question - how do we provide highly functional web applications that meet any level of demand, while maximizing performance and uptime?