Backup Strategies with SQL Server 2005 Analysis Services
By Edward Melomed
Summary: This white paper briefly discusses several issues related to executing the Analysis Services Backup command. You will learn how to choose the correct backup strategy, learn about the various options of the Backup command, and see an example of how to automate Analysis Services backup.
Backup and restore operations are an integral part of the Analysis Services infrastructure. In Microsoft® SQL Server™ 2005 Analysis Services, the backup operation has been completely rebuilt.
Let’s take a look at the following diagram, which shows the interaction between Analysis Services and the backup and data files:
Backup and restore are done by issuing XML for Analysis commands against Analysis Services.
After receiving a Backup command, Analysis Services determines which files need to be packaged into the single backup file. It takes care of the consistency of the database at backup and restore time, and it makes sure that all the objects in the backup file are of the same version.
It is also important to mention the granularity of backup and restore. You back up the entire Analysis Services database into a single file and restore the entire database as well.
The first and easiest place to start looking at the backup feature is by using the backup UI provided through SQL Server Management Studio.
To use the backup UI:
Start SQL Server Management Studio.
Open the connection to the Analysis Services server.
Right-click the database to back up and select Back up.
You will see the Backup Database dialog box as shown in Figure 2. In this dialog box, there are a number of settings for you to configure.
Enter the name and location for the backup file.
If you would like to encrypt your backup file, select Encrypt backup file and provide a password.
You can script your Backup command. Click Script at the top of the dialog box to get a script that looks like the following:
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>FoodMart 2000</DatabaseID>
  </Object>
  <File>FoodMart 2000.abf</File>
</Backup>
Executing this script will back up your database into the FoodMart 2000.abf file, which by default is located in the C:\Program Files\Microsoft SQL Server\MSSQL.1\OLAP\Backup folder.
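To bring the database back from such a file, you issue the matching Restore command. The following is a minimal sketch; the file path and database name shown are placeholders for your own environment:

```xml
<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- Path to the backup file created by the Backup command -->
  <File>C:\Program Files\Microsoft SQL Server\MSSQL.1\OLAP\Backup\FoodMart 2000.abf</File>
  <!-- Name under which the database will be restored -->
  <DatabaseName>FoodMart 2000</DatabaseName>
  <!-- Allow replacing an existing database of the same name -->
  <AllowOverwrite>true</AllowOverwrite>
</Restore>
```

If the backup file was encrypted, the Restore command must also carry the Password element with the password used at backup time.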
DDL for the Backup command looks like the following:
<Backup>
  <Object>object_ref</Object>
  <File>BackupFile</File>
  [<AllowOverwrite>true/false</AllowOverwrite>]
  [<BackupRemotePartitions>true/false</BackupRemotePartitions>]
  [<Locations>
    [<Location>
      <File>BackupFile</File>
      <DataSourceID>DataSourceID</DataSourceID>
    </Location>]
  </Locations>]
  [<ApplyCompression>true/false</ApplyCompression>]
  [<Password>Password</Password>]
</Backup>
By using the Locations argument of the Backup command, you can control whether a single Backup command backs up a database that resides on multiple servers when you are using remote partitions.
As shown in Figure 3, if you have remote partitions on the slave server, by sending the Backup command to the master server and specifying the correct information in the Locations argument, you can make sure that a single Backup command backs up the entire database. This will cause the master and slave servers to create two backup files that constitute the entire backup.
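A Backup command along the following lines sketches this scenario; the database ID, file names, and data source ID here are placeholders, and the data source must be the one that points at the slave server holding the remote partitions:

```xml
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>FoodMart 2000</DatabaseID>
  </Object>
  <!-- Backup file written by the master server -->
  <File>FoodMart 2000.abf</File>
  <BackupRemotePartitions>true</BackupRemotePartitions>
  <Locations>
    <Location>
      <!-- Backup file written for the partitions on the slave server -->
      <File>FoodMart 2000 Remote.abf</File>
      <DataSourceID>SlaveServerDataSource</DataSourceID>
    </Location>
  </Locations>
</Backup>
```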
Backup command options
You can specify whether a backup operation compresses the data files. Usually this does not yield much improvement for partition files, mainly because they are already stored in a compressed format. But for the files associated with dimensions, you should see some space savings, especially if your dimensions contain a lot of strings.
The password option allows you to specify whether or not the backup file will be encrypted. Analysis Services will not store the password you entered. It is the responsibility of the database administrator to keep it in a secure location.
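Both options appear directly in the Backup command. For example (the password shown is, of course, a placeholder):

```xml
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>FoodMart 2000</DatabaseID>
  </Object>
  <File>FoodMart 2000.abf</File>
  <!-- Compress files as they are streamed into the backup file -->
  <ApplyCompression>true</ApplyCompression>
  <!-- Encrypt the backup file; the server does not store this password -->
  <Password>MyStr0ngPassword</Password>
</Backup>
```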
It is important to choose the right strategy for backing up Analysis Services databases. With the wrong strategy, you could waste a lot of backup space by backing up too often, or you might discover that your most recent backup is not recent enough and your data has been lost.
It is important to understand the storage mode of Analysis Server objects.
If all the partitions and dimensions in your database use MOLAP storage mode, as opposed to ROLAP and HOLAP, the data that you import into the Analysis Services database changes only at processing time. Therefore, it makes sense to back up your data right after processing.
If you use ROLAP or HOLAP storage mode for objects in the Analysis Services database, keep in mind that Analysis Services does not actually hold the data; the most recent updates and most recent data are read dynamically from the relational source.
In this case, the most important information for you to back up is not the data that Analysis Services holds, but the object definitions.
If your database is completely ROLAP, you can issue the same Backup command and back up your database into a comparatively small file.
If you use HOLAP, or some kind of proactive caching solution, you will probably be better off scripting the database objects through the interfaces provided by Analysis Management Objects (AMO). A full backup in this case would be overkill: the data you would be backing up is just a cache that can be rebuilt once your database is back online.
This brings us to the case in which you already have a backup of your relational database, and you are completely certain that you will be able to reprocess your Analysis Services database once you restore the relational part of your data warehouse.
In this case, to conserve backup space and the time required to back up, you might consider backing up just the Create statement for your database.
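One way to capture the object definitions without writing AMO code is the DISCOVER_XML_METADATA discover request, which returns the DDL for the requested objects. The following is a sketch of such a request; treat the exact restriction shape as something to verify against the XML for Analysis discover schema for your server version:

```xml
<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>DISCOVER_XML_METADATA</RequestType>
  <Restrictions>
    <RestrictionList>
      <!-- Limit the returned DDL to one database -->
      <DatabaseID>FoodMart 2000</DatabaseID>
    </RestrictionList>
  </Restrictions>
  <Properties />
</Discover>
```

Alternatively, in SQL Server Management Studio you can right-click the database and script it as a Create statement interactively.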
Please note that in this case you will have to completely reprocess your Analysis Services database after recreating it from scratch. Do a test run for your restore procedure, making sure that you can get everything back to normal before you choose such a strategy.
If you are using the writeback feature of Analysis Services, please make sure to back up your relational write-back tables.
If you use the query log to collect statistical information for your database so that you can later optimize query performance by using usage-based optimization, you should also back up your query log table.
Please note that the query log table contains information about all the databases on the server. Restoring the query log table will reset statistics for all databases.
You can use the Database column in the query log table to determine which information is related to your database and back up only that.
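A simple way to preserve just your database's rows is a filtered copy inside the relational database that hosts the log. The sketch below assumes the default table name OlapQueryLog and the MSOLAP_Database column; verify both against the query log settings configured on your server:

```sql
-- Copy only the query log rows that belong to one
-- Analysis Services database into a backup table.
-- Table and column names are the defaults; check the
-- server's query log configuration before relying on them.
SELECT *
INTO   OlapQueryLog_FoodMart2000_Backup
FROM   OlapQueryLog
WHERE  MSOLAP_Database = 'FoodMart 2000';
```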
The backup operation can be performed by Analysis Services server administrators or by administrators of the database that is being backed up.
Backup performance considerations
There are two main operations done by Analysis Services during backup:
Figure out which files are to be streamed into the backup file.
Create the backup file.
The first operation is extremely fast and you should not see any significant processor activity during this phase.
The second part is an extremely I/O-intensive operation.
Analysis Services will start streaming data files into a backup file.
Placing the backup file on a separate drive could improve the performance of the backup operation. This would prevent Read operations from colliding with Write operations.
The backup operation obtains a read lock on the database it operates on. You can still start a processing operation while a backup is running; however, the processing operation will not be able to commit until the backup is finished.
It is possible in certain situations to back up your entire data folder in order to guarantee that you can recover from a disaster.
Some of the customers of SQL Server 2000 Analysis Services used this method to back up their systems.
In SQL Server 2005 Analysis Services, this approach is not recommended unless you are absolutely sure that you have completely analyzed your configuration and worked through all the caveats.
Here is a sample list of caveats:
It is only possible to back up the entire data folder containing all Analysis Services data, and not a single database.
You need to keep track of all the partitions you have placed in different locations on your file system when using the StorageLocation property on the partition.
If you are using remote partitions and will be restoring your data folder on the machine that has a different name, you will need to make sure to adjust all the data sources used to specify locations for your remote partitions.
Automating backup using SQL Server Agent jobs
There are several ways of automating an Analysis Services backup.
Let’s take a look at the automation provided through SQL Server Agent jobs.
To manage a backup using SQL Server Agent jobs:
Right-click the Jobs node under SQL Server Agent in SQL Server Management Studio and select the option to create a new job.
You should see the dialog box in Figure 4.
Name the job in the Name text box.
Click on the Steps node in the left pane, then click on New to create a new step in your job. You will see the dialog box shown in Figure 5.
Name the step and choose the step type SQL Server Analysis Services Command.
Provide the name of the Analysis Services server on which you want to run this command. Then paste the Backup command you will be running into the Command window.
Click OK in the New Job Step dialog box (Figure 5) and then OK in the New Job dialog box (Figure 4).
Go back to SQL Server Management Studio and you will find a new node for the job you just created.
Right-click the new node and start the job.
You should see a progress dialog box and then see your backup created.
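For a job that runs on a schedule, the Backup command in the step will typically need to overwrite the file produced by the previous run, so it should include the AllowOverwrite option, along the following lines:

```xml
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>FoodMart 2000</DatabaseID>
  </Object>
  <File>FoodMart 2000.abf</File>
  <!-- Without this, the job fails once the file already exists -->
  <AllowOverwrite>true</AllowOverwrite>
</Backup>
```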
From this point, you can use the full power of SQL Server Agent to manage the job.
For example, you can add another step to copy the database file into a secure location and so on.
SQL Server Agent also provides the ability to view the execution history of your jobs, along with many other capabilities.
After you have learned all of the options of the Backup command, chosen the right backup strategy, and made sure that your automation scripts are functioning, you should feel a little more confident in your ability to recover from a disaster.