Data Joins The Team

Introducing Visual Studio 2005 Team Edition for Database Professionals

Brian A. Randell

This article discusses:
  • Source control and change management for databases
  • Working with new and existing databases
  • Unit testing stored procedures, functions, and triggers
  • Comparing databases
This article uses the following technologies:
Visual Studio 2005 Team System


Managing Database Versions
Getting Started
The Import Process
Editing Schema and Objects
Data Generation
Unit Testing
Comparing Databases
Build and Deployment

Database development has gotten easier, thanks to Visual Studio® 2005 Team Edition for Database Professionals (or DB Pro edition, for short). If you need to create, manage, or build solutions that connect to SQL Server™, this tool is for you.

DB Pro edition provides a number of features to make managing databases easier. As a member of Visual Studio 2005 Team Suite, DB Pro edition works in conjunction with Team Foundation Server's source code control feature to manage database changes just as you manage your source code. DB Pro edition supports sandbox development of databases, and it includes tools for database comparison, data generation, unit testing, and database deployment. But what is most compelling about DB Pro edition is using all of its features together as part of a holistic, database development life cycle.

There are a few things that should be clarified before we dive into the article. First, the initial release of DB Pro edition will support SQL Server 2000 and SQL Server 2005 relational databases, but it will offer no Analysis Services support. Support for other Microsoft and non-Microsoft databases is planned for future versions via a provider-based model, but it is not available as of this writing. Second, using DB Pro edition in conjunction with Team Foundation Server enhances many of the DB Pro edition features; however, Team Foundation Server is not a prerequisite for using DB Pro edition.

Managing Database Versions

To build a successful database-driven application, it is essential that you know which copy of the database is the "true" version. The problem is that making changes and testing against a live system is fraught with peril. In fact, typically only the database administrator (DBA) has the power and authority to do such things. DB Pro edition offers a solution. The production database system maintains the true, live version of the data. Meanwhile, DB Pro edition maintains the true version of the schema via a project-based system that, ideally, should be under version control. DB Pro edition manages each database object as a set of .sql files; it also understands the dependencies and relationships between these objects.

This separation of data and schema, from a change management perspective, is a big deal. Now, you can have a true iterative development process for your database-backed projects.

In addition, as a Team System family member, DB Pro edition integrates with Team Foundation Server and the other products in the family. This allows for true team development and management of your database schema. Arguably the most compelling feature is version control. Additional features, such as work item tracking, code check-in policies, and integration with nightly builds, make the database a regular artifact that's tracked and managed. You no longer need to treat it as a specialized entity that needs its own, potentially costly, management and development procedures.

Getting Started

At its core, DB Pro edition is a set of project templates and tools that live inside Visual Studio 2005. You can use some of the tools without generating a project; however, the real power starts with a DB Pro edition project.

You will need at least two installations of SQL Server: one that will function as your production system, and another as your development system. Depending on the complexity of your database and the software built on top of it, you may choose to have additional installations. For a team environment, you should have a local private instance for each database professional. In addition, you should have at least one shared development instance for integration and regression testing. (Remember, as part of your DB Pro edition license, you also receive a license for SQL Server 2005 Developer Edition, which has all the features of the Enterprise Edition.)

Getting started on your local developer machine requires that you have DB Pro edition installed with a local copy of SQL Server 2005 Developer Edition, Enterprise Edition, or even Express Edition. DB Pro edition uses a SQL Server 2005 instance for design-time validation of both SQL Server 2000 and SQL Server 2005 projects. To control the instance of SQL Server used for validation, go to the Visual Studio 2005 Tools | Options | Database Tools | Design-time Validation Database option. If you're working with Team Foundation Server, you'll also want Team Explorer installed.

While DB Pro edition will let you create a database from scratch, you most likely have existing databases you need to manage. There are two ways to get your database's schema loaded into a project. The first is to import a SQL script file. This is useful, for example, if you've been managing your schema changes by storing copies of the schema in version control, or when you can't connect to your current version of the schema directly and do a live import due to security or other business requirements.

The second way to load your schema is to connect to a running SQL Server instance and import the schema live. You'd use this option if you've already created a new project or as part of project creation via the New Project wizard. If you work in an environment where you have a DBA, this second option offers a good way to integrate the DBA into the development process. In fact, in a perfect world, when you start working with DB Pro edition against an existing database, the DBA creates the project, loads it into source code control, and then hands it off to development, while remaining involved throughout the development life cycle.

There is one important distinction between importing via a script versus importing via a live database. You can only import from a live database once, whereas you can import via scripts as often as needed.

The Import Process

To start the import process, open the New Project dialog box in DB Pro edition. Expanding the top-level Database Projects node reveals the templates for SQL Server. You can choose either a blank project or a wizard-driven one.

Running the wizard makes it easy to get started. The wizard's questions are broken down into pages labeled Project Properties, Set Database Options, Import Database Schema, and Configure Build/Deploy.

Project Properties allows you to specify how you want your project organized on disk: either by object type or by schema. It's important to think about this up front because there's no easy way to change this option once you have created the project and imported the schema. Schema-based organization can be useful, for example, when you need to restrict access to parts of the database for security reasons.

You use the Set Database Options page to define database settings, such as ANSI Nulls, Quoted Identifiers, and more. On the Import Database Schema page, you choose whether to import a database schema from a live server. If you do choose to import at this point, you then specify whether to use the database's options or the project's options.
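For context, the settings on this page correspond to the standard database-level options in SQL Server. Here's a sketch of the equivalent T-SQL for a few of them; NorthwindDev is a hypothetical database name:

```sql
-- Equivalent T-SQL for a few of the options the wizard page surfaces;
-- NorthwindDev is a hypothetical database name.
ALTER DATABASE NorthwindDev SET ANSI_NULLS ON;
ALTER DATABASE NorthwindDev SET QUOTED_IDENTIFIER ON;
ALTER DATABASE NorthwindDev SET ANSI_PADDING ON;
```

Whatever you choose here becomes part of the project, so the same settings travel with the schema when it's deployed elsewhere.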

Finally, on the Configure Build/Deploy page, you adjust options that affect build script placement, the default target database connection, and so on. These options, like most of those defined by the wizard, can be adjusted later via the Project Properties.

Once you have completed the wizard and imported the production database schema, you have successfully created your offline workspace. This is a place where you can safely work offline on the database schema.

DB Pro edition provides two views of the project. There's a literal file view exposed through the Solution Explorer (see Figure 1), and a hierarchical schema view (organized by object type or by schema and then object type) viewable through the Schema View window. These two views provide flexibility in project organization and management. You can completely rearrange the file structure on disk using drag-and-drop functionality in the Solution Explorer, yet maintain easy access to objects by object and/or schema type.

Figure 1 Database in Solution Explorer


DB Pro edition shreds the database schema into atomic objects represented as .sql files in the Solution Explorer and thus the file system (not just top-level objects like tables, views, and stored procedures, but indexes, constraints, and the like). In addition, the project gets a set of scripts segmented into a set of folders labeled Pre-Deployment and Post-Deployment. You can use these scripts to handle T-SQL operations that don't represent schema objects. Additionally, you can create your own management and utility scripts and organize them as you see fit.

At this point, if Team Foundation Server is in use, you add the project to Source Code Control and check in the initial version, associating it with the appropriate work items. Microsoft is updating the guidance provided with Team Foundation Server to include roles and work streams specific to database development for both the MSF for Agile Software Development and MSF for CMMI Process Improvement templates.

Editing Schema and Objects

With the database under Source Code Control, making a change to an existing object isn't so overwhelming. Simply open it, start editing (which, by default, checks it out), and save your changes. From there you can check in your changes and move forward. On the other hand, you could use the shelving feature to post your changes to the server for review by another team member, such as your DBA, before checking it in.

The offline workspace encourages team collaboration. You also mitigate the inherent risk of making changes to the database schema with the ability to roll back. In addition, Team Foundation Server check-in policies enable teams to control workflow, making sure, for example, that all changes are associated with a work item, that all check-ins have comments, and so on.

The offline workspace provides a way to manage the database schema. However, it does not provide a runtime environment to actually create objects, run queries, or change data. You do this work against a live instance of SQL Server: your sandbox. Once you have imported the production schema, the next step is to publish the schema to your local developer instance of SQL Server. This requires two simple steps. First, you specify the target connection and new database name on the Build tab of the project designer. (This option can also be set via the import wizard if you used it to create your database project.) Then you right-click on the database project in the Solution Explorer and select the Deploy option. Once complete, you are ready to start working on your database.

Designing database objects is a matter of adding a new item from one of three locations: the Project menu, the context menu in the Solution Explorer, or the context menu in Schema View. (One feature that didn't make this first release is interactive design tools.) You create all of your database objects by entering the appropriate T-SQL statements. For example, to create a new table, you select the Add Item command from the Project menu, which opens the Add New Item dialog. You then select the Tables and Views option from the Categories section, followed by the Table item in the Templates section. Finally, you provide a name for the new table. Be aware that when you import a database from a live server, the names used for the .sql files on disk follow the form objectname.objecttype.sql. Keep this in mind when working out your standards for file names in your database projects. To be clear, though, the name you choose for your file is independent of the object name used inside the database. Once you've provided a name and closed the Add New Item dialog, a T-SQL Editor window opens, providing a basic template from which to start.

The T-SQL Editor included with DB Pro edition is a full-featured editor and execution window. It supports color-coded editing, syntax validation, client execution statistics, and multiple result formats. This first release, however, doesn't support IntelliSense®. As you create your table, you can validate syntax by pressing Ctrl+F5. DB Pro edition will prompt you to make a connection the first time you do this.

Take care when working on Data Definition Language (DDL) statements in the T-SQL Editor. The familiar F5 keystroke will execute the T-SQL and perform the commands listed against whatever database you've chosen for syntax validation. Once the table design is complete, you can commit it to Source Code Control or shelve it for review.

Remember that when working in T-SQL, the objects defined inside DB Pro edition are very granular. You create indexes, constraints, and more individually. This can be quite a change depending on how you are used to building database objects. Say, for example, you wanted to add auction support to the Northwind sample database. Creating an auction table that has an identity column for its primary key would require that you first create the auction table, followed by a primary key constraint. Adding a foreign key constraint would also be its own step and require its own .sql file.
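To make that granularity concrete, here is a sketch of how the auction example might be split across separate .sql files, following the objectname.objecttype.sql naming convention. The table, constraint, and file names are illustrative; only dbo.Products and its ProductID column come from Northwind itself:

```sql
-- Auction.table.sql: the table by itself, with no constraints.
CREATE TABLE dbo.Auction (
    AuctionID  INT IDENTITY (1, 1) NOT NULL,
    ProductID  INT NOT NULL,
    StartPrice MONEY NOT NULL,
    EndsOn     DATETIME NOT NULL
);

-- PK_Auction.pkey.sql: the primary key constraint as its own object.
ALTER TABLE dbo.Auction
    ADD CONSTRAINT PK_Auction PRIMARY KEY CLUSTERED (AuctionID);

-- FK_Auction_Products.fkey.sql: the foreign key, again in its own file.
ALTER TABLE dbo.Auction
    ADD CONSTRAINT FK_Auction_Products
    FOREIGN KEY (ProductID) REFERENCES dbo.Products (ProductID);
```

Because each constraint lives in its own file, each one can be versioned, diffed, and reviewed independently of the table it belongs to.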

While you might find yourself adding new objects to a database, you will also find yourself making changes to existing objects. Some changes aren't so bad, like adding additional error handling to a stored procedure. Other changes feel like a more serious redesign, and words like refactoring start running through your brain. Changing object names, a major refactoring pain point with databases, is addressed by a refactoring tool provided by DB Pro edition.

To rename an object, you first need to find it using the Schema View window. Next, you right-click on the object and select Refactor | Rename. The first dialog to appear (see Figure 2) lets you specify the new name, as well as indicate if you want to preview changes, whether you wish to have a log generated, and how you want to handle unqualified script and unit test references. Clicking OK to continue brings up the Preview Changes dialog (see Figure 3) if you accept the default options.

Figure 2 The Rename Dialog


Figure 3 The Preview Changes Dialog


A tree view displays every item that needs updating. Not only will it look for changes inside your database project, but DB Pro edition will also examine your database unit tests, your data generation files, and additional supporting .sql files.

Expanding an item displays each change, clicking on an item displays the new T-SQL, and clicking on a specific change highlights the new object name. Assuming the changes look good, you click the Apply button. Remember, this will only apply changes to your local workspace copy of the schema. If you're using Source Code Control, you will need to commit the changes. You will also need to deploy an update script to have the changes show up in an existing database, preferably your sandbox copy.

Data Generation

There's a chicken and egg issue when it comes to database applications. There is a need for data, but security and legal constraints often prevent you from working with copies of production data. In addition, new tables are somewhat uninteresting without data, as are objects that work with those tables, such as stored procedures.

DB Pro edition provides rich data-generation tools to ease your development pain. A database project can contain any number of data generation plans. Each plan supports data generation for one or more tables in your schema. If you enable a table for data generation, you specify how data is generated for each of its columns. You can use the built-in generators or write your own custom data generators.

The data generation tools provided let you generate the same set of repeatable test data over time, assuming the data generators are deterministic. This gives a consistent baseline against which all your tests can run.

Out of the box, you'll find a number of different standard generators. When you define a data generation plan, DB Pro edition automatically assigns a default value generator based on the data type of the column. In addition, DB Pro edition is aware of columns that enforce constraints, such as foreign key columns.

You can use value generators, available for most of the built-in data types (such as strings, integers, and GUIDs), to generate arbitrary data for a column. Each value generator supports a number of properties to control generation. For example, for string data types such as NVARCHAR, you can set properties like Minimum Length, Maximum Length, and Percentage Null. If your column requirements are more complex, you can use the Regular Expression generator for generating phone numbers, postal codes, and the like. You can use the Data Bound generator to load existing data from tables in the database or from another supporting database. With the ability to create custom generators using managed code, you should have no trouble creating really useful data generation plans.
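As an illustration, a Regular Expression generator configured for a US-style phone number column might use a pattern along these lines (the generator accepts a subset of regular-expression syntax, so the exact pattern you can use may differ):

```
\(\d{3}\) \d{3}-\d{4}
```

Each run of the plan then produces values such as (555) 123-4567 that satisfy the column's format without exposing any real customer data.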

Unit Testing

You probably use unit testing for your Visual Basic® or Visual C#® code. Most likely, however, you don't apply this same powerful technique when testing stored procedures, functions, and triggers. DB Pro edition brings a repeatable, automated mechanism for testing these objects. It includes database unit tests that you use as part of a standard Team System test project. The unit tests themselves are T-SQL mapped on top of either Visual Basic or Visual C# class files, allowing for advanced customization.

You use unit testing in DB Pro edition to validate the contract between the database schema and the applications that consume said schema. Getting started is easy: the Northwind sample database has a number of stored procedures with which you can start practicing.

With the Northwind database already imported, you'll want to find the CustOrderHist stored procedure via the Schema View window. Right-clicking on the object provides an option to Create Unit Tests. Selecting this displays the dialog shown in Figure 4.

The dialog is broken down into three groups. You use the Current selection group to specify the item or items for which DB Pro edition should generate unit tests. You can select any number of stored procedures, functions, or triggers. (Note that the dialog box in Figure 4 doesn't show a functions node or triggers node because the Northwind database only has stored procedures.)

Figure 4 The Create Unit Tests Dialog


In the Output project group, you define whether to create a new test project or add the new unit tests to an existing project. If you are creating a new project, you choose the default programming language (Visual Basic or Visual C#). Finally, you specify a new class name or select an existing class (assuming you have an existing test project selected).

The first time you start to create unit tests for a database and you encounter the Create Unit Tests dialog, you might be tempted to just select everything and click OK. Don't. Do you really want all of your testable objects in the same class? Do you even want all of them in the same test project? Just as you don't put all of your code in a single class inside a single class library, you probably don't want to put all of your tests in such a confining configuration. The fact that your database might use schemas provides a good factoring mechanism. Just think before you click OK.

For this example, select the CustOrdersOrders and CustOrdersDetails stored procedures. DB Pro edition automatically puts a check next to CustOrderHist. Each programming object you select will have a test script auto-generated for it (you will probably need to customize these). Choose a programming language, change the name of the project to NorthwindTests, change the name of the class to Customers, and click OK to generate the test project and unit tests. (Note that choosing a programming language does not mean you have to use it, as you can do everything you need to with T-SQL.) After you click OK, the Test Project Configuration dialog box appears (see Figure 5). Starting from the top, you set connection settings in the first section.

Figure 5 Test Project Configuration Dialog


The first item lets you specify the connection to the live database server to use when executing the unit tests. The second item lets you specify the connection to use when validating the unit tests.

Consider using two different connections. You set the first connection, for execution, to run using the credentials of a typical user account that will execute the object you're testing. You set the second connection, for validation, to run using the credentials of a user with elevated privileges. If you specify two connections, the testing runtime executes pre-test, post-test, initialize, and cleanup procedures using the validation connection; otherwise, it uses the execution connection. This is particularly useful since many validation operations, like cleaning up tables, require elevated privileges.

With the connections defined, you set deployment options in the next section of the dialog box. You can choose to deploy a new or updated database schema prior to execution of the unit test by specifying an existing DB Pro edition project. You also specify the deployment configuration to use. Finally, you can choose a data generation plan to run. The testing runtime executes the data generation plan after it has deployed the database (if chosen) but before it executes the tests.

If you do choose to generate data, you can have the testing runtime clear the tables before loading new data. If you're following along, just choose the same Northwind connection you used to create the database project for the execution connection, accept the defaults for the other settings, and click OK. The test project generation completes with the Database Unit Test designer open, showing generated T-SQL code to help you get started (see Figure 6).

Figure 6 Test Designer


The designer supports a T-SQL script editor and a class editor for Visual Basic and Visual C#. You'll see that the window is broken into two sections: an editor and a conditions manager.

Each database unit test class can contain one or more database unit tests, and each unit test is composed of up to three scripts: a pre-test, a test, and a post-test. However, only the test script is required. In addition, each database unit test class supports common pre- and post-test scripts. At the top of the designer, you'll find a navigation bar where you can choose the active unit test and test script.

Your generated tests will have a default test condition of inconclusive. You manage test conditions using the lower portion of the designer; each test can have more than one test condition. The lack of any test conditions will cause the test to pass automatically when run. For this reason, the default use of an inconclusive test condition ensures that you edit the auto-generated script and define meaningful test conditions.

In the case of the sample dbo_CustOrderHistTest, modify the auto-generated T-SQL to provide a customer ID, such as ALFKI. Then select the default test condition and delete it by clicking the Delete button (just to the right of the Test Conditions combo box). If you expand the Test Conditions combo box, you'll see a list of built-in Test Conditions, including Execution Time, Not Empty ResultSet, and Row Count. In addition, you can write your own custom test conditions using managed code.

For the dbo_CustOrderHistTest unit test, add the Not Empty ResultSet test condition. Also add the Row Count test condition and change its Row Count property to 11. Note that you can rename each test condition and you can enable or disable test conditions as needed.
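Putting those steps together, the edited test script is nothing more than a call to the stored procedure; the conditions attached in the designer evaluate its result set. CustOrderHist and the ALFKI customer ID come from the Northwind sample itself:

```sql
-- Test script for dbo_CustOrderHistTest. The Not Empty ResultSet and
-- Row Count conditions defined in the conditions pane run against the
-- result set this statement produces.
EXEC dbo.CustOrderHist @CustomerID = N'ALFKI';
```

Keeping assertions in the conditions pane rather than in the script means the same T-SQL can be reused with different expectations across tests.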

With the designer window active, you can run your tests by pressing F5. This saves your project and compiles it. The test engine then executes all of your unit tests. Figure 7 details the execution flow of a database unit test class.

Figure 7 Database Unit Test Execution


You can run an individual test by showing the Test View window, selecting a particular unit test, right-clicking on it, and selecting the Run Selection command. In addition, you can debug your tests. If you're using DB Pro edition with Team Foundation Server, you can publish your test results to the Team Foundation Server as well as have your tests run as part of your team builds. Also, if you're running a version of Visual Studio 2005 that includes the Team Test features, you can create any number of test lists to logically group tests into different test suites containing any type of valid test.

Comparing Databases

After you've done all this work on your database, it's time to move the changes into your test system. DB Pro edition lets you compare your project's schema with any running database. In addition, you can compare the data contained in tables and views in two databases using the data comparison tools. You can run either the schema or the data compare tools with or without a DB Pro edition project open.

You can use the schema compare tool to compare two schemas and, if you want, generate a DDL change script. You can immediately apply the script from within DB Pro edition or export it for later use. You access the schema compare tool via Data | Schema Compare | New Schema Comparison. The new Schema Comparison dialog box then opens, allowing you to specify the source and target schemas. You can compare a DB Pro edition project to a database, or compare one database to another.

Once you've defined the source and target, the comparison starts. The Schema Compare window, which has three sections, opens and displays the results. The top section provides a list of all the schema objects, grouped by type. The read-only Object Definitions section in the middle displays two text comparison windows showing the T-SQL for the currently selected object. The read-only Schema Update script window at the bottom displays the proposed update script.

You can filter the displayed objects based on whether they compare as different, missing, new, or equal. As you examine objects, you can either include or exclude them from the schema update script by changing an item's Update Action property. DB Pro edition sets the Update Action property to Skip for equal objects. All other objects are set to either Create, Update, or Drop and can be changed from the default Update Action to Skip. You can then easily filter out which objects you want modified from those you want skipped. All the while, DB Pro edition updates the proposed script in real time.
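To give a flavor of the proposed script, a comparison that found a new column in the project and an index dropped from it might produce a fragment like this (the object names are hypothetical, not actual tool output):

```sql
-- Illustrative fragment of a schema update script; the real generated
-- script includes additional settings and safety checks.
ALTER TABLE dbo.Customers
    ADD LoyaltyLevel TINYINT NULL;

DROP INDEX IX_Customers_Region ON dbo.Customers;
```

Toggling an object's Update Action between Skip and Create, Update, or Drop simply adds or removes statements like these from the script.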

Once you're happy with the update script, you have a few choices. If you're feeling bold, you can write the changes to the target schema. This might work well in your lab, but probably won't fly with your DBA. Instead, you can either export the update script to a file or to DB Pro edition's built-in T-SQL editor. After a good review, you're ready to update the target database. You have lots of control over how schema comparisons are performed via Tools | Options | Database Tools | Schema Compare.

To perform a data comparison, first specify the source and target databases. Tables and views must have the same name, the same owner, and matching schemas. Tables must have the same key and index layout. You start the wizard by selecting the Data | Data Compare | New Data Comparison menu item. You can adjust various options related to the data comparison process by going to Tools | Options | Database Tools | Data Compare. Once you have completed a data comparison, you can use the generated script to perform data motion between the source and target databases.

Build and Deployment

You've seen that you can generate a T-SQL update script manually via the Schema Comparison tool. However, as part of the build process, DB Pro edition can generate a complete script for deploying your database project. This deployment script can do either a complete build or an incremental update. The build process can even consolidate all of your pre- and post-deployment scripts into one complete deployment script. You can deploy the script via the Build | Deploy Selection command right from within Visual Studio 2005.

Under project properties, you will find a number of options to control and adjust the build process. The Build tab contains the core settings, such as Target connection, Target database name, and Block incremental deployment if data loss might occur. You'll note there is also a Build Events tab that you can use to type pre- or post-build event commands. DB Pro edition uses MSBuild for its build process and supports integration with Team Build if you're using Team Foundation Server.


If you're a professional who works with SQL Server 2000 or SQL Server 2005, you will find Visual Studio 2005 Team Edition for Database Professionals to be an essential tool for building database-backed solutions. The product provides a rich feature set, especially for a first release, that will change the way you manage databases in a development environment.

Brian A. Randell is a senior consultant with MCW Technologies, LLC. Brian spends his time speaking, teaching, and writing about Microsoft technologies. Brian is the author of Pluralsight's Applied Team System course and a Microsoft MVP. Contact Brian via his blog at