Finally, after some pretty exciting last-minute drama with the CDs, the VS 2005 Beta is released! I hope you like the new features we have in the debugger, and I hope you like its stability.
Two of the features I own and enjoy are the new datatips and breakpoints. Datatips are unlike anything you saw in previous versions of VS, and breakpoints have been given a bit of an overhaul to allow setting more conditions, which should be useful to power users. We also support source checksums now, so the old problem of one breakpoint binding to multiple files with the same name is gone.
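To make the checksum idea concrete, here is a minimal sketch in Python. This is purely illustrative, not how the debugger actually does it: the hash algorithm (MD5 here) and the function names are my own assumptions. The point is that a checksum recorded at build time lets the debugger pick the one matching source file instead of binding to every file that happens to share the name.

```python
import hashlib

def checksum(path):
    """Hash a source file's bytes. MD5 is an assumption for
    illustration; the real debugger may use a different hash."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def bind_breakpoint(recorded_checksum, candidate_paths):
    """Given the checksum recorded when the binary was built, bind
    the breakpoint only to candidates whose on-disk contents match,
    rather than to every file with the same name."""
    return [p for p in candidate_paths if checksum(p) == recorded_checksum]
```

With two different `main.cpp` files in two directories, only the one whose contents match the recorded checksum gets the breakpoint.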
What did we do to test the debugger? We keep a list of testcases, most of which are automated, and we run those on different platforms under different configurations. Others are performed manually in a test pass. We had a 3-week test pass for the Beta release. Another pass back in March was the real bug-fixing pass; this recent one was just for verification: all non-major issues caught here were moved to the live branch, and only "severe" issues were fixed in the Beta tree.
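Running the same testcases across platforms and configurations amounts to a cross-product matrix. A small sketch, with hypothetical axis values (the post does not name the actual platforms or configurations we run):

```python
from itertools import product

# Hypothetical axes for illustration -- not the real test matrix.
platforms = ["x86", "x64", "ia64"]
configurations = ["Debug", "Release"]

# Every automated testcase is scheduled once per (platform, config) pair.
matrix = list(product(platforms, configurations))
```

Three platforms times two configurations means each automated testcase runs six times, which is why automation matters: a manual tester cannot realistically repeat a 3-week pass for every cell of the matrix.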
The quality of the beta release (or of any product release) depends on the variety of platform configurations tested and on the quality of the testcases themselves. I like to think we have a decent list of testcases; how we come up with the cases is a different topic altogether. The intangible is how a manual test is performed and how an automated test is written: given a testcase, you can perform it as badly or as well as you like. (Testcase: verify that you can step into GetEnumerator from MoveNext for an Iterator. How many types of Iterators do you need to try before you are sure that this is "ok"?) It is often difficult to say whether the coverage was "proper".
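The Iterator testcase above is a C# scenario (GetEnumerator/MoveNext), but the "how many variants?" question is easy to sketch in Python, where `__iter__`/`__next__` play roughly the same roles. Each shape below is a distinct code path a step-into test could exercise, and a lazy test pass might try only the first; the variants chosen here are my own illustration, not our actual test list.

```python
def simple_gen():
    # The most obvious variant: a plain generator.
    yield 1
    yield 2

def gen_with_finally():
    # A cleanup path the debugger must also step through correctly.
    try:
        yield 1
    finally:
        pass

class HandWritten:
    """Hand-written iterator: __iter__/__next__ stand in for
    GetEnumerator/MoveNext in the C# testcase."""
    def __init__(self):
        self.i = 0
    def __iter__(self):
        return self
    def __next__(self):
        if self.i >= 2:
            raise StopIteration
        self.i += 1
        return self.i
```

Verifying that stepping works for one of these says little about the others, which is exactly why "did we cover it properly?" is hard to answer.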
One measure of the goodness of our scenarios is code coverage, though collecting it is a time-consuming activity.
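As a toy illustration of what coverage measurement does (real tools instrument the binary or bytecode; this trace-based sketch and its function names are mine, not how our coverage tooling works), here is a Python version that records which lines of a function actually executed:

```python
import sys

def lines_executed(fn, *args):
    """Run fn(*args) and return the set of line offsets (relative to
    the function's first line) that actually executed."""
    executed = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is fn.__code__:
            executed.add(frame.f_lineno - fn.__code__.co_firstlineno)
        return tracer
    sys.settrace(tracer)
    try:
        fn(*args)
    finally:
        sys.settrace(None)
    return executed

def branchy(x):
    if x > 0:
        return "pos"
    return "nonpos"
```

Running `branchy` with only positive inputs leaves the `return "nonpos"` line uncovered, flagging a scenario the test pass never exercised.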