Tuesday, June 14, 2011

Microsoft Test Manager - Test Scribe Power Tool

I read about this Power Tool in “Software Testing with Visual Studio 2010” on Safari Books Online.

Requires MS OpenXML SDK 2.0 (the installer adds it if it is missing)

Requires Office 2010 (the installer does not add it if it is missing)

This adds a Tools menu to Test Manager where you can view the test run summary in grid format. Select one or more Test Cases and generate a Test Run Summary Report.

Tuesday, May 24, 2011

DSN error message

I got a strange error when an ODBC Data Source existed in the 32-bit Data Sources (ODBC) control panel but was missing from the 64-bit one: "The specified DSN contains an architecture mismatch between the Driver and Application". I added the DSN to the 64-bit control panel to fix it. The error message made it seem like I used the wrong ODBC driver, but really the 64-bit application was unable to find the DSN at all. (On 64-bit Windows there are two separate ODBC administrators: %windir%\System32\odbcad32.exe manages the 64-bit DSNs and %windir%\SysWOW64\odbcad32.exe manages the 32-bit DSNs.)

Wednesday, May 18, 2011

Writing Good Defect Reports

These are my ideas based on experience. I am writing this down to help you form your own ideas.
  1. Title includes the feature or area where the problem was found - The developer needs to know where you saw the problem. The person who retests the fix needs to know. Other testers need to know whether or not a bug has already been reported before entering a new bug. It would be nice to sort the bug list and see which areas have the most problems. The bug title BEGINS with location.
  2. Title describes the problem - Some people write defect titles with no problem description, similar to this: "Version 1.0.0> Home Page". It's even worse when they enter 5 bug reports with the exact same title. Other testers have no way of quickly searching to see if the problem has already been reported. The title should describe the problem so others can get a quick view of the types of problems being reported. The title should make the developer want to read it.
  3. The version number should not be included in the title - Bug tracking systems usually have another field where you can enter the version number where the defect was found. Besides, this issue may exist in multiple versions of the software. The version number doesn't make sense as part of the bug title.
  4. Steps to reproduce - The steps to reproduce need to be repeatable. Narrow them down to the simplest form and write them down. Then try following them yourself; you might find you are missing a key detail when you follow steps you have already written.
  5. Report relevant details - Don't include details unless they are relevant to the specific problem you are reporting. Include any preconditions that are required.
  6. Don't combine problems into one bug report - Each problem needs a separate bug report. If you find yourself describing more than one unexpected result, split the report: each unexpected result equates to a new bug report.
  7. Ideas are not bugs - We are reporting results of an experiment, not writing opinions of how things should work. If expected behavior is documented in another document, reference the document and page number or reference a similar website, but we shouldn't report our opinion or ideas for improvements as a bug. Submit a feature request instead of a bug when you have ideas for improvement.
  8. Attach screenshots -  Highlight the section of the screen capture where you see the problem. The problem might be obvious to us, but other people have to decipher what we are talking about. Highlight the screen capture to save time. Attach screen captures as graphics files and not embedded in Word documents. It speeds up the process of reviewing the images and saves disk space.
  9. Include Expected Results - Describe what you expect to happen.
  10. Include Actual Results - Describe what actually happened.
  11. Include supporting log files or databases - Attach evidence of the issue. Many errors depend on the data that was input to the application. Including a copy of the data used to generate the error sometimes helps narrow down the cause of the problem.
Which one do you think would get read and fixed first?

Version 1.0.0> Home Page
Home Page> company logo incorrect
Live Home Page> the company logo is currently being replaced with an offensive image

Friday, March 25, 2011

CodedUI Test Close Browser

I am testing a web application and some error messages are generated through JavaScript. I created a Coded UI test that verified the error messages when the user attempts to login with an empty username or an empty password. I couldn't use a Web Performance Test because the JavaScript is not executed during playback.

I forgot to record closing the browser, so I added it under TestCleanup.

Here is what I changed in the CodedUI Test:

1) click + to expand the Additional test attributes region
2) un-comment the TestCleanup method
3) add a line to close the browser; mine closes the browser window through the UIMap (the window name comes from my recording, so yours will be different):

        [TestCleanup()]
        public void MyTestCleanup()
        {
            // To generate code for this test, select "Generate Code for Coded UI Test" from the shortcut menu and select one of the menu items.
            // Close the browser window the recorded test left open.
            this.UIMap.UILogInWindowsInternetWindow.Close();
        }

Wednesday, March 2, 2011

Rename Virtual PC

I have some virtual machines used for testing. Sometimes I make a copy of a virtual machine or give a copy to someone else to use. I am writing down what I do to rename the copied Virtual PC so there are no conflicts: rename the machine in Windows, assign a new MAC address to both network adapters, rename the SQL Server, and change the database connection strings used by services and applications.

1) log in to the Virtual PC; select Control Panel> System> Computer Name> Change... and enter a new name

2) restart Windows when it prompts you to reboot

3) open SQL Query Analyzer and rename the SQL Server; the old and new names below are placeholders, and SQL Server must be restarted before @@SERVERNAME reports the new name

sp_dropserver 'OLD_NAME'
GO
sp_addserver 'NEW_NAME', 'local'
GO
4) close the virtual machine and commit changes to the hard disk
5) edit the .vmc file to change the MAC address; there is one line for each network adapter, so on a laptop I had to change two lines: one for the wireless adapter and one for the wired connection
6) reconfigure any applications that use the machine name or IP address
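
For step 5, the MAC address is stored in an element like this in the .vmc file, one per network adapter (the element name is what I recall from Virtual PC's XML configuration; the 12 hex digits here are a placeholder - Virtual PC addresses begin with 00-03-FF):

```xml
<ethernet_card_address type="bytes">0003FF123456</ethernet_card_address>
```

Give each copy of the virtual machine a different value so both copies can be on the network at the same time.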

Thursday, January 6, 2011

CodedUI Tests

I was not sure if I would be using these since I am testing a web application, but CodedUI Tests can do things that Web Performance Tests can't, like validating error messages generated through JavaScript. I used a CodedUI Test to validate the error message shown when the username or password is empty; the message is generated by JavaScript, so a web test cannot verify it because the page doesn't do a postback.

Web Performance Test - Expected HTTP Status Code

I have a web performance test that submits an invalid username and password and verifies the error message is "Invalid Username or Password". I added a data source from a csv file and started putting a bunch of different values in it. When I ran the data-driven test, some of the rows failed because page validation is enabled and the .NET Framework throws an exception whenever the username or password contains script tags.
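
The data source was just a csv of login values. A made-up sample of the kind of file I used (the column names are whatever your form post parameters are bound to):

```csv
username,password
gooduser,wrongpassword
baduser,password1
<script>alert('x')</script>,password1
```

The last row is the kind that trips page validation and returns a 500 instead of the normal error message.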

Since some of the requests failed, I made another web test just for those cases: browse to the login page, enter script tags in the username and password, and click the Login button. The test was failing because the server returned a 500. There is a way to validate the HTTP status code returned by a Web Performance Test request: the request has a property called Expected HTTP Status Code. When it is 0, a return code in the 200 – 300 range counts as success. I set Expected HTTP Status Code to 500 to verify the error page is returned when script tags are input to username/password, and I also checked for this text: "System.Web.HttpRequestValidationException: A potentially dangerous Request.Form value was detected from the client (userNameTextBox="")."
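
As I understand it, the pass/fail decision that this property drives can be sketched like this (a Python sketch of the rule as described above, not Visual Studio's actual implementation):

```python
def request_passed(actual_status, expected_status=0):
    """Mirror the Expected HTTP Status Code rule: 0 (the default) means
    any status in the 200-300 range passes; any other value must match
    the returned status exactly."""
    if expected_status == 0:
        return 200 <= actual_status < 300
    return actual_status == expected_status

# Left at the default, the 500 from the validation error fails the request:
print(request_passed(500))       # False
# With Expected HTTP Status Code set to 500, the same response passes:
print(request_passed(500, 500))  # True
```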

I ended up with two data driven tests for the login page. One to check for the normal error message when login information is incorrect and another to check that page validation is working when script tags are submitted.

This is the error page when request validation is enabled:

Server Error in '/WebPARCS' Application.

A potentially dangerous Request.Form value was detected from the client (userNameTextBox="").

Description: Request Validation has detected a potentially dangerous client input value, and processing of the request has been aborted. This value may indicate an attempt to compromise the security of your application, such as a cross-site scripting attack. You can disable request validation by setting validateRequest=false in the Page directive or in the configuration section. However, it is strongly recommended that your application explicitly check all inputs in this case.

Exception Details: System.Web.HttpRequestValidationException: A potentially dangerous Request.Form value was detected from the client (userNameTextBox="").

Source Error:

[No relevant source lines]

Source File: c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\webparcs\b39bbbab\1a2d57ab\App_Web_gjejbkkx.0.cs    Line: 0


Tuesday, January 4, 2011

VS2010 Schema Compare only works when databases are on the local drive

In Visual Studio 2010 Ultimate there is an option called Data> Schema Compare. I was unable to compare two databases on the test server. Visual Studio generated an error message of "Schema information could not be retrieved because of the following error: InternalError: SqlCeManager could not be initialized."

I found this post with the answer: "The error message is not accurate as what the specific issue is, its actually hitting an unsupported case.  SQLCE can not work against remote storage or mapped drives. It has to use a local drive for its sdf files and cache."

So I am unable to use Schema Compare because these databases are too big to copy and restore locally. Seems like a pretty big limitation to me.