Saturday, November 24, 2007

QaTraq Lite

Recently I wrote about the QaTraq test management tool, and hinted that it could be used in a lightweight manner suitable for agile software development environments with very short release schedules.

The approach is to use only the features of QaTraq that are essential in a rapid-release environment, and to drop several assumptions built into the tool that are unnecessary in that context.

One of the assumptions made in QaTraq is that a new test plan will be created for each release. For a live web site with releases every day, this is not a good assumption. In the time it would take to copy and customize a test plan for each daily release, there would be little time to test before the release went out the door.

An alternative is to use the "test plan" of QaTraq as a living repository of all the test cases appropriate to a particular application or feature area. When test cases need to be rerun, they are run in the same test plan.

In this approach, new test results overwrite old results, so the results database becomes a snapshot of only the most recent result for each test case rather than all results. In my experience, though, test results from past releases are almost never used, even in more traditional software development environments.

Another assumption in QaTraq is that Products are the major categories used for reporting purposes. But if test plans are used as living repositories of test cases rather than being associated with a specific release, Products become almost unnecessary.

Products cannot be ignored completely, because test cases and test scripts in QaTraq must be associated with a Product, but nothing requires more than one Product. If you define only a single Product with a single Version, all new test cases will be assigned to that Product and Version automatically, with no extra work required.

The built-in reports in QaTraq are organized around Product, so you will need to create your own reports to track test results by test plan, design, and script. This is straightforward using PHP, which must be installed to run QaTraq anyway. For building the queries behind your custom reports, phpMyAdmin is a very useful interface to MySQL; I recommend installing it on any server running QaTraq.
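
To give a feel for what such a report might look like, here is a rough sketch of a "results by test script" page in PHP. The table and column names (test_script, test_case, test_result, script_id, case_id, status, and so on) are guesses for the sake of the example, not QaTraq's actual schema; browse the real database with phpMyAdmin and substitute the names you find there.

<?php
// Sketch of a custom "results by test script" report.
// NOTE: hypothetical schema -- replace the table and column names
// with the ones in your actual QaTraq database (check via phpMyAdmin).
$db = mysql_connect('localhost', 'qatraq_user', 'password');
mysql_select_db('qatraq', $db);

// In the QaTraq Lite approach only the most recent result per test
// case is kept, so a simple pass/fail count per script is enough.
$sql = "SELECT s.script_name,
               SUM(r.status = 'Pass') AS passed,
               SUM(r.status = 'Fail') AS failed
        FROM test_script s
        JOIN test_case   c ON c.script_id = s.script_id
        JOIN test_result r ON r.case_id   = c.case_id
        GROUP BY s.script_name";

$result = mysql_query($sql, $db);
echo "<table border='1'><tr><th>Script</th><th>Passed</th><th>Failed</th></tr>";
while ($row = mysql_fetch_assoc($result)) {
    echo "<tr><td>{$row['script_name']}</td>"
       . "<td>{$row['passed']}</td>"
       . "<td>{$row['failed']}</td></tr>";
}
echo "</table>";
mysql_close($db);
?>

The same pattern works for reports by test plan or test design: change the GROUP BY column and the joins, and the rest of the page stays the same.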

The above is the essence of what I call the "QaTraq Lite" approach. The key points are:

* Use Test Plans as living repositories of test cases and results rather than for a particular release. Add tests when you need them, rerun tests when you need to.

* Create a single Product with a single Version. All test scripts and cases will be associated with this Product automatically.

* Report test results by test plan, test design, and test script instead of by Product. A test case shows up in reports wherever you put it, so there is no need to maintain separate physical and reporting hierarchies; they are one and the same.

This approach reduces much of the management overhead of using QaTraq to track your tests, even in an agile environment with daily releases.

6 comments:

google said...

Thanks for the useful hints. Do you have any practical experience with TestLink (or other tools you would recommend)? We are just starting to evaluate a test management tool to use in our organization and replace our tedious "Excel testing" (commercial products would also be acceptable, but with reasonable pricing, i.e. NOT TestDirector). We are an organization of 12 software developers (Java, J2EE), with small project teams and official releases every 1-2 months.

I appreciate any help.

Take care
Alex

Chris Struble said...

Alex,

I haven't used TestLink; most of my experience is with expensive commercial tools like TestExpert (which I don't recommend) or with in-house tools for large teams.

For a team your size, you may want to try TestRun. It's commercial but inexpensive. I tried out their online version recently and it is very easy to use. For info, go to www.runtestrun.com.

google said...

Chris,

Thanks for the prompt answer and advice. I'll certainly try TestRun.

All the best,
Alex

google said...

Sorry to bother you...

How does QaTraq compare to TestRun (just a short, spontaneous opinion ;-)), if these solutions are comparable at all?

Thanks
Alex

Chris Struble said...

Alex,

You could try each one for a week and see how you like it. It may also help to write up a list of the features you need in a tool and rate each tool against that list.

sdguero said...

Hey Chris,

Nice blog.

We are implementing QATraq in the group I work with now and I was thinking about using it similarly. We have extremely short development periods... around 3 to 4 days...

Your post will give me some ammo to pursue further.

Thanks :)
Ryan