The idea for Zeta Test Management was born out of our own needs as a software vendor. We needed to test our own applications efficiently, both during development and after release. These included not only consumer products for large audiences but also individually developed in-house solutions for business customers.
Unfortunately, we found no existing test application that fitted our needs: either it had far too many functions we didn't need and was much too expensive, or it lacked functions we required.
To summarize, our goal was to produce testing software with which we could ensure that our applications behave exactly as we and the customer expect. When such tests are performed after changes to an existing product, they are called regression tests.
We aimed to enable anyone with a manual in hand to perform tests, not just the developers of the software themselves.
Very early on, we decided not to go down to the source-code level but rather to cover all other aspects of an application: the installation process, the documentation, the usability of the user interface, and the behavior of the application on different operating systems.
These were the ideas that guided us in developing Zeta Test.
During development we were pleasantly surprised that nearly every customer we talked to about the project showed interest in Zeta Test Management. Thanks to close partnerships with selected customers from different industries, Zeta Test Management evolved into an application covering a broad range of scenarios, far beyond our initial intention of simply testing software applications.
The workflow for creating, performing and evaluating test cases with Zeta Test Management is easy to understand and apply, yet comprehensive enough to cover very different application scenarios:
- Internal in-house tests of software created and updated by the independent software vendor (ISV) itself.
- External tests by companies introducing new software into their operational environment or rolling out a new version of existing software within their enterprise ("release management").
- Arbitrary tests that have to be planned, performed and evaluated.
A typical workflow during testing is as follows:
1. An administrator creates a new project with Zeta Test Management.
2. A project manager creates global test units and, within these units, global test cases. Test units can be nested arbitrarily.
3. A test manager defines test plans to be performed by testers, adding test units and test cases to each plan. He or she may also group the test plans into test folders, which can likewise be nested arbitrarily. In addition, the test manager sets permissions so that testers can access only the test plans intended for them.
4. A tester performs the test runs he or she has permission for and works through the individual test cases. Testers execute the tests as described, document the results as written text, and set a status indicator recording whether each test case succeeded. If an external bug tracker database is connected, failed test results can be transferred to it automatically, enabling the responsible developer to correct the error.
5. The test manager and the project manager see cumulative status indicators (traffic lights) giving an up-to-date overview of a test run or test plan. In addition, detailed reports can be displayed and exported to common formats (Microsoft Office Excel, Microsoft Office Word, and Adobe PDF).
6. Steps 4 and 5 are repeated as often as necessary until the test manager or project manager is satisfied with the results.
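The hierarchy the workflow describes (nestable test units containing test cases, test plans with per-tester permissions, and cumulative traffic-light status) can be sketched as a small data model. This is a minimal illustration under our own assumptions, not Zeta Test Management's actual implementation; every class, field, and status name here is hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum


class Status(Enum):
    """Per-test-case result set by a tester. Names are illustrative."""
    NOT_RUN = "not run"
    PASSED = "passed"
    FAILED = "failed"


@dataclass
class TestCase:
    title: str
    status: Status = Status.NOT_RUN
    result_notes: str = ""  # tester's written documentation of the result


@dataclass
class TestUnit:
    name: str
    cases: list = field(default_factory=list)
    subunits: list = field(default_factory=list)  # units can be nested arbitrarily

    def all_cases(self):
        """Yield every test case in this unit and all nested subunits."""
        yield from self.cases
        for unit in self.subunits:
            yield from unit.all_cases()


@dataclass
class TestPlan:
    name: str
    units: list = field(default_factory=list)
    allowed_testers: set = field(default_factory=set)  # permission list

    def traffic_light(self):
        """Cumulative status indicator over all contained test cases:
        red if anything failed, yellow while runs are incomplete, else green."""
        statuses = {c.status for u in self.units for c in u.all_cases()}
        if Status.FAILED in statuses:
            return "red"
        if Status.NOT_RUN in statuses:
            return "yellow"
        return "green"
```

For example, a plan whose cases are all still `NOT_RUN` reports "yellow"; as soon as one case is marked `FAILED` the roll-up turns "red", and it only turns "green" once every case has passed.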