ISTQB Syllabus


Tool Support for Testing (K2)

6.1   Types of Test Tools (K2)

Terms

Configuration management tool, coverage tool, debugging tool, dynamic analysis tool, incident management tool, load testing tool, modeling tool, monitoring tool, performance testing tool, probe effect, requirements management tool, review tool, security tool, static analysis tool, stress testing tool, test comparator, test data preparation tool, test design tool, test harness, test execution tool, test management tool, unit test framework tool

6.1.1   Tool Support for Testing (K2)

Test tools can be used for one or more activities that support testing. These include:

  1. Tools that are directly used in testing, such as test execution tools, test data generation tools and result comparison tools
  2. Tools that help in managing the testing process, such as those used to manage tests, test results, data, requirements, incidents, defects, etc., and for reporting and monitoring test execution
  3. Tools that are used in reconnaissance, or, in simple terms: exploration (e.g., tools that monitor file activity for an application)
  4. Any tool that aids in testing (a spreadsheet is also a test tool in this meaning)

Tool support for testing can have one or more of the following purposes depending on the context:

o    Improve the efficiency of test activities by automating repetitive tasks or supporting manual test activities like test planning, test design, test reporting and monitoring

o    Automate activities that require significant resources when done manually (e.g., static testing)

o    Automate activities that cannot be executed manually (e.g., large scale performance testing of client-server applications)

o    Increase reliability of testing (e.g., by automating large data comparisons or simulating behavior)

The term “test frameworks” is also frequently used in the industry, in at least three meanings:

o    Reusable and extensible testing libraries that can be used to build testing tools (called test harnesses as well)

o    A type of design of test automation (e.g., data-driven, keyword-driven)

o    Overall process of execution of testing

For the purpose of this syllabus, the term “test frameworks” is used in its first two meanings, as described in Section 6.1.6.

6.1.2   Test Tool Classification (K2)

There are a number of tools that support different aspects of testing. Tools can be classified based on several criteria such as purpose, commercial / free / open-source / shareware, technology used and so forth. Tools are classified in this syllabus according to the testing activities that they support.

Some tools clearly support one activity; others may support more than one activity, but are classified under the activity with which they are most closely associated. Tools from a single provider, especially those that have been designed to work together, may be bundled into one package.

Some types of test tools can be intrusive, which means that they can affect the actual outcome of the test. For example, the actual timing may be different due to the extra instructions that are executed by the tool, or you may get a different measure of code coverage. The consequence of intrusive tools is called the probe effect.

Some tools offer support more appropriate for developers (e.g., tools that are used during component and component integration testing). Such tools are marked with “(D)” in the list below.

6.1.3   Tool Support for Management of Testing and Tests (K1)

Management tools apply to all test activities over the entire software life cycle.

Test Management Tools

These tools provide interfaces for executing tests, tracking defects and managing requirements, along with support for quantitative analysis and reporting of the test objects. They also support tracing the test objects to requirement specifications and might have an independent version control capability or an interface to an external one.

Requirements Management Tools

These tools store requirement statements, store the attributes for the requirements (including priority), provide unique identifiers and support tracing the requirements to individual tests. These tools may also help with identifying inconsistent or missing requirements.

Incident Management Tools (Defect Tracking Tools)

These tools store and manage incident reports, i.e., defects, failures, change requests or perceived problems and anomalies, and help in managing the life cycle of incidents, optionally with support for statistical analysis.

Configuration Management Tools

Although not strictly test tools, these are necessary for storage and version management of testware and related software, especially when configuring more than one hardware/software environment in terms of operating system versions, compilers, browsers, etc.

6.1.4   Tool Support for Static Testing (K1)

Static testing tools provide a cost-effective way of finding more defects at an earlier stage in the development process.

Review Tools

These tools assist with review processes, checklists, review guidelines, and are used to store and communicate review comments and report on defects and effort. They can be of further help by providing aid for online reviews for large or geographically dispersed teams.

Static Analysis Tools (D)

These tools help developers and testers find defects prior to dynamic testing by providing support for enforcing coding standards (including secure coding) and analysis of structures and dependencies. They can also help in planning or risk analysis by providing metrics for the code (e.g., complexity).

Modeling Tools (D)

These tools are used to validate software models (e.g., physical data model (PDM) for a relational database), by enumerating inconsistencies and finding defects. These tools can often aid in generating some test cases based on the model.

6.1.5   Tool Support for Test Specification (K1)

Test Design Tools

These tools are used to generate test inputs or executable tests and/or test oracles from requirements, graphical user interfaces, design models (state, data or object) or code.

Test Data Preparation Tools

Test data preparation tools manipulate databases, files or data transmissions to set up test data to be used during the execution of tests, to ensure security through data anonymity.

6.1.6   Tool Support for Test Execution and Logging (K1)

Test Execution Tools

These tools enable tests to be executed automatically, or semi-automatically, using stored inputs and expected outcomes, through the use of a scripting language, and usually provide a test log for each test run. They can also be used to record tests, and usually support scripting languages or GUI-based configuration for parameterization of data and other customization in the tests.

Test Harness/Unit Test Framework Tools (D)

A unit test harness or framework facilitates the testing of components or parts of a system by simulating the environment in which that test object will run, through the provision of mock objects as stubs or drivers.
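A minimal sketch of this idea, using Python's standard unittest framework: the stub gateway simulates the environment the component normally depends on. `PaymentService` and `StubGateway` are invented for illustration and belong to no real library.

```python
# A component tested in isolation: a stub replaces its real dependency.
import unittest

class StubGateway:
    """Stand-in for a real payment gateway the component depends on."""
    def charge(self, amount):
        return "approved"            # canned response, no network call

class PaymentService:
    """The test object; it would normally talk to a live gateway."""
    def __init__(self, gateway):
        self.gateway = gateway

    def pay(self, amount):
        if amount <= 0:
            return "rejected"
        return self.gateway.charge(amount)

class PaymentServiceTest(unittest.TestCase):
    def test_positive_amount_is_charged(self):
        service = PaymentService(StubGateway())
        self.assertEqual(service.pay(10), "approved")

    def test_non_positive_amount_is_rejected(self):
        service = PaymentService(StubGateway())
        self.assertEqual(service.pay(0), "rejected")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(PaymentServiceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the stub answers instantly and deterministically, the component can be tested long before the real environment exists.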

Test Comparators

Test comparators determine differences between files, databases or test results. Test execution tools typically include dynamic comparators, but post-execution comparison may be done by a separate comparison tool. A test comparator may use a test oracle, especially if it is automated.
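A post-execution comparator can be as simple as a line-by-line diff of an actual result against an expected ("golden") result produced by an oracle. This sketch and its file contents are invented for illustration.

```python
# Hedged sketch of a post-execution test comparator.

def compare_results(expected_lines, actual_lines):
    """Return a list of difference descriptions; empty means a match."""
    diffs = []
    for i, (exp, act) in enumerate(zip(expected_lines, actual_lines), start=1):
        if exp != act:
            diffs.append(f"line {i}: expected {exp!r}, got {act!r}")
    if len(expected_lines) != len(actual_lines):
        diffs.append(f"line count differs: expected {len(expected_lines)}, "
                     f"got {len(actual_lines)}")
    return diffs

expected = ["order accepted", "total=100"]   # what the oracle says should happen
actual   = ["order accepted", "total=101"]   # what the test run produced
print(compare_results(expected, actual))     # reports one difference on line 2
```

Dynamic comparators inside test execution tools apply the same idea to values checked while the test is still running.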

Coverage Measurement Tools (D)

These tools, through intrusive or non-intrusive means, measure the percentage of specific types of code structures that have been exercised (e.g., statements, branches or decisions, and module or function calls) by a set of tests.
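An intrusive statement coverage measurement can be sketched with CPython's `sys.settrace` hook, which fires for every executed line; that extra bookkeeping per line is exactly the kind of instrumentation behind the probe effect mentioned in Section 6.1.2. The `triage` function is a toy test object invented for the example.

```python
# Intrusive statement coverage via a line tracer (CPython-specific).
import sys

def triage(severity):                  # toy test object with two branches
    if severity > 7:
        return "critical"
    return "normal"

executed = set()                       # line numbers seen during the runs

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "triage":
        executed.add(frame.f_lineno)   # this bookkeeping IS the probe effect
    return tracer

sys.settrace(tracer)
triage(9)                              # first test exercises only "critical"
sys.settrace(None)
lines_after_one_test = len(executed)

sys.settrace(tracer)
triage(3)                              # second test also reaches "normal"
sys.settrace(None)
lines_after_two_tests = len(executed)

print(lines_after_one_test, lines_after_two_tests)
```

Production coverage tools map the executed lines onto the total set of statements, branches or calls to report a percentage; non-intrusive alternatives use hardware or binary instrumentation instead of a per-line hook.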

Security Testing Tools

These tools are used to evaluate the security characteristics of software. This includes evaluating the ability of the software to protect data confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Security tools are mostly focused on a particular technology, platform, and purpose.

6.1.7   Tool Support for Performance and Monitoring (K1)

Dynamic Analysis Tools (D)

Dynamic analysis tools find defects that are evident only when software is executing, such as time dependencies or memory leaks. They are typically used in component and component integration testing, and when testing middleware.

Performance Testing/Load Testing/Stress Testing Tools

Performance testing tools monitor and report on how a system behaves under a variety of simulated usage conditions in terms of number of concurrent users, their ramp-up pattern, frequency and relative percentage of transactions. The simulation of load is achieved by means of creating virtual users carrying out a selected set of transactions, spread across various test machines commonly known as load generators.
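The virtual-user idea can be sketched with threads: each thread plays one user running a fixed set of transactions against the system under test. Here the "system" is just a counter behind a lock, and all names and numbers are illustrative; real load generators distribute such users across machines and control the ramp-up pattern.

```python
# Toy sketch of load generation with concurrent virtual users.
import threading

lock = threading.Lock()
completed = []                       # transaction log shared by all users

def system_under_test(user, tx):
    with lock:                       # stand-in for a server handling a request
        completed.append((user, tx))

def virtual_user(user_id, transactions):
    for tx in transactions:
        system_under_test(user_id, tx)

users = [threading.Thread(target=virtual_user,
                          args=(u, ["login", "search", "logout"]))
         for u in range(20)]         # 20 concurrent virtual users
for t in users:
    t.start()                        # ramp-up: here, all users start at once
for t in users:
    t.join()

print(len(completed))                # 20 users x 3 transactions = 60
```

A real tool would additionally measure per-transaction response times under this load, which is the data the monitoring and reporting features summarize.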

Monitoring Tools

Monitoring tools continuously analyze, verify and report on usage of specific system resources, and give warnings of possible service problems.

6.1.8   Tool Support for Specific Testing Needs (K1)

Data Quality Assessment

Data is at the center of some projects, such as data conversion/migration projects and applications like data warehouses, and its attributes can vary in terms of criticality and volume. In such contexts, tools need to be employed for data quality assessment to review and verify the data conversion and migration rules, to ensure that the processed data is correct, complete and complies with a pre-defined context-specific standard.

Other testing tools exist for usability testing.

6.2   Effective Use of Tools: Potential Benefits and Risks (K2)

Terms

Data-driven testing, keyword-driven testing, scripting language

6.2.1   Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)

Simply purchasing or leasing a tool does not guarantee success with that tool. Each type of tool may require additional effort to achieve real and lasting benefits. There are potential benefits and opportunities with the use of tools in testing, but there are also risks.

Potential benefits of using tools include:

o    Repetitive work is reduced (e.g., running regression tests, re-entering the same test data, and checking against coding standards)

o    Greater consistency and repeatability (e.g., tests executed by a tool in the same order with the same frequency, and tests derived from requirements)

o    Objective assessment (e.g., static measures, coverage)

o    Ease of access to information about tests or testing (e.g., statistics and graphs about test progress, incident rates and performance)

Risks of using tools include:

o    Unrealistic expectations for the tool (including functionality and ease of use)

o    Underestimating the time, cost and effort for the initial introduction of a tool (including training and external expertise)

o    Underestimating the time and effort needed to achieve significant and continuing benefits from the tool (including the need for changes in the testing process and continuous improvement of the way the tool is used)

o    Underestimating the effort required to maintain the test assets generated by the tool

o    Over-reliance on the tool (replacement for test design, or use of automated testing where manual testing would be better)

o    Neglecting version control of test assets within the tool

o    Neglecting relationships and interoperability issues between critical tools, such as requirements management tools, version control tools, incident management tools, defect tracking tools and tools from multiple vendors

o    Risk of tool vendor going out of business, retiring the tool, or selling the tool to a different vendor

o    Poor response from vendor for support, upgrades, and defect fixes

o    Risk of suspension of an open-source / free tool project

o    Unforeseen risks, such as the inability to support a new platform

6.2.2   Special Considerations for Some Types of Tools (K1)

Test Execution Tools

Test execution tools execute test objects using automated test scripts. This type of tool often requires significant effort in order to achieve significant benefits.

Capturing tests by recording the actions of a manual tester seems attractive, but this approach does not scale to large numbers of automated test scripts. A captured script is a linear representation with specific data and actions as part of each script. This type of script may be unstable when unexpected events occur.

A data-driven testing approach separates out the test inputs (the data), usually into a spreadsheet, and uses a more generic test script that can read the input data and execute the same test script with different data. Testers who are not familiar with the scripting language can then create the test data for these predefined scripts.
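A minimal sketch of this separation, assuming an invented login function as the test object: the generic script is written once, and the rows a non-programming tester would maintain in a spreadsheet are represented here as CSV text.

```python
# Data-driven testing: one generic script, many data rows.
import csv, io

# The test data a tester would maintain (stand-in for a spreadsheet).
data_sheet = """username,password,expected
alice,correct-horse,granted
alice,wrong,denied
,correct-horse,denied
"""

def login(username, password):       # toy test object, invented for the example
    if username == "alice" and password == "correct-horse":
        return "granted"
    return "denied"

def run_data_driven(sheet_text):
    """The generic script: read each row, run the same test with its data."""
    results = []
    for row in csv.DictReader(io.StringIO(sheet_text)):
        actual = login(row["username"], row["password"])
        results.append((row, "PASS" if actual == row["expected"] else "FAIL"))
    return results

for row, verdict in run_data_driven(data_sheet):
    print(verdict, dict(row))
```

Adding a new test case now means adding a row of data, with no change to the script itself.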

Other data-driven techniques exist in which, instead of hard-coded data combinations placed in a spreadsheet, data is generated at run time using algorithms based on configurable parameters and supplied to the application. For example, a tool may use an algorithm that generates a random user ID, and for repeatability in the pattern, a seed is employed to control the randomness.
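The seed mechanism is simple to demonstrate with the standard library's random module; the user-ID format here is invented for illustration.

```python
# Run-time generated test data with a fixed seed for repeatability.
import random

def generate_user_ids(count, seed):
    rng = random.Random(seed)        # seeded generator: same seed, same IDs
    return [f"user-{rng.randint(100000, 999999)}" for _ in range(count)]

first_run  = generate_user_ids(5, seed=42)
second_run = generate_user_ids(5, seed=42)
print(first_run == second_run)       # True: the seed makes the data repeatable
```

Storing the seed with the test log lets a failing run be reproduced exactly, even though the data looks random.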

In a keyword-driven testing approach, the spreadsheet contains keywords describing the actions to be taken (also called action words), and test data. Testers (even if they are not familiar with the scripting language) can then define tests using the keywords, which can be tailored to the application being tested.
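A sketch of the interpreter behind such an approach: each spreadsheet row carries an action word plus test data, and a dispatch table maps keywords to implementing functions. The keywords and the toy application state are invented for this example.

```python
# Keyword-driven testing: action words dispatched to implementations.

session = {"logged_in": False, "cart": []}   # toy application state

def do_login(user):
    session["logged_in"] = True

def do_add_to_cart(item):
    session["cart"].append(item)

def do_logout(_):
    session["logged_in"] = False

KEYWORDS = {                       # keyword (action word) -> implementation
    "login": do_login,
    "add_to_cart": do_add_to_cart,
    "logout": do_logout,
}

# What a tester would write in the spreadsheet: (action word, test data).
test_steps = [
    ("login", "alice"),
    ("add_to_cart", "book"),
    ("add_to_cart", "pen"),
    ("logout", ""),
]

for keyword, data in test_steps:
    KEYWORDS[keyword](data)        # the interpreter executes each row

print(session)                     # cart holds both items, user logged out
```

Only the keyword implementations require scripting expertise; the rows themselves can be written and maintained by testers.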

Technical expertise in the scripting language is needed for all approaches (either by testers or by specialists in test automation).

Regardless of the scripting technique used, the expected results for each test need to be stored for later comparison.

Static Analysis Tools

Static analysis tools applied to source code can enforce coding standards, but if applied to existing code may generate a large quantity of messages. Warning messages do not stop the code from being translated into an executable program, but ideally should be addressed so that maintenance of the code is easier in the future. A gradual implementation of the analysis tool, with initial filters to exclude some messages, is an effective approach.

Test Management Tools

Test management tools need to interface with other tools or spreadsheets in order to produce useful information in a format that fits the needs of the organization.

6.3   Introducing a Tool into an Organization (K1)

Terms

No specific terms.

Background

The main considerations in selecting a tool for an organization include:

o    Assessment of organizational maturity, strengths and weaknesses, and identification of opportunities for an improved test process supported by tools

o    Evaluation against clear requirements and objective criteria

o    A proof-of-concept, by using a test tool during the evaluation phase to establish whether it performs effectively with the software under test and within the current infrastructure, or to identify changes needed to that infrastructure to effectively use the tool

o    Evaluation of the vendor (including training, support and commercial aspects) or of service support suppliers in the case of non-commercial tools

o    Identification of internal requirements for coaching and mentoring in the use of the tool

o    Evaluation of training needs considering the current test team’s test automation skills

o    Estimation of a cost-benefit ratio based on a concrete business case

 

Introducing the selected tool into an organization starts with a pilot project, which has the following objectives:

o    Learn more detail about the tool

o    Evaluate how the tool fits with existing processes and practices, and determine what would need to change

o    Decide on standard ways of using, managing, storing and maintaining the tool and the test assets (e.g., deciding on naming conventions for files and tests, creating libraries and defining the modularity of test suites)

o    Assess whether the benefits will be achieved at reasonable cost

Success factors for the deployment of the tool within an organization include:

o    Rolling out the tool to the rest of the organization incrementally

o    Adapting and improving processes to fit with the use of the tool

o    Providing training and coaching/mentoring for new users

o    Defining usage guidelines

o    Implementing a way to gather usage information from the actual use

o    Monitoring tool use and benefits

o    Providing support for the test team for a given tool

o    Gathering lessons learned from all teams