A. Organizational Approach
At Bitsoft, every project has the following
general team organization, although the project size may dictate
that a single individual handle several of the roles:
- Project lead.
- Design/Development lead and team members.
- Test lead and team members (part of the Verification and
Validation team, a pooled resource).
- QA member(s) (pooled from company resources and reporting
directly to the QA organization in the Company).
- Configuration Control member (again from pooled resources).
B. Development/Testing Strategy
Based on the nature
of the project, an overall product test plan is prepared. This
addresses both management issues (staffing, time schedules) and
technical issues (tools, standards, and guidelines).
Product/package level testing
With the Software Requirements
Specifications (SRS), the design team proceeds with the design
(top level, component level, and module level). With the same
SRS, the test team prepares Product Level Test Procedures and
test cases. Essentially, this is black box testing. Testing
attempts to find errors in the following categories:
- Incorrect or missing functions.
- Interface errors (either hardware interface or software
interface) or external database access errors.
- Performance errors.
- Initialization and termination errors.
This testing is automated as much as possible. The test team
writes test drivers and test stubs, prepares input data, computes
expected output, writes output comparators, etc. If required, a
database is used.
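The automated test-driver arrangement described above can be sketched as follows. Everything here is illustrative: `compute_discount` stands in for any product-level function, and the inputs and expected outputs are invented for the example, not taken from an actual Bitsoft project.

```python
def compute_discount(order_total):
    """System under test (illustrative): 10% off orders of 100 or more."""
    return order_total * 0.9 if order_total >= 100 else order_total

# Input data and precomputed expected outputs, as the test team prepares them.
test_cases = [
    (50, 50),      # below threshold: no discount
    (100, 90.0),   # at threshold: discount applies
    (200, 180.0),  # above threshold
]

def comparator(actual, expected, tolerance=1e-9):
    """Output comparator: flags any deviation from the expected output."""
    return abs(actual - expected) <= tolerance

def run_driver():
    """Test driver: runs every case and collects the failures."""
    failures = []
    for inp, expected in test_cases:
        actual = compute_discount(inp)
        if not comparator(actual, expected):
            failures.append((inp, expected, actual))
    return failures

print(run_driver())  # an empty list means all product-level cases passed
```

The comparator is kept separate from the driver so that tolerance rules (floating point, formatting) can change without touching the test cases.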
Once the component level design documents
are ready, the test team prepares component level test
procedures and test cases. Essentially, this is also black box
testing. Testing attempts to find errors mentioned in product
level testing (but at component level) as well as errors in
the following categories:
- Errors in data structures.
- Interface errors between components.
- Access errors (addressing, shared, public, private,
dependency, etc.).
- Memory leaks.
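A memory-leak check at component level can be sketched with the standard-library tracemalloc module. The "component" here is a made-up example (a cache that retains every request), used only to show the growth-measurement idea:

```python
import tracemalloc

leaky_cache = []

def process_request(payload):
    # Bug under test (deliberate): every request is retained forever.
    leaky_cache.append(payload * 1000)

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(1000):
    process_request("x")
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

grew = after - before
# Memory grows roughly linearly with the number of requests: a leak symptom.
print(grew > 100_000)
```

Driving the component through many identical requests and asserting that memory usage stays bounded is one simple way to automate this category of test.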
Various black box testing techniques like Equivalence
Partitioning, Boundary Value Analysis, Cause-Effect Graphing,
Random testing, Error Guessing, and User-profile testing are
used. Again, this is dynamic analysis. The testing is completely
automated, with the test team writing test drivers, test cases,
test stubs, input data, expected outputs, and output comparators.
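Two of the techniques named above, Equivalence Partitioning and Boundary Value Analysis, can be illustrated on a hypothetical input field that accepts integer quantities from 1 to 100 (the function and range are assumptions for the example):

```python
def is_valid_quantity(q):
    """System under test (illustrative): accepts integers in [1, 100]."""
    return isinstance(q, int) and 1 <= q <= 100

# Equivalence Partitioning: one representative value per equivalence class.
partitions = {
    "below_range": 0,    # invalid class
    "in_range": 50,      # valid class
    "above_range": 150,  # invalid class
}
assert not is_valid_quantity(partitions["below_range"])
assert is_valid_quantity(partitions["in_range"])
assert not is_valid_quantity(partitions["above_range"])

# Boundary Value Analysis: just outside, on, and just inside each boundary.
expected = {0: False, 1: True, 2: True, 99: True, 100: True, 101: False}
for value, valid in expected.items():
    assert is_valid_quantity(value) == valid, value
```

Partitioning keeps the case count small; boundary values catch the off-by-one errors that representatives from the middle of a class tend to miss.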
Module or Unit level testing
(In our experience, this step is very
critical and is often ignored. Most of the time, the programmer
superficially does it herself. Good structural testing always
pays off.)
- This testing is done once
the module/unit coding is complete. Both the design and test
teams take part in this testing, though the primary
responsibility lies with the test team.
- In this step, both static
and dynamic analysis of the code are carried out.
- Static analysis of the
code is done first. Some of the methods we use for static
analysis are:
- Data flow analysis (to
identify undefined or uninitialized variables,
variables declared but not used, redundant operations,
data reads from locations not previously written into, etc.)
- Control flow analysis
(to identify multiple entries and exits, redundant paths
and unused code, unreachable code, etc.)
- Information flow
analysis (dependencies between input and output
variables, use of unused data when forming output
values, use of incorrect variables to compose output
values, etc.)
- Complexity Metric
Analysis (McCabe's complexity metric). We have
experience in using McCabe's testing tool.
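The flavor of the static-analysis steps above can be sketched with Python's standard `ast` module: a crude data-flow check for variables that are assigned but never read, and a cyclomatic-complexity estimate (McCabe: decision points + 1). This is a toy sketch, not a substitute for a real static-analysis tool; the analyzed function is invented for the example.

```python
import ast

SOURCE = """
def classify(n):
    unused = 42          # assigned but never read
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""

def unused_assignments(tree):
    """Data-flow check: names that are stored to but never loaded."""
    assigned, loaded = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)
            elif isinstance(node.ctx, ast.Load):
                loaded.add(node.id)
    return assigned - loaded

def cyclomatic_complexity(tree):
    """McCabe estimate: count decision points, then add one."""
    decisions = sum(isinstance(n, (ast.If, ast.For, ast.While, ast.BoolOp))
                    for n in ast.walk(tree))
    return decisions + 1

tree = ast.parse(SOURCE)
print(unused_assignments(tree))     # {'unused'}
print(cyclomatic_complexity(tree))  # 3: two if-decisions + 1
```

Control-flow checks (unreachable code, multiple exits) follow the same pattern of walking the syntax tree, just looking at different node types.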
- After static analysis, the
team performs dynamic analysis. Here "Structural" or "White
Box" testing is done on the unit code. This involves a
reasonable amount of complete path testing, branch or
decision testing, condition testing, and loop testing.
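A small white-box example shows how such cases are chosen from the code's structure rather than from the specification. The function is illustrative; the point is that every branch, each condition outcome, and the zero/one/many loop iteration counts are all exercised:

```python
def sum_positive(values, limit):
    """Sum positive entries, skipping additions once total exceeds limit."""
    total = 0
    for v in values:                  # loop: test 0, 1, and many iterations
        if v > 0 and total <= limit:  # test each condition outcome separately
            total += v
    return total

# Loop testing: empty, single-element, and multi-element inputs.
assert sum_positive([], 10) == 0
assert sum_positive([5], 10) == 5
assert sum_positive([1, 2, 3], 10) == 6

# Branch/condition testing: make `v > 0` false, then `total <= limit` false.
assert sum_positive([-1, 4], 10) == 4
assert sum_positive([8, 8, 8], 10) == 16  # third add skipped: total > limit
```

Black-box cases drawn only from the specification could easily miss the `total <= limit` guard; structural testing finds such paths because the test designer reads the code.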