Testing Conventional Applications
Pressman – 7th edition
MACFAST
Overview
• Here are the generic steps followed to carry out any type of
Black Box Testing:
– Initially, the requirements and specifications of the system are examined.
– The tester chooses valid inputs (positive test scenarios) to check whether the
SUT processes them correctly. Some invalid inputs (negative test
scenarios) are also chosen to verify that the SUT is able to detect them.
– The tester determines the expected outputs for all those inputs.
– The tester constructs test cases with the selected inputs.
– The test cases are executed.
– The tester compares the actual outputs with the expected
outputs.
– Defects, if any, are fixed and re-tested.
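The steps above can be sketched in a few lines of Python. The function `sut` and the chosen inputs are illustrative assumptions, not taken from the text; any small function serves as a stand-in system under test.

```python
# A minimal sketch of the generic black-box steps, assuming a hypothetical
# SUT: a function that returns the absolute value of an integer.

def sut(x):
    """Stand-in system under test."""
    return abs(x)

# Valid inputs (positive scenarios) and invalid inputs (negative scenarios),
# each paired with the expected output determined in advance.
test_cases = [
    (5, 5),               # valid input
    (-3, 3),              # valid input
    ("oops", TypeError),  # invalid input: the SUT should detect it
]

# Execute the test cases and compare actual vs. expected outputs.
results = []
for given, expected in test_cases:
    try:
        actual = sut(given)
    except Exception as exc:
        actual = type(exc)
    results.append("PASS" if actual == expected else "FAIL")

print(results)
```

Note that the invalid input is expected to be *detected* (here, via an exception), mirroring the negative-scenario step above.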
LEVELS APPLICABLE TO
• Black Box Testing method is applicable to all
levels of the software testing process:
• Unit Testing
• Integration Testing
• System Testing
• Acceptance Testing
• The higher the level, the bigger and more complex the black
box under test.
• While White Box Testing (Unit Testing) validates the internal structure and
working of your software code, the main focus of Black Box Testing is on the
validation of your functional requirements.
• To conduct White Box Testing, knowledge of the underlying programming
language is essential. Present-day software systems use a variety of
programming languages and technologies, and it is not possible to know all of
them. Black Box Testing abstracts away from the code and focuses the testing
effort on the behaviour of the software system.
• Also, software systems are not developed as a single chunk; development
is broken down into different modules. Black Box Testing facilitates testing the
communication amongst modules (Integration Testing).
• When you push code fixes into a live software system, a complete system
check (black box regression tests) becomes essential.
• White Box Testing has its own merits and helps detect many internal errors
which may degrade system performance.
Equivalence Partitioning
• Black-box technique that divides the input domain into classes of data
from which test cases can be derived
• An ideal test case uncovers a class of errors that might require many
arbitrary test cases to be executed before a general error is observed
• Equivalence class guidelines:
• If input condition specifies a range, one valid and two invalid equivalence
classes are defined
• If an input condition requires a specific value, one valid and two invalid
equivalence classes are defined
• If an input condition specifies a member of a set, one valid and one
invalid equivalence class is defined
• If an input condition is Boolean, one valid and one invalid equivalence
class is defined
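The range guideline above can be made concrete with a small example. The input condition "age must be in the range 18..60" and the function `accept_age` are illustrative assumptions; the guideline yields one valid and two invalid equivalence classes, each represented by a single test value.

```python
# Applying the range guideline: input condition "18 <= age <= 60"
# gives one valid class and two invalid classes (below and above the range).

def accept_age(age):
    """Hypothetical SUT: accepts only ages within the valid range."""
    return 18 <= age <= 60

# One representative test value per equivalence class.
equivalence_classes = [
    ("valid: 18 <= age <= 60", 35, True),
    ("invalid: age < 18",      10, False),
    ("invalid: age > 60",      75, False),
]

for name, value, expected in equivalence_classes:
    assert accept_age(value) == expected, name

print(len(equivalence_classes), "test cases cover the range condition")
```

Three test cases exercise the whole range condition; any other value inside a class is assumed to behave like its representative.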
Equivalence Class Partitioning
• Divide the input space into equivalence classes
• If the software works for a test case from a class, then
it is likely to work for all test cases in that class
• Can reduce the set of test cases if such equivalence
classes can be identified
• Getting ideal equivalence classes is impossible
• Approximate them by identifying classes for which
different behavior is specified
Equivalence class partitioning…
• Rationale: the specification requires the same
behavior for all elements in a class
• The software is likely to be constructed such that it
either fails for all of them or for none.
• E.g., if a function was not designed for negative
numbers, then it will fail for all negative
numbers
• For robustness, equivalence classes should also be
formed for invalid inputs
Equivalence class…
• Once equivalence classes are selected for each of the
inputs, test cases have to be selected
– Select each test case to cover as many valid equivalence
classes as possible
– Or, have a test case that covers at most one valid
class for each input
– Plus a separate test case for each invalid class
Example
• Consider a program that takes 2 inputs – a
string s and an integer n
• The program determines the n most frequent
characters in s
• The tester believes that the programmer may deal
with different types of characters separately
• A set of valid and invalid equivalence classes is
given
Example…
• Test cases (i.e., pairs s, n) with the first method
– s: a string of length < N containing lower case, upper case, numbers, and
special characters, and n = 5
– Plus test cases for each of the invalid equivalence classes
– Total test cases: 1 + 3 = 4
• With the second approach
– A separate string for each type of character (i.e., a string of numbers,
one of lower case, …) + invalid cases
– Total test cases: 5 + 2 = 7
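The example program and the two valid test-case designs can be sketched as follows. The implementation of `most_frequent` and the value of N (the maximum string length, which the text leaves unspecified) are assumptions for illustration.

```python
from collections import Counter

# Sketch of the example program: given a string s and an integer n,
# return the n most frequent characters. N = 100 is an assumed maximum
# string length; the text does not give its value.
N = 100

def most_frequent(s, n):
    assert len(s) < N
    return [ch for ch, _ in Counter(s).most_common(n)]

# First method: one test case mixing lower case, upper case, numbers,
# and special characters in a single string, with n = 5.
print(most_frequent("aaaBB99!!", 5))

# Second approach: a separate string for each type of character
# (digits only, lower case only, and so on), exercising each class alone.
for s in ["aabbc", "AABBC", "11223", "!!??."]:
    print(most_frequent(s, 2))
```

The second approach is the safer choice here precisely because the tester suspects different character types are handled by separate code paths.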
Boundary Value Analysis
• A greater number of errors occurs at the boundaries of the
input domain than in the “center.”
• Boundary value analysis leads to a selection of test cases that
exercise bounding values.
• Programs often fail on special values
• These input values often lie on the edge/boundary of equivalence
classes
• Test cases that have boundary values have a high yield
• These are also called extreme cases
• A boundary value test case is a set of input data that lies on the edge of
an equivalence class of input/output
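For an integer range, the boundary values are the edges of the equivalence class and their immediate neighbours. The range 1..100 below is an assumed example, not from the text.

```python
# Boundary value analysis for a hypothetical integer range [lo, hi]:
# test at each boundary and immediately on either side of it.

def bva_values(lo, hi):
    """Return boundary-value test inputs for an integer range [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

print(bva_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

The two values just outside the range (0 and 101 here) double as negative test cases for the adjacent invalid classes.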
Orthogonal Array Testing
• Popularized by G. Taguchi.
• An orthogonal array is a type of experiment where the columns for the independent
variables are “orthogonal” to one another.
• Benefits:
1. Conclusions are valid over the entire region spanned by the control factors
and their settings
2. Large saving in the experimental effort
3. Analysis is easy
• To define an orthogonal array, one must identify:
1. The number of factors to be studied
2. The levels for each factor
3. The specific 2-factor interactions to be estimated
4. The special difficulties that would be encountered in running the
experiment
OAT
• Orthogonal array testing can be applied to problems
in which the input domain is relatively small but too
large to accommodate exhaustive testing.
• The orthogonal array is a systematic, statistical way of
software testing & is particularly useful in finding
region faults—an error category associated with faulty
logic within a software component.
• The net effect of organizing the experiment in such
treatments is that the same piece of information is
gathered in the minimum number of experiments
• Suppose three inputs x, y, and z each have 3 values
associated with them, giving 27 different test
cases. If one input item at a time is varied
in sequence along each input axis, the result
is relatively limited coverage of the input
domain (left-hand side of the figure)
• When orthogonal array testing occurs, an
L9(3^3) orthogonal array of test cases is
created. The L9 orthogonal array has a
“balancing property”: test cases (represented
by dark dots in the figure) are “dispersed
uniformly throughout the test domain,” as
illustrated on the right-hand side. Test coverage across the
input domain is more complete.
• See the example in the accompanying document
• E.g., the send function of a fax application, with four
parameters of three values each, would require 3^4 = 81 test
cases for exhaustive testing
• Runs: the number of rows in the array. This directly translates to the number of test cases
that will be generated by the OATS technique.
• Factors: the number of columns in an array. This directly translates to the maximum
number of variables that can be handled by this array.
• Levels: the maximum number of values that can be taken on by any single factor. An
orthogonal array will contain values from 0 to Levels-1.
OAT
• An orthogonal array is an array of values in which
each column represents a factor.
• A factor is
– a variable in an experiment; in our case, a
specific class family in the software system
– or the states of that class family
• Each variable takes on a certain set of values called
levels; in our case, specific classes or states
of those classes
• Each cell is an instance of the class, that is, a
specific object or a particular state
What are orthogonal arrays?
• Consider the numbers 1 and 2 – how many pair
combinations are there?
• An orthogonal array is a 2-dimensional array of
numbers with the following property: choose any two
columns of the array; all the pairwise combinations
of values will occur in those two columns.
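The pairwise property is easy to verify mechanically. Below is the standard L9 array (9 runs, up to 4 three-level factors; the x, y, z example in the text uses three of its columns), with a check that every pair of columns contains all nine level combinations.

```python
from itertools import combinations, product

# The standard L9 orthogonal array: 9 runs, 4 factors, 3 levels (0..2).
# Balancing property: any two columns together contain every pair of
# levels exactly once.
L9 = [
    (0, 0, 0, 0),
    (0, 1, 1, 1),
    (0, 2, 2, 2),
    (1, 0, 1, 2),
    (1, 1, 2, 0),
    (1, 2, 0, 1),
    (2, 0, 2, 1),
    (2, 1, 0, 2),
    (2, 2, 1, 0),
]

# Verify the pairwise property for every pair of columns.
for c1, c2 in combinations(range(4), 2):
    pairs = {(row[c1], row[c2]) for row in L9}
    assert pairs == set(product(range(3), repeat=2))

print("9 runs cover all pairwise combinations of 4 three-level factors")
```

Nine runs thus replace the 81 needed for exhaustive coverage of four three-level factors, at the cost of detecting only single-mode and pairwise (double-mode) faults reliably.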
Example
The failure occurred because the horizontal velocity exceeded the maximum value for a
16-bit unsigned integer when it was converted from its signed 64-bit representation.
This generated an exception in the code which was not caught, and which therefore
propagated up through the processor and ultimately caused the failure.
The failure was thus entirely due to a single line of code.
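The failure mode described above can be reproduced in miniature: an out-of-range conversion to a 16-bit unsigned integer raises an exception which, if uncaught, propagates upward. The velocity value below is a made-up illustration, not the actual flight value.

```python
import struct

# Sketch of the failure mode: converting a signed 64-bit value to a
# 16-bit unsigned integer without a range check. 70_000 exceeds the
# 16-bit unsigned maximum of 65535.
horizontal_velocity = 70_000

try:
    struct.pack("<H", horizontal_velocity)  # "<H" = unsigned 16-bit integer
    caught = False
except struct.error:
    # Without this handler the exception would propagate up the call
    # stack, as it did in the failure described above.
    caught = True

print("overflow detected" if caught else "converted ok")
```

A range check (or a saturating conversion) before the single conversion line would have prevented the exception entirely.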
Errors – Casualties
• 1985–87: two patients died from overdoses of radiation
delivered by the Therac-25 accelerator – a control software
failure
• 1991, Gulf War: a Scud missile penetrated the Patriot
antimissile shield and struck Dhahran, Saudi Arabia
– 28 Americans were killed. The Israelis had found the error.
• 2003: the US Treasury generated cheques without
the name of the beneficiary
("Phased Array Tracking Radar to Intercept On Target" – the expansion of the PATRIOT acronym)