Software Testing


Reference: Software Engineering, Ian Sommerville, 6th edition, Chapter 20. CMSC 345, Version 4/03


Topics Covered
• The testing process
• Defect testing
  » Black box testing (equivalence partitions)
  » White box testing (equivalence partitions, path testing)
• Integration testing
  » Top-down, bottom-up, sandwich, thread
  » Interface testing
  » Stress testing
• Object-oriented testing


Testing Goal
• The goal of testing is to discover defects in programs.
• A successful test is a test that causes a program to behave in an anomalous way.
• Tests show the presence, not the absence, of defects.
• Test planning should be continuous throughout the software development process.


The Testing Process
• Component testing
  » Testing of individual program components (e.g., functions, methods, or classes)
  » Usually the responsibility of the component developer (except sometimes for critical systems)
• Integration and system testing
  » Testing of groups of components integrated to create a subsystem or the entire system
  » Usually the responsibility of an independent testing team
  » Tests are based on the system specification
• Acceptance testing
  » Run in the presence of the customer, or by the customer
  » Used to validate all system requirements


The Testing Process (cont'd)
• Test data – inputs that have been devised to conduct the particular test
• Expected output – recorded before the test is conducted
• Actual output – the output actually received
• Pass/fail criteria – criteria for deciding whether the test passed or failed when the actual results are compared to the expected results; determined before the test is conducted
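These four elements can be recorded per test case. A minimal sketch in Java, purely illustrative (the TestCase record and its run method are inventions, not part of the slides):

record TestCase<I, O>(String name, I input, O expectedOutput) {

    // Pass/fail criterion: the actual output must equal the expected output,
    // which is recorded before the test is run.
    boolean run(java.util.function.Function<I, O> unitUnderTest) {
        O actualOutput = unitUnderTest.apply(input);
        boolean passed = java.util.Objects.equals(expectedOutput, actualOutput);
        System.out.printf("%s: expected=%s, actual=%s -> %s%n",
                name, expectedOutput, actualOutput, passed ? "PASS" : "FAIL");
        return passed;
    }
}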


Black Box Testing
• An approach to testing where the program is considered as a "black box" (i.e., one cannot "see" inside of it)
• The program test cases are based on the system specification, not on the internal workings (e.g., algorithms) of the program
• Use equivalence partitions when conducting black box testing


Equivalence Partitioning
• Input data and output results often fall into different classes where all members of a class are related.
• Each of these classes is an equivalence partition, where the program behaves in an equivalent way for each class member.
• Test cases should be chosen from each partition, especially at the boundaries.


Equivalence Partitions Example 1
• Partition system inputs into groups (partitions) that should cause equivalent behavior. Include both valid and invalid inputs.
  » If the input is a 5-digit integer between 10,000 and 99,999, the equivalence partitions are:
    < 10,000        10,000 – 99,999        > 99,999
• Choose test cases at the boundaries of these partitions:
    9,999        10,000        99,999        100,000
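A minimal sketch of these boundary tests in Java; the isValid method is a hypothetical unit under test invented to make the partitions concrete:

public class FiveDigitInputTest {

    // Hypothetical unit under test: accepts 5-digit integers in [10000, 99999].
    static boolean isValid(int n) {
        return n >= 10_000 && n <= 99_999;
    }

    public static void main(String[] args) {
        // The four boundary values plus one value from the middle partition.
        int[] inputs     = { 9_999, 10_000, 50_000, 99_999, 100_000 };
        boolean[] expect = { false, true,   true,   true,   false   };

        for (int i = 0; i < inputs.length; i++) {
            boolean actual = isValid(inputs[i]);
            System.out.printf("isValid(%d): expected %b, actual %b -> %s%n",
                    inputs[i], expect[i], actual,
                    actual == expect[i] ? "PASS" : "FAIL");
        }
    }
}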

Equivalence Partitions Example 2 (figure not reproduced here)


Example - Search Routine

procedure Search (Key : ELEM; T : ELEM_ARRAY;
                  Found : in out BOOLEAN; L : in out ELEM_INDEX);

-- Pre-condition: the array has at least one element
--   T'FIRST <= T'LAST
-- Post-condition: the element is found and is referenced by L
--   (Found and T(L) = Key)
-- or the element is not in the array
--   (not Found and not (exists i, T'FIRST <= i <= T'LAST, T(i) = Key))


Search Routine Input Categories
• Inputs that conform to the pre-conditions
• Inputs where a pre-condition does not hold*
• Inputs where the key element is a member of the array
• Inputs where the key element is not a member of the array

*According to Sommerville
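A minimal sketch of test inputs drawn from these categories, written against a hypothetical Java equivalent of the Search routine (the search method below is an assumption, not code from the slides; it reports -1 rather than a Found flag when the key is absent):

public class SearchInputCategories {

    // Hypothetical stand-in for Search: returns the index of key in t,
    // or -1 if key is not present.
    static int search(int[] t, int key) {
        for (int i = 0; i < t.length; i++) {
            if (t[i] == key) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // Conforms to the pre-condition; key is a member (single-element array).
        System.out.println(search(new int[] { 17 }, 17));          // 0
        // Conforms to the pre-condition; key is not a member.
        System.out.println(search(new int[] { 17, 29, 21 }, 25));  // -1
        // Pre-condition does not hold: the array has no elements.
        System.out.println(search(new int[] {}, 17));              // -1
        // Key occurs more than once: which occurrence is reported?
        System.out.println(search(new int[] { 21, 23, 21 }, 21));  // 0
    }
}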

What Tests Should We Use? (test table not reproduced here)

Search Routine Test Cases (test table not reproduced here)


White Box Testing
• Sometimes called structural testing
• Derivation of test cases according to program structure (one can "see" inside)
• Objective is to exercise all program statements at least once
• Usually applied to relatively small program units such as functions or class methods

Binary Search (Java) (the slide shows the code as an image; not reproduced here)
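Since the slide's Java listing is only an image, here is a conventional iterative binary search as a stand-in; it is a sketch of the standard algorithm, not a copy of the code shown on the slide:

public class BinarySearch {

    // Returns the index of key in the sorted array elements, or -1 if absent.
    static int search(int[] elements, int key) {
        int bottom = 0;
        int top = elements.length - 1;

        while (bottom <= top) {
            int mid = (bottom + top) >>> 1;   // midpoint without int overflow
            if (elements[mid] == key) {
                return mid;                   // found
            } else if (elements[mid] < key) {
                bottom = mid + 1;             // continue in the upper half
            } else {
                top = mid - 1;                // continue in the lower half
            }
        }
        return -1;                            // not found
    }
}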

Any New Tests? (test table not reproduced here)

Binary Search Test Cases (test table not reproduced here)


Path Testing
• The objective of path testing is to ensure that the set of test cases is such that each path through the program is executed at least once.
• The starting point for path testing is a program flow graph.

Binary Search Flow Graph (figure not reproduced here)


Independent Paths
• 1, 2, 8, 9
• 1, 2, 3, 4, 6, 7, 2
• 1, 2, 3, 4, 5, 7, 2
• Test cases should be derived so that all of these paths are executed.
• A dynamic program analyser (e.g., a test coverage tool) may be used to check that paths have been executed.


Cyclomatic Complexity
• The minimum number of tests needed to test all statements equals the cyclomatic complexity.
• CC = number_of_edges – number_of_nodes + 2
• If there are no goto's, CC = number_of_decisions + 1
• Executing all independent paths does not mean that all combinations of paths are executed.
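As a small worked example of the "decisions + 1" form (the method and its values below are invented for illustration, not taken from the slides): the method has three decision points, so CC = 3 + 1 = 4, and at least four test cases are needed, one per independent path.

public class CyclomaticExample {

    // Three decision points (the three conditions), so CC = 3 + 1 = 4.
    static String classify(int n) {
        if (n < 0) {              // decision 1
            return "negative";
        } else if (n == 0) {      // decision 2
            return "zero";
        } else if (n < 10_000) {  // decision 3
            return "small";
        }
        return "large";
    }

    public static void main(String[] args) {
        // One test per independent path.
        System.out.println(classify(-5));      // negative
        System.out.println(classify(0));       // zero
        System.out.println(classify(42));      // small
        System.out.println(classify(50_000));  // large
    }
}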


Integration Testing
• Tests the complete system or subsystems composed of integrated components
• Integration testing is black box testing, with tests derived from the requirements and design specifications.
• Main difficulty is localizing errors.
• Incremental integration testing reduces this problem.

Incremental Integration Testing (figure not reproduced here)


Approaches to Integration Testing
• Top-down testing
  » Start with the high-level system and integrate from the top down, replacing individual components by stubs where appropriate (see the stub sketch after this list)
• Bottom-up testing
  » Integrate individual components in levels until the complete system is created
• In practice, most integration testing involves a combination of both of these strategies:
  » Sandwich testing (outside-in)
  » Thread testing
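A minimal sketch of the stub idea used in top-down testing; the PricingService interface, its stub, and CheckoutService are hypothetical names invented for illustration:

// Interface that the high-level component depends on (hypothetical).
interface PricingService {
    double priceFor(String itemCode);
}

// Stub standing in for the real, not-yet-integrated component:
// it returns a canned answer so the caller can be tested first.
class PricingServiceStub implements PricingService {
    @Override
    public double priceFor(String itemCode) {
        return 9.99;   // fixed, predictable response
    }
}

// High-level component under test, exercised against the stub.
class CheckoutService {
    private final PricingService pricing;

    CheckoutService(PricingService pricing) {
        this.pricing = pricing;
    }

    double total(String[] itemCodes) {
        double sum = 0.0;
        for (String code : itemCodes) {
            sum += pricing.priceFor(code);
        }
        return sum;
    }
}

A top-down test would construct new CheckoutService(new PricingServiceStub()) and check its totals; the stub is later replaced by the real pricing component as integration proceeds downward.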

Top-down Testing (figure not reproduced here)

Bottom-up Testing (figure not reproduced here)


Method “Pro’s”
• Top-down
• Bottom-up
• Sandwich
• Thread
(The advantages listed on the slide for each method are not reproduced here.)


Interface Testing
• Interface misuse
  » A calling component calls another component and makes an error in its use of its interface, e.g., parameters in the wrong order or of the wrong type
• Interface misunderstanding
  » A calling component embeds assumptions about the behavior of the called component which are incorrect


Interface Types
• Parameter interfaces
  » Data passed from one procedure to another
• Shared memory interfaces
  » A block of memory is shared between procedures
• Procedural interfaces
  » A sub-system encapsulates a set of procedures to be called by other sub-systems
• Message passing interfaces
  » Sub-systems request services from other sub-systems


Some Interface Testing Guidelines
• Design tests so that parameters to a called procedure are at the extreme ends of their ranges (boundaries).
• Always test pointer parameters with null pointers.
• Design tests that cause the component to fail.
• In shared memory systems, vary the order in which components are activated.
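A minimal sketch of the null-parameter and boundary-value guidelines, applied to the hypothetical search method sketched earlier (the behaviors checked here are assumptions about that sketch, not requirements from the slides):

public class InterfaceGuidelineTests {

    // Hypothetical component under test (same sketch as before):
    // returns the index of key in t, or -1 if key is not present.
    static int search(int[] t, int key) {
        for (int i = 0; i < t.length; i++) {
            if (t[i] == key) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // Null parameter (Java's analogue of a null pointer):
        // does the component fail cleanly or throw unexpectedly?
        try {
            search(null, 42);
            System.out.println("null array: no exception");
        } catch (NullPointerException e) {
            System.out.println("null array: NullPointerException");
        }

        // Parameters at the extreme ends of their ranges.
        int[] data = { Integer.MIN_VALUE, 0, Integer.MAX_VALUE };
        System.out.println(search(data, Integer.MIN_VALUE));  // 0
        System.out.println(search(data, Integer.MAX_VALUE));  // 2
    }
}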


Stress Testing
• Exercises the system beyond its maximum design load.
• Stressing the system often causes defects to come to light.
• Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data.
• Particularly relevant to distributed systems, which can exhibit severe degradation as a network becomes overloaded.
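A minimal sketch of driving a system past its design load; the handleRequest method, the design load of 100 concurrent requests, and the multiplier are all assumptions made for illustration:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class StressTestSketch {

    // Hypothetical system operation under test.
    static void handleRequest(int id) throws InterruptedException {
        Thread.sleep(5);   // simulate a small amount of work
    }

    public static void main(String[] args) throws InterruptedException {
        int designLoad = 100;             // assumed maximum design load
        int stressLoad = designLoad * 5;  // deliberately exceed it
        AtomicInteger failures = new AtomicInteger();

        ExecutorService pool = Executors.newFixedThreadPool(stressLoad);
        for (int i = 0; i < stressLoad; i++) {
            final int id = i;
            pool.submit(() -> {
                try {
                    handleRequest(id);
                } catch (Exception e) {
                    failures.incrementAndGet();   // loss of service
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);

        // The system may slow down, but it should not lose requests or data.
        System.out.println("failed requests: " + failures.get());
    }
}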


Object-oriented Testing
• The components to be tested are object classes that are instantiated as objects.
• There is no obvious "top" or "bottom" to the system for top-down or bottom-up integration and testing.
• Levels:
  » Testing class methods
  » Testing the class as a whole
  » Testing clusters of cooperating classes
  » Testing the complete OO system


Object Class Testing
• Complete test coverage of a class involves
  » Testing all operations associated with an object
  » Setting and interrogating all object attributes
  » Exercising the object in all possible states
• Inheritance makes it more difficult to design object class tests, as the information to be tested is not localized.
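A minimal sketch of class-level coverage for a small, invented Recorder class with two states (STOPPED and RECORDING); the class, its attributes, and its operations are assumptions made for illustration:

// Illustrative class under test.
class Recorder {
    enum State { STOPPED, RECORDING }

    private State state = State.STOPPED;
    private int samples = 0;

    void start()  { state = State.RECORDING; }
    void stop()   { state = State.STOPPED; }
    void sample() {
        if (state == State.RECORDING) {
            samples++;     // only counted while recording
        }
    }

    State getState()  { return state; }
    int getSamples()  { return samples; }
}

public class RecorderClassTest {
    public static void main(String[] args) {
        // Run with "java -ea RecorderClassTest" so assertions are enabled.
        Recorder r = new Recorder();

        // Interrogate every attribute in the initial state.
        assert r.getState() == Recorder.State.STOPPED;
        assert r.getSamples() == 0;

        // Exercise every operation in both states.
        r.sample();                                    // ignored while stopped
        assert r.getSamples() == 0;
        r.start();
        r.sample();                                    // counted while recording
        assert r.getState() == Recorder.State.RECORDING;
        assert r.getSamples() == 1;
        r.stop();
        assert r.getState() == Recorder.State.STOPPED;

        System.out.println("all class-level checks passed");
    }
}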


Object Integration
• Levels of integration are less distinct in object-oriented systems.
• Cluster testing is concerned with integrating and testing clusters of cooperating objects.
• Identify clusters using knowledge of the operation of objects and of the system features that are implemented by these clusters.


Approaches to Cluster Testing
• Use case or scenario testing
  » Testing is based on user interactions with the system
  » Has the advantage that it tests system features as experienced by users
• Thread testing
  » Tests the system's response to events as processing threads through the system
• Object interaction testing
  » Tests sequences of object interactions that stop when an object operation does not call on services from another object


Scenario-based Testing
• Identify scenarios from use cases and supplement these with sequence diagrams that show the objects involved in the scenario.

Sample Sequence Diagram (figure not reproduced here)