10- Verification and Validation
Verification vs Validation
• Assuring that a software system meets a user’s needs.
• Verification: “Are we building the product right?”
– The software should conform to its specification.
• Validation: “Are we building the right product?”
– The software should do what the user really requires.
The V & V Process
• V & V is a whole life-cycle process – it must be applied at each stage in the software process.
• It has two principal objectives:
– The discovery of defects in a system.
– The assessment of whether or not the system is usable in an operational situation.
Dynamic and Static Verification
• Dynamic V & V: concerned with exercising and observing product behavior (testing).
• Static verification: concerned with analysis of the static system representation to discover problems.
Static and Dynamic V&V
[Figure: static verification is applied to the requirements specification, formal specification, high-level design, detailed design, and program; dynamic validation is applied to the prototype and the program.]
Software Engineering Lec 10 -software testing--
Example (figure not reproduced)
Testing – Terminology
• Test case
– Individual test
• Test suite
– Collection of test cases
• Test harness
– Program that executes a series of test cases
• Test framework
– Software that facilitates writing & running tests
– Example – JUnit
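To make the terminology concrete, here is a minimal sketch using Python's built-in `unittest` (a JUnit-style framework, standing in for the JUnit example above); `absolute()` is a hypothetical unit under test:

```python
import unittest

# Hypothetical unit under test.
def absolute(x):
    return -x if x < 0 else x

class AbsoluteTests(unittest.TestCase):
    # Each test_* method is an individual test case.
    def test_negative(self):
        self.assertEqual(absolute(-5), 5)

    def test_positive(self):
        self.assertEqual(absolute(3), 3)

# A test suite is a collection of test cases.
suite = unittest.TestLoader().loadTestsFromTestCase(AbsoluteTests)

# The test runner acts as the harness that executes the suite.
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.testsRun == 2
assert result.wasSuccessful()
```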
Testing – Terminology (Continued)
• Test driver
– Program to create environment for running tests
– Declares variables, creates objects, assigns values
– Executes code and displays results of tests
• Tester (Quality Assurance)
– Person devising and / or performing tests
– More effective if 2nd person writes tests
• Walkthrough
– Programmer explains code to 2nd person
Types of Testing
• “Program testing can be used to show the
presence of bugs, but never their absence.” (Dijkstra)
• Static: Done without actually executing program
– Code inspections
– Walkthroughs
• Dynamic: Done by executing program or its parts
– Module or unit testing
– Integration testing
– System testing
Who Tests the Software?
Developer: understands the system, but will test “gently” and is driven by “delivery”.
Independent tester: must learn about the system, but will attempt to break it, and is driven by quality.
Testing Principles
• All tests should be traceable to customer
requirements.
• Tests should be planned before testing begins.
• 80% of all errors are in 20% of the code.
• Testing should begin in the small and progress to
the large.
• Exhaustive testing is not possible.
• Testing should be conducted by an independent
third party if possible.
Software Defect Causes
• Specification may be wrong.
• Specification may be a physical impossibility.
• Faulty program design.
• Program may be incorrect.
Types of Errors
• Algorithmic error.
• Computation & precision error.
• Documentation error.
• Capacity error or boundary error.
• Timing and coordination error.
• Performance error.
• Recovery error.
• Hardware & system software error.
• Standards & procedure errors.
Test Strategies
• Black-box or behavioral testing
– knowing the specified function a product is to
perform and demonstrating correct operation based
solely on its specification without regard for its
internal logic
• White-box or glass-box testing
– knowing the internal workings of a product, tests
are performed to check the workings of all possible
logic paths
Black-box Testing
• Focus: I/O behavior. If for any given input, we can predict the
output, then the module passes the test.
– Almost always impossible to generate all possible inputs
("test cases")
• Goal: Reduce number of test cases by equivalence
partitioning:
– Divide input conditions into equivalence classes
– Choose test cases for each equivalence class. (Example: If
an object is supposed to accept a negative number, testing
one negative number is enough)
Black-box Testing (Continued)
• Selection of equivalence classes (No rules, only guidelines):
– Input is valid across range of values. Select test cases from
3 equivalence classes:
• Below the range
• Within the range
• Above the range
– Input is valid if it is from a discrete set. Select test cases
from 2 equivalence classes:
• Valid discrete value
• Invalid discrete value
• Another way to select only a limited number of test cases:
– Get knowledge about the inner workings of the unit being
tested => white-box testing
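As a sketch of the range guideline above, assume a hypothetical validator `valid_age()` that accepts integers from 0 to 120; one representative test case per equivalence class suffices:

```python
# Hypothetical validator: accepts integer ages in the range 0..120.
def valid_age(age):
    return 0 <= age <= 120

# Equivalence partitioning: one representative per class instead of
# testing every possible integer.
below_range = -1    # class 1: below the valid range
within_range = 35   # class 2: within the valid range
above_range = 121   # class 3: above the valid range

assert valid_age(below_range) is False
assert valid_age(within_range) is True
assert valid_age(above_range) is False
```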
White-box Testing
• Focus: Thoroughness (Coverage). Every statement in
the component is executed at least once.
• Four types of white-box testing
– Statement Testing
– Loop Testing
– Path Testing
– Branch Testing
White-box Testing (Continued)
• Statement Testing (Algebraic Testing): Test single statements
(Choice of operators in polynomials, etc)
• Loop Testing:
– Cause execution of the loop to be skipped completely.
(Exception: Repeat loops)
– Loop to be executed exactly once
– Loop to be executed more than once
• Path testing:
– Make sure all paths in the program are executed
• Branch Testing (Conditional Testing): Make sure that each
possible outcome from a condition is tested at least once
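A minimal branch-testing sketch (`classify()` is a hypothetical function, not from the lecture): each outcome of each condition is exercised at least once.

```python
# Hypothetical function with two conditions.
def classify(n):
    if n < 0:
        return "negative"
    if n % 2 == 0:
        return "even"
    return "odd"

# Branch testing: each outcome of each condition is taken at least once.
assert classify(-3) == "negative"   # first condition true
assert classify(4) == "even"        # first false, second true
assert classify(7) == "odd"         # both conditions false
```

In this small example the three cases also cover all paths; in general, path coverage requires more cases than branch coverage.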
Testing Stages
• Unit testing
– testing of individual components
• Module testing
– testing of collections of dependent components
• Sub-system testing
– testing collections of modules integrated into sub-systems
• System testing
– testing the complete system prior to delivery
• Acceptance testing
– testing by users to check that the system satisfies requirements. Sometimes called alpha testing.
The Testing Process
[Figure: unit testing → module testing → sub-system testing → system testing → acceptance testing. Unit and module testing together form component testing; sub-system and system testing form integration testing; acceptance testing is user testing.]
Unit Testing
• Program reviews.
• Formal verification.
• Testing the program itself.
– black box and white box testing.
Types of Testing
• Unit Testing:
– Individual subsystem
– Carried out by developers
– Goal: Confirm that the subsystem is correctly coded
and carries out the intended functionality
• Integration Testing:
– Groups of subsystems (collection of classes) and
eventually the entire system
– Carried out by developers
– Goal: Test the interfaces among the subsystems
System Testing
• System Testing:
– The entire system
– Carried out by developers
– Goal: Determine if the system meets the requirements
(functional and global)
• Acceptance Testing:
– Evaluates the system delivered by developers
– Carried out by the client. May involve executing typical
transactions on site on a trial basis
– Goal: Demonstrate that the system meets customer
requirements and is ready to use
• Implementation (Coding) and testing go hand in hand
Example: Test Cases
• Test case 1 : ? (To execute loop exactly
once)
• Test case 2 : ? (To skip loop body)
• Test case 3: ?,? (to execute loop more
than once)
These 3 test cases cover all control flow
paths
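The loop referred to on this slide is not shown, so as a hypothetical stand-in consider a function that sums a list; the three loop-testing cases then look like:

```python
# Hypothetical unit under test: a simple loop over a list.
def total(values):
    s = 0
    for v in values:   # loop under test
        s += v
    return s

# Test case 1: loop body executed exactly once.
assert total([7]) == 7
# Test case 2: loop body skipped entirely (empty input).
assert total([]) == 0
# Test case 3: loop body executed more than once.
assert total([1, 2, 3]) == 6
```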
Generating Test Data
• Ideally, we want to test every permutation of valid
and invalid inputs.
• Equivalence partitioning is often required to reduce
the infinite set of possible test cases to a finite one:
– Every possible input belongs to one of the
equivalence classes.
– No input belongs to more than one class.
– Each test case is representative of its class.
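These partition properties can be checked mechanically. In this sketch the equivalence classes of a hypothetical integer input are expressed as predicates, and every sampled input must fall into exactly one class:

```python
# Hypothetical equivalence classes for an integer input, as predicates.
classes = [
    lambda x: x < 0,     # negative
    lambda x: x == 0,    # zero
    lambda x: x > 0,     # positive
]

# Partition check over a sample of the input domain: each input
# belongs to exactly one class (covering and disjoint).
for x in range(-100, 101):
    matches = sum(1 for c in classes if c(x))
    assert matches == 1, f"{x} belongs to {matches} classes"

# One representative test case per class.
representatives = [-5, 0, 5]
assert [c(r) for c, r in zip(classes, representatives)] == [True, True, True]
```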
Regression Testing
• Check for defects propagated to other modules by
changes made to existing program
– Representative sample of existing test cases is used to
exercise all software functions.
– Additional test cases focusing on software functions likely to
be affected by the change.
– Test cases that focus on the changed software components.
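A minimal regression-testing sketch: input/output pairs recorded from a known-good version are re-run after each change to catch newly introduced defects (`discount()` is a hypothetical function under maintenance):

```python
# Hypothetical function under maintenance.
def discount(price, rate):
    return round(price * (1 - rate), 2)

# Cases recorded from a known-good version of the software.
regression_cases = [
    ((100.0, 0.10), 90.0),
    ((19.99, 0.25), 14.99),
    ((50.0, 0.0), 50.0),
]

# Re-run the whole sample after every change; any mismatch is a
# regression introduced by the change.
for args, expected in regression_cases:
    assert discount(*args) == expected, f"regression for input {args}"
```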
Integration Testing
• Bottom - up testing (test harness).
• Top - down testing (stubs).
• Modified top - down testing - test levels
independently.
• Big Bang.
• Sandwich testing.
Top-Down Integration Testing
• Main program used as a test driver and stubs are
substitutes for components directly subordinate to it.
• Subordinate stubs are replaced one at a time with real
components (following the depth-first or breadth-first
approach).
• Tests are conducted as each component is integrated.
• On completion of each set of tests, another stub is
replaced with a real component.
• Regression testing may be used to ensure that new
errors are not introduced.
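A sketch of the stub substitution described above (all names are hypothetical): the top-level component is tested against a stub first, and the stub is later replaced by the real component, with the same tests re-run:

```python
# Stub: returns a canned answer instead of calling the real,
# not-yet-integrated subordinate component.
def fetch_rate_stub(currency):
    return 2.0

# Top-level component under test; its subordinate is injected.
def convert(amount, currency, fetch_rate=fetch_rate_stub):
    return amount * fetch_rate(currency)

# Top-down: tests run against the stub first.
assert convert(100, "EUR") == 200.0

# Later, the stub is replaced (one at a time) by the real component.
def fetch_rate_real(currency):
    rates = {"EUR": 2.0, "USD": 1.0}
    return rates[currency]

# The same test is re-run against the real component (regression check).
assert convert(100, "EUR", fetch_rate_real) == 200.0
```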
Bottom-Up Integration Testing
• Low level components are combined in clusters that
perform a specific software function.
• A driver (control program) is written to coordinate
test case input and output.
• The cluster is tested.
• Drivers are removed and clusters are combined
moving upward in the program structure.
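A sketch of the bottom-up approach (component names hypothetical): a throwaway driver feeds test input to a cluster of low-level components before any real caller exists:

```python
# Low-level component 1: parse one input record.
def parse_record(line):
    name, score = line.split(",")
    return name.strip(), int(score)

# Low-level component 2: classify a score.
def grade(score):
    return "pass" if score >= 50 else "fail"

# Driver: a throwaway control program that supplies test input to the
# cluster and collects the results; it is discarded once a real caller
# is integrated.
def driver():
    results = []
    for line in ["ann, 62", "bob, 41"]:
        name, score = parse_record(line)
        results.append((name, grade(score)))
    return results

assert driver() == [("ann", "pass"), ("bob", "fail")]
```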
Acceptance Testing
• Making sure the software works correctly for the
intended user in his or her normal work
environment.
• Alpha test
– version of the complete software is tested by customer
under the supervision of the developer at the
developer’s site
• Beta test
– version of the complete software is tested by customer
at his or her own site without the developer being
present
System Testing
• Recovery testing
– checks system’s ability to recover from failures
• Security testing
– verifies that system protection mechanisms prevent
improper penetration or data alteration
• Stress testing
– program is checked to see how well it deals with
abnormal resource demands
• Performance testing
– tests the run-time performance of software
Performance Testing
• Stress test.
• Volume test.
• Configuration test
(hardware &
software).
• Compatibility.
• Regression tests.
• Security tests.
• Timing tests.
• Environmental tests.
• Quality tests.
• Recovery tests.
• Maintenance tests.
• Documentation tests.
• Human factors tests.
Testing Life Cycle
• Establish test objectives.
• Design criteria (review criteria).
– Correct.
– Feasible.
– Coverage.
– Demonstrate functionality.
• Writing test cases.
• Testing test cases.
• Execute test cases.
• Evaluate test results.
Testing Tools
• Simulators.
• Monitors.
• Analyzers.
• Test data generators.
Test Team Members
• Professional testers.
• Analysts.
• System designers.
• Configuration management specialists.
• Users.
Test Documentation Needed
• Requirement being tested.
• Design verification methodology.
• Code verification methodology.
Document Each Test Case
• Requirement tested.
• Facet / feature / path tested.
• Person & date.
• Tools & code needed.
• Test data & instructions.
• Expected results.
• Actual test results & analysis.
• Correction, schedule, and signoff.
Test Case Design
"Bugs lurk in corners
and congregate at
boundaries ..."
Boris Beizer
OBJECTIVE: to uncover errors
CRITERIA: in a complete manner
CONSTRAINT: with a minimum of effort and time
Debugging
• Debugging (removal of a defect) occurs as a
consequence of successful testing.
• Some people better at debugging than others.
• Is the cause of the bug reproduced in another
part of the program?
• What “next bug” might be introduced by the fix
that is being proposed?
• What could have been done to prevent this bug
in the first place?
The Debugging Process
Locate error → design error repair → repair error → re-test program.
Debugging Approaches
• Brute force
– memory dumps and run-time traces are examined for
clues to error causes
• Backtracking
– source code is examined by looking backwards from
symptom to potential causes of errors
• Cause elimination
– uses binary partitioning to reduce the number of
potential locations where errors can exist
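Cause elimination by binary partitioning can be sketched as a bisect-style search (in the spirit of `git bisect`) over an ordered range of versions or suspect code regions, assuming a hypothetical `is_bad()` oracle:

```python
# Given an ordered sequence of versions where some prefix is good and
# the rest are bad, binary partitioning finds the first bad version in
# O(log n) checks instead of examining every version.
def first_bad(versions, is_bad):
    lo, hi = 0, len(versions) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(versions[mid]):
            hi = mid           # defect introduced at mid or earlier
        else:
            lo = mid + 1       # defect introduced after mid
    return versions[lo]

# Example: versions 1..16, defect introduced in version 11.
assert first_bad(list(range(1, 17)), lambda v: v >= 11) == 11
```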
End of lecture
