TESTING

August 10, 2009
I. Software Testing:
Determine whether the system meets specifications (producer view).
Determine whether the system meets business and user needs (customer view).
Process of finding bugs in a software product.
Fig. 1: Software Testing Levels
II. Software Testing History:
Up to 1956 (Debugging): Software programs were written and then tested by the developers until they were satisfied that all bugs had been removed. There was no clear difference between testing and debugging.
1957 to 1978 (Demonstration): Debugging was still performed to ensure that the system was bug free, but an additional testing activity was added to demonstrate that the software provided the expected functionality.
1979 to 1982 (Destruction): Testing focused on exercising the system's expected functionality with the aim of identifying bugs; the term debugging now meant fault identification.
1983 to 1987 (Evaluation): Faults identified at the previous stage of the development lifecycle are resolved, and testing becomes an activity at the end of each stage of the lifecycle.
1988 onwards (Prevention): The idea is to prevent errors at each stage of the development lifecycle through testing; testing focuses on finding where errors might be made.
III. Some major system failures caused by software bugs:
1. In May 1996, according to newspaper reports, a software bug caused the bank accounts of 823 customers of a major U.S. bank to be credited with $924,844,208.32 each.
2. On June 4, 1996, the first flight of the European Space Agency's new Ariane 5 rocket failed shortly after launch; inadequate exception handling in the software caused a loss of about half a billion dollars.
3. In January 1998, a software defect at a major U.S. telecommunications company resulted in no charge for long-distance calls for 400,000 customers.
4. In March 1999, a town in Illinois in the U.S. received an unusually large monthly electric bill of $7 million. A software defect billed the customer about 700 times its normal amount.
5. In April 1999, a software bug caused the failure of a $1.2 billion U.S. military satellite launch.
6. In October 1999, the $125 million NASA Mars Climate Orbiter spacecraft was lost due to a simple data conversion error.

IV. Why bugs in software?
1. Miscommunication: lack of clear communication while gathering and understanding requirements.
2. Software complexity: handling difficult kinds of applications in software development.
3. Programming errors: mistakes made while writing the code, e.g. logical errors and syntax errors (a small sketch follows this list).
4. Changing requirements: even simple changes to the code may cause larger problems, e.g. upgrading a project from an older product version to a newer one.
5. Time pressure: mistakes are made under schedule pressure.
6. Poorly documented code: modifying badly written or poorly documented code often introduces new bugs.
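To make the distinction in item 3 concrete, here is a minimal sketch in Python; the language, the function, and the values are chosen only for illustration and are not from the original text. A syntax error stops the program from running at all, while a logical error lets the program run but produce a wrong result.

# Hypothetical illustration of the error types named in item 3 above.
#
# A syntax error is caught before the program runs at all: for example, a
# malformed definition such as "def average(values" with the closing
# parenthesis and colon missing would fail to parse.

def average(values):
    # Intended to return the arithmetic mean of a non-empty list.
    total = 0
    for v in values:
        total += v
    return total / (len(values) - 1)   # logical error: should divide by len(values)

# The code runs without complaint, but the answer is wrong:
print(average([2, 4, 6]))   # prints 6.0 instead of the expected 4.0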
V. Quality:
Quality means meeting requirements and customers' needs, delivering defect-free products that are fit for use.
1. Relation between Testing, QA & QC:
TESTING means ‘quality control’.
QUALITY CONTROL measures the quality of a product; it involves the series of inspections, reviews, and tests used throughout the software process to ensure that the product meets customer requirements.
QUALITY ASSURANCE measures the quality of the processes used to create a quality product; it consists of the auditing and reporting functions of management.
VI. Testing in project phases:
1. Requirements Phase
Determine the requirements.
Generate functional test conditions.
2. Design Phase
Perform test planning.
Generate structural test conditions.
3. Code Phase
Execute the unit test cases (a minimal unit-test sketch follows this section).
Perform bug tracking and, after the bugs are fixed, convert them into tests.
4. Test Phase
Perform unit, integration & system testing.
Conduct regression & acceptance testing if needed.
5. Installation Phase
Place tested system into production.
6. Maintenance Phase
Modify and retest.
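As a rough illustration of the unit-testing step above, here is a minimal sketch using Python's built-in unittest module. The discount function and its expected values are invented for the example and do not come from the original text.

import unittest

def discount(price, percent):
    # Hypothetical function under test: apply a percentage discount to a price.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class DiscountTest(unittest.TestCase):
    def test_typical_value(self):
        self.assertEqual(discount(200.0, 10), 180.0)

    def test_no_discount(self):
        self.assertEqual(discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()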
VII. Testing Policy:
The test policy characterizes the organization's philosophy towards software testing.

Testing Standards:

BS British Standard

BS 7925-1 Software Testing Vocabulary

BS 7925-2 Software Component Testing

Def Stan 00-55 Requirements for Safety-Related Software in Defence Equipment

DO-178B Software Considerations in Airborne Systems and Equipment Certification

ESA European Space Agency

IEC The International Electrotechnical Commission

IEC 60300-3-9 Risk analysis of technological systems

IEC 61508 Functional Safety of electrical/electronic/programmable Safety-Related Systems

IEC 880 Software for computers in the safety systems of nuclear power stations

IEEE The Institute of Electrical and Electronics Engineers

IEEE 610 Standard Computer Dictionary

IEEE 610.12 Software Engineering Terminology

IEEE 730 Standard for Software Quality Assurance Plans

IEEE 829 Standard for Software Test Documentation

IEEE 1008 Standard for Software Unit Testing

IEEE 1012 Standard for Software Verification and Validation

IEEE 1028 Standard for Software Reviews

IEEE 1044 Standard Classification for Software Anomalies

IEEE 1044.1 Guide to Classification for Software Anomalies

ISO The International Organization for Standardization

ISO 9000 Quality management and quality assurance standards

ISO 9001 Model for quality assurance in design, development, production, installation and servicing.

ISO 9000-3 Guidelines for the application of ISO 9001 to the development, supply, installation and maintenance of computer software

ISO 12207 Software life cycle processes

ISO 15026 System and software integrity levels

ISO 15288 System Life Cycle Processes

ISO 15504 Software process assessment

MISRA Development Guidelines for Vehicle Based Software (from the Motor Industry Software Reliability Association)

NIST The National Institute of Standards and Technology

NIST 500-234 Reference Information for the Software Verification and Validation Process

PSS Procedures, Specifications and Standards

PSS-05-0 ESA Software Engineering Standards

SEI The Software Engineering Institute

SE CMM Systems Engineering Capability Maturity Model

SW CMM Capability Maturity Model for Software

TMM Testing Maturity Model

ISO 9001
Provides a high level requirement that suppliers perform testing as part of verification and document it.
ISO 12207
Defines the requirement for specific verification and validation processes to support the primary processes of
development and maintenance.
IEEE 730
Requires the production of both a Software Verification and Validation Plan and the
documentation of any other tests.
SEI:
‘Software Engineering Institute’ at Carnegie Mellon University; initiated by the U.S. Defense Department to help improve software development processes.
CMM:
“Capability Maturity Model”, developed by the SEI. It is a model of five levels of organizational “maturity” that determine effectiveness in delivering quality software.
Level 1 (Initial): The starting point for use of a new process.
Level 2 (Repeatable): The process is used repeatedly.
Level 3 (Defined): The process is defined and confirmed as a standard business process.
Level 4 (Managed): Process management and measurement take place.
Level 5 (Optimizing): Process management includes deliberate planning for process improvement.
CMM ratings of organizations (total number of organizations assessed: 1018):

Year Range      CMM Level 1   CMM Level 2   CMM Level 3   CMM Level 4   CMM Level 5
1992 to 1996    631           234           132           20            4
1997 to 2001    275           397           234           61            51
Test Plan:
A software project test plan is a document that describes the objectives, scope,
approach, and focus of a software testing effort.
Items in Test Plan:
1. Title
2. Identification of software including version/release numbers
3. Revision history of document including authors, dates, approvals
4. Table of Contents
5. Purpose of document, intended audience
6. Objective of testing effort
7. Software product overview
8. Relevant related document list, such as requirements, design documents, other test
plans, etc.
9. Relevant standards or legal requirements
10. Traceability requirements
11. Relevant naming conventions and identifier conventions
12. Overall software project organization and personnel/contact-info/responsibilities
13. Test organization and personnel/contact-info/responsibilities
14. Assumptions and dependencies
15. Project risk analysis
16. Testing priorities and focus
17. Scope and limitations of testing
18. Test outline: a decomposition of the test approach by test type, feature, functionality,
process, system, module, etc. as applicable
19. Outline of data input equivalence classes, boundary value analysis, error classes (a small sketch follows this list)
20. Test environment: hardware, operating systems, other required software, data
configurations, interfaces to other systems
21. Test environment setup and configuration issues
22. Test data setup requirements
23. Database setup requirements
24. Outline of system-logging/error-logging/other capabilities, and tools such as screen capture software,
that will be used to help describe and report bugs
25. Discussion of any specialized software or hardware tools that will be used by testers to help track the
cause or source of bugs
26. Test automation – justification and overview
27. Test tools to be used, including versions, patches, etc.
28. Test script/test code maintenance processes and version control
29. Problem tracking and resolution – tools and processes
30. Project test metrics to be used
31. Reporting requirements and testing deliverables
32. Software entrance and exit criteria
33. Initial sanity testing period and criteria
34. Test suspension and restart criteria
35. Personnel allocation
36. Personnel pre-training needs
37. Test site/location
38. Outside test organizations to be utilized and their purpose, responsibilities, deliverables, contact persons, and coordination issues
39. Relevant proprietary, classified, security and licensing issues
40. Open issues
41. Appendix – glossary, acronyms, etc.
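As a small illustration of item 19 above, the sketch below derives boundary-value test cases for a hypothetical input rule (an age field that must accept 18 to 65 inclusive); the rule, its limits, and the function name are invented for the example.

# Hypothetical input rule: an "age" field must accept integers from 18 to 65 inclusive.
MIN_AGE, MAX_AGE = 18, 65

def is_valid_age(age):
    return MIN_AGE <= age <= MAX_AGE

# Equivalence classes: one valid class (18-65) and two invalid classes (<18, >65).
# Boundary value analysis picks values at and just beyond each boundary.
boundary_cases = {
    17: False,   # just below the lower boundary (invalid class)
    18: True,    # on the lower boundary
    19: True,    # just above the lower boundary
    64: True,    # just below the upper boundary
    65: True,    # on the upper boundary
    66: False,   # just above the upper boundary (invalid class)
}

for value, expected in boundary_cases.items():
    assert is_valid_age(value) == expected, f"boundary case {value} failed"
print("all boundary cases pass")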
Process to follow after a bug is found:
1. Record complete information so that developers can understand the bug, get an idea of its
severity, and reproduce it if necessary (a minimal record sketch follows this list).
2. Bug identifier (number, ID, etc.)
3. Current bug status (e.g., ‘Released for Retest’, ‘New’, etc.)
4. The application name or identifier and version
5. The function, module, feature, object, screen, etc. where the bug occurred
6. Environment specifics, system, platform, relevant hardware specifics
7. Test case name/number/identifier
8. One line bug description
9. Full bug description
10. Description of steps needed to reproduce the bug if not covered by a test case or if the
developer doesn’t have easy access to the test case/test script/test tool
11. Names and/or descriptions of file/data/messages/etc. used in test
12. File excerpts/error messages/log file excerpts/screen shots/test tool logs that would be
helpful in finding the cause of the problem
13. Severity estimate (critical, major, average, minor)
14. Whether the bug was reproducible
15. Tester name
16. Test date
17. Bug reporting date
18. Name of developer/group/organization the problem is assigned to
19. Description of problem cause
20. Description of fix
21. Code section/file/module/class/method that was fixed
22. Date of fix
23. Application version that contains the fix
24. Tester responsible for retest
25. Retest date
26. Retest results
27. Regression testing requirements
28. Tester responsible for regression tests
29. Regression testing results
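As a rough illustration of the fields listed above, here is a minimal bug record sketched as a Python dataclass. Only a subset of the fields is shown, and all names and values are illustrative rather than taken from any particular bug-tracking tool.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BugReport:
    # Illustrative subset of the bug-report fields listed above.
    bug_id: str                      # item 2: bug identifier
    status: str                      # item 3: e.g. 'New', 'Released for Retest'
    application: str                 # item 4: application name and version
    summary: str                     # item 8: one-line bug description
    description: str                 # item 9: full bug description
    steps_to_reproduce: List[str] = field(default_factory=list)   # item 10
    severity: str = "average"        # item 13: critical / major / average / minor
    reproducible: bool = True        # item 14
    tester: str = ""                 # item 15
    assigned_to: str = ""            # item 18

# Example usage (values are made up for illustration):
report = BugReport(
    bug_id="BUG-102",
    status="New",
    application="Billing 2.3",
    summary="Long-distance calls are not charged",
    description="Calls over 10 minutes generate a zero-value invoice line.",
    steps_to_reproduce=["Place a 12-minute long-distance call", "Open the generated invoice"],
    severity="critical",
    tester="Example tester",
)
print(report.bug_id, report.severity)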
Certification in Software:
Companies and organizations that offer software/IT certifications:
3Com
Adobe
Alcatel
American Society for Quality
BICSI
Certified E-commerce Consultants
Check Point Software
Cisco Systems
Citrix
CIW
Compaq
CompTIA
Computer Associates International
Computer Telephony Institute, Inc.
DSDM Secretariat
Enterprise Certified Corp.
Enterasys Systems
FileNET
FORE Systems
HelpDesk 2000
Hewlett Packard
IBM Corporation
Information Systems Audit and Control Association
Informix Software, Inc.
Institute for Certification of Computing Professionals
Institute for Configuration Management
Institute for Interconnecting & Packaging Electronic Circuits
International Function Point Users Group
International Information Systems Security Certification Consortium
International Society of Certified Electronics Technicians (ISCET)
International Webmasters Association
Learning Tree International
Linux Professional Institute
Lotus
Lucent Technologies
Marimba
MERANT
Mercury Interactive
Microsoft
Motorola
National Association of Communications Systems Engineers (NACSE)
Newbridge Networks
Nortel Networks
Novell
Oracle
ParcPlace-Digitalk
Pine Mountain Group
Planet 3 Wireless
PowerSoft(Sybase)
Project Management Institute
Prosofttraining.com
Quality Assurance Institute
Red Hat Software
Rockwell Software
SAIR
Santa Cruz Operation
SAP Partner Academy
Sarbanes-Oxley Certification Institute
Shiva
Siebel Systems
Software Publishers Association
Solomon Software
Sun Microsystems
Sybase
Sysoft
Tivoli Systems, Inc.
Wild Packets
Xplor – Electronic Document & Printing Professional (EDPP)