Software Metrics




If you want to have the best software organization that your money can buy, then it is imperative that you do not let the Bean Counters be in control. Cutting costs and analyzing productivity may look great on paper, but the effect on a company's long-term competitiveness can be devastating.


Manage Your Time Like Rocks In A Jar
by Harvey Mackay.


NASA's Software Measurement Guidebook: Presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. [PDF, 880KB]


Measuring Function Size

The International Function Point Users Group (IFPUG) is pleased to announce the publication of an International Standard to measure the functional size of software - ISO/IEC 20926:2003 Software engineering - IFPUG 4.1 Unadjusted functional size measurement method - Counting practices manual. The publication of the ISO standard confirms IFPUG Function Points as the preeminent method for measuring software size.

Scott Goldfarb, IFPUG President said:

"This is a landmark achievement for IFPUG and the software industry. IFPUG function point analysis is the only functional sizing method that is well-proven, well-established and well-documented. After 25 years of usage, we now have an officially recognized, worldwide standard for measuring the functional size of software."
...
"In 1979, Allan Albrecht of IBM published a paper on function point analysis - a method for measuring software size from a business perspective. Interest in an industry-wide standard for measuring software size inspired the formation of IFPUG in 1986, to manage the evolution of the method and to provide supporting materials and training services. IFPUG has since grown to become the preeminent software metrics organization with members throughout the world. IFPUG function point analysis is increasingly being used as a basis for software management, outsourcing contracts and process improvement initiatives in a wide variety of software disciplines from financial management to missile defense systems."

Measuring Software Product Quality (full text)
by Rob Hendriks, Erik van Veenendaal, and Robert van Vonderen
Volume 5, Issue 1, December 2002

Code Complexity Metrics


It is important to have a metric for measuring the size of your code. The CodeCount tool-set is a collection of tools designed to automate the collection of source-code sizing information. It spans multiple programming languages and uses one of two possible Source Lines of Code (SLOC) definitions: physical or logical.

http://sunset.usc.edu/research/CODECOUNT/index.html
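The exact counting rules differ from tool to tool, but the physical/logical distinction can be sketched very crudely. The following program is an illustration only, not how CodeCount works internally: it counts non-blank lines as a stand-in for physical SLOC and statement-terminating semicolons as a stand-in for logical SLOC, ignoring comments, string literals, and preprocessor lines that a real tool must handle.

/* Crude sketch of physical vs. logical SLOC counting.
 * Physical SLOC here = non-blank lines; logical SLOC here = ';' count.
 * A real tool (such as CodeCount) also excludes comments and handles
 * strings, preprocessor lines, and compound statements properly.     */
#include <stdio.h>
#include <ctype.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s file.c\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "r");
    if (!f) {
        perror(argv[1]);
        return 1;
    }

    long physical = 0, logical = 0;
    int ch, line_has_text = 0;
    while ((ch = fgetc(f)) != EOF) {
        if (ch == ';')
            logical++;                 /* crude "statement" count    */
        if (ch == '\n') {
            if (line_has_text)
                physical++;            /* count only non-blank lines */
            line_has_text = 0;
        } else if (!isspace(ch)) {
            line_has_text = 1;
        }
    }
    if (line_has_text)
        physical++;                    /* last line without newline  */
    fclose(f);

    printf("physical SLOC (non-blank lines): %ld\n", physical);
    printf("logical  SLOC (semicolons):      %ld\n", logical);
    return 0;
}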

The Halstead complexity measures were developed to measure a program module's complexity directly from source code, with an emphasis on computational complexity.
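The measures are built from four raw counts taken from the source: the number of distinct operators and operands (n1, n2) and their total occurrences (N1, N2). The sketch below simply applies the standard Halstead formulas to counts that are assumed to have already been gathered by a parser; the values passed in main() are made up for illustration.

/* Sketch: standard Halstead measures from raw operator/operand counts.
 * The counts themselves must come from a parser; the values used in
 * main() are invented.  Compile with -lm for log2().                 */
#include <stdio.h>
#include <math.h>

void halstead(double n1, double n2, double N1, double N2)
{
    double n = n1 + n2;                 /* program vocabulary        */
    double N = N1 + N2;                 /* program length            */
    double V = N * log2(n);             /* volume                    */
    double D = (n1 / 2.0) * (N2 / n2);  /* difficulty                */
    double E = D * V;                   /* effort                    */

    printf("vocabulary n = %.0f, length N = %.0f\n", n, N);
    printf("volume V = %.1f, difficulty D = %.1f, effort E = %.1f\n",
           V, D, E);
}

int main(void)
{
    /* Hypothetical counts for a small module. */
    halstead(10, 7, 25, 18);
    return 0;
}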

Cyclomatic Complexity is the most widely used member of a class of static software metrics. Cyclomatic complexity may be considered a broad measure of soundness and confidence for a program.

Metrics collection tools for C and C++ Source Code offers access to a collection of static code analysis tools that compute various metrics defined on C and C++ source code. The metrics are primarily size and complexity measures of various types (lines of code, Halstead, McCabe, etc.).

NIST Special Publication 500-235, Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric, by Arthur H. Watson and Thomas J. McCabe. [Available in PDF and DjVu formats]

Prepared under NIST Contract 43NANB517266

Abstract

The purpose of this document is to describe the structured testing methodology for software testing, also known as basis path testing. Based on the cyclomatic complexity measure of McCabe, structured testing uses the control flow structure of software to establish path coverage criteria. The resultant test sets provide more thorough testing than statement and branch coverage. Extensions of the fundamental structured testing techniques for integration testing and object-oriented systems are also presented. Several related software complexity metrics are described. Summaries of technical papers, case studies, and empirical results are presented in the appendices.

Keywords

Basis path testing, cyclomatic complexity, McCabe, object oriented, software development, software diagnostic, software metrics, software testing, structured testing.

Acknowledgments

The authors acknowledge the contributions by Patricia McQuaid to Appendix A of this report.

Disclaimer

Certain trade names and company names are mentioned in the text or identified. In no case does such identification imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the products are necessarily the best available for the purpose.

Executive Summary

This document describes the structured testing methodology for software testing and related software complexity analysis techniques. The key requirement of structured testing is that all decision outcomes must be exercised independently during testing. The number of tests required for a software module is equal to the cyclomatic complexity of that module. The original structured testing document [NBS99] discusses cyclomatic complexity and the basic testing technique. This document gives an expanded and updated presentation of those topics, describes several new complexity measures and testing strategies, and presents the experience gained through the practical application of these techniques.

The software complexity measures described in this document are: cyclomatic complexity, module design complexity, integration complexity, object integration complexity, actual complexity, realizable complexity, essential complexity, and data complexity. The testing techniques are described for module testing, integration testing, and object-oriented testing.

A significant amount of practical advice is given concerning the application of these techniques. The use of complexity measurement to manage software reliability and maintainability is discussed, along with strategies to control complexity during maintenance. Methods to apply the testing techniques are also covered. Both manual techniques and the use of automated support are described.

Many detailed examples of the techniques are given, as well as summaries of technical papers and case studies. Experimental results are given showing that structured testing is superior to statement and branch coverage testing for detecting errors. The bibliography lists over 50 references to related information.

The Cyclomatic Complexity of a software module is calculated from a connected graph of the module (that shows the topology of control flow within the program).

Studies show a correlation between a program's cyclomatic complexity and its error frequency. A low cyclomatic complexity contributes to a program's understandability and indicates it is amenable to modification at lower risk than a more complex program. A module's cyclomatic complexity is also a strong indicator of its testability.

Authoritative sources such as:

http://hissa.nist.gov/HHRFdata/Artifacts/ITLdoc/235/title.htm

and:

http://www.sei.cmu.edu/str/descriptions/cyclomatic.html

give different formulas for calculating Cyclomatic Complexity.

The SEI gives this one:

Cyclomatic complexity (CC) = E - N + p

where E = the number of edges of the graph

N = the number of nodes of the graph

p = the number of connected components

NIST gives these:

V(G) = E - N + 1
V(G) = E - N + 2

The + 1 comes from the fact that you cannot have zero branches. The difference between + 2 and + 1 has to do with a virtual path from the exit back to the entry point; + 2 appears to be the more common variation.

I have seen these variations as well:

V(G) = P + 1, where P is the number of binary decision predicates; this gives the same answer as the E - N + 2 form given in the NIST version.

V(G) = E - N + 2p

In this case p is the number of connected components, i.e. the number of independent procedures if each has its own graph. Because most examples shown are for a single graph, the term is usually simplified to 2 rather than the more explicit (2 * 1).
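A small worked example shows that these variants agree for an ordinary single function. The function below is hypothetical and chosen only because its flow graph is easy to count by hand; the counts in the comment assume an explicit exit node and a single connected component (p = 1).

int sign(int x)
{
    if (x < 0)
        return -1;
    else
        return 1;
}

/* Flow graph (with an explicit exit node):
 *   nodes  N = 4 : the decision, "return -1", "return 1", the exit
 *   edges  E = 4 : decision -> return -1, decision -> return 1,
 *                  return -1 -> exit,     return 1 -> exit
 *   binary decision predicates P = 1, connected components p = 1
 *
 *   V(G) = E - N + 2p = 4 - 4 + 2 = 2
 *   V(G) = P + 1      = 1 + 1     = 2
 *
 * Structured (basis path) testing therefore requires two independent
 * tests, one for each decision outcome, e.g. x = -5 and x = 5.       */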

While these types of metrics are important, they should not be relied upon totally.

This example, taken from Goodenough and Gerhart (via Dunn's Software Defect Removal), shows why:

if( ((x + y + z)/3) == x )
    printf( "X, Y and Z are equal in value\n" );
else
    printf( "X, Y and Z are NOT equal in value\n" );

Neither counting lines of code/statements nor branch testing shows the flaw in the logic of the program(mer).
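To make that concrete, here is a sketch of the fragment wrapped in a small test driver (the helper name check and the particular input values are made up for illustration). The first two calls already exercise every statement and both branches, yet the fault only shows up with data such as the third call, where the average happens to equal x even though the three values differ.

/* Demonstration of why statement and branch coverage miss the fault:
 * both branches can be exercised with inputs that behave correctly,
 * yet the predicate itself is wrong, because equality of the average
 * with x does not imply that x, y and z are all equal.               */
#include <stdio.h>

static void check(int x, int y, int z)
{
    if (((x + y + z) / 3) == x)
        printf("x=%d y=%d z=%d : reported EQUAL\n", x, y, z);
    else
        printf("x=%d y=%d z=%d : reported NOT equal\n", x, y, z);
}

int main(void)
{
    check(2, 2, 2);   /* true branch,  correct result                 */
    check(2, 5, 9);   /* false branch, correct result                 */
    /* 100% statement and branch coverage already achieved, but:      */
    check(2, 1, 3);   /* average is 2, so "EQUAL" is reported even
                         though the three values clearly differ       */
    return 0;
}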


Tim Littlefair has developed a "fuzzy" C++ parser for computing software metrics. The source code for his C++ parser is available from SourceForge.


'If you cannot MEASURE it, you cannot IMPROVE it'.
- Lord Kelvin, IEC's first President (1906).





If you think that I could be of some assistance to you or your organization, let me know.
American Society for Quality Certified Software Quality Engineer.



