Determining C Complexity

Dave Straker daves at hpopd.HP.COM
Mon Aug 6 06:08:36 AEST 1990


henry at zoo.toronto.edu (Henry Spencer) / 10:45 pm  Aug  4, 1990 / writes:

>In article <7990015 at hpopd.HP.COM> daves at hpopd.HP.COM (Dave Straker) writes:
>>"You can't control what you can't measure"
>
>So what, precisely, are code metrics measuring?
>
>Not code quality, as seen by the customers.  They care about whether it
>works, whether it's fast and small, and whether it will continue to work
>after maintenance.

They'd probably prefer it worked *without* maintenance. I agree wholeheartedly
with starting and ending with the customer and his needs (HP has a strong
focus here), and the most important measures are those that the customer
uses. However, he *does* measure code, possibly not in a rigorous, quantitative
manner, but in a way that says 'this is a good program' (or otherwise).
The most obvious quantitative measure he does make is in defects, which are
eminently countable. We can also analyse them: How did they happen? Why did
they get to the customer? Can we improve our processes such that this would
not have happened?

>The connection between code metrics and any of these things is, at best,
>unverified conjecture.

Start with the customer's needs. Work backwards into your own processes.
Some of these will be related to code. Find the best measure you can
and use it. Close the loop! Make sure the measure and the action resulting
from its use actually do result in improved customer satisfaction.

>If you want to measure bugs, performance, and maintenance ease, there are
>better metrics.  Like number of bugs, timing figures, and maintenance
>man-hours.  Admittedly, there are a lot of variables involved, and it is
>hard work to measure these things well.  Running a program to determine
>the cyclomatic complexity of the code is much easier.  But the customers
>don't *care* about the cyclomatic complexity!

Yes, I suppose the basis of this discussion is McCabe. Nevertheless
his measure is a *tool*. It can be run over a lot of files quickly to
flag *potential* trouble spots. The final judgement must be human.

>Measure the things you care about, and forget the silly code metrics.

Measure the things the *customer* cares about, and forget *all* silly metrics,
code or otherwise. Take time to find the good metrics (possibly by trying
the silly ones first - you often don't know they're silly until you try them).

>The 486 is to a modern CPU as a Jules  | Henry Spencer at U of Toronto Zoology
>Verne reprint is to a modern SF novel. |  henry at zoo.toronto.edu   utzoo!henry

Dave Straker            Pinewood Information Systems Division (PWD not PISD)
[8-{)                   HPDESK: David Straker/HP1600/01
                        Unix:   daves at hpopd


