Determining C Complexity

John Baldwin johnb at srchtec.UUCP
Tue Jul 24 07:25:23 AEST 1990


In article <1050 at ashton.UUCP> tomr at ashton.UUCP (Tom Rombouts) writes:
>
>.......................... there is no real way to measure such things
>as intelligence of the overall design and architecture, elegance of
>algorithms, or quality of comments and documentation.  

True.  But you CAN quantify (statistically, not deterministically) such
things as "did the programmer, in general, decompose the design into
subroutines (functions, procedures, ad infinitum) of manageable size?"

Naturally, you shouldn't become the "code police" on the basis of those
quantifiers alone.  For instance, a certain routine may have to run on
a special stack of limited size; therefore it might be coded as an extremely
long procedure, with calls only to "primitive" subroutines, in order to
avoid exceeding stack depth.
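
By way of illustration (my own sketch, not anything from the article I'm
replying to), such a deliberately "flat" routine might look like this; the
register names and primitives are hypothetical:

    /* Hypothetical: an interrupt handler that must run on a small
     * dedicated stack.  Rather than a deep chain of helper calls,
     * each of which would add a stack frame, the body is one long
     * procedure calling only leaf-level "primitive" routines, so the
     * worst-case call depth stays at two frames.
     */
    #define RX_READY 0x01

    extern unsigned char read_status_reg(void);   /* leaf primitives, */
    extern unsigned char read_data_reg(void);     /* assumed supplied */
    extern void enqueue_rx(unsigned char c);      /* elsewhere        */

    void uart_interrupt(void)
    {
        unsigned char status;

        status = read_status_reg();
        while (status & RX_READY) {
            enqueue_rx(read_data_reg());
            status = read_status_reg();
        }
        /* ...further conditions handled inline, deliberately NOT
         * factored into intermediate routines... */
    }

A length- or complexity-based metric will flag this routine as it grows,
even though the length is exactly the point of the design.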

A good policy (in my experience) is to take complexity metrics (and the like)
with a grain of salt.  Say your metric rates some piece of code at a
McCabe complexity of 22, while the average is 8.2.  So you check the code;
on your own perusal, there's no obvious reason for the excess complexity.
You talk to Joe Programmer, who provides an extremely logical explanation
AND a reasonable defense for doing things this way (as opposed to....).
BTW, note that both are required: the explanation tells you why the code is
the way it is; the defense tells you why it should stay that way.
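
For anyone who hasn't run the numbers: one common way to count McCabe's
cyclomatic complexity for a single routine is the number of decision points
(if, while, for, case, &&, ||, ?:) plus one.  A toy example of mine:

    /* Toy function, for counting purposes only. */
    #define DIGIT 0
    #define BLANK 1
    #define OTHER 2

    int classify(int c)
    {
        if (c >= '0' && c <= '9')       /* if, && : 2 decisions */
            return DIGIT;
        if (c == ' ' || c == '\t')      /* if, || : 2 decisions */
            return BLANK;
        return OTHER;
    }
    /* 4 decisions + 1 = cyclomatic complexity of 5.  A routine that
     * scores 22 against a shop average of 8.2 stands out just as
     * plainly, which is all the metric is really good for. */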

Once all of this is settled, you good-naturedly remind Joe P. that he
should place a comment near the beginning of the source file containing
a synopsis of his explanation and defense of the design.  In parallel, you
jot "ok" next to this metric, with some kind of cross-reference.

-- 
John T. Baldwin            | Disclaimer:
search technology, inc.    |    Some people claim I never existed.
Norcross, Georgia          | (real .sig under construction
johnb at srchtec.uucp         |  at Starfleet Orbital Navy Yards ;-)


