Efficient coding considered harmful?

David Chase chase@Ozona.orc.olivetti.com
Thu Nov 3 16:54:32 AEST 1988


In article <1709@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
>It sounds as though someone's guessing what the data structure size should
>be, guesses wrong, and has to increase it. If data structure size keeps
>getting bumped bit by bit until the problem size is finally determined,
>then we've got an O(n) structure which has been copied O(n/increment)=O(n)
>times so that the cost of creating just this structure is O(n**2) so that
>the overall time is quadratic, at best.
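
The quoted analysis is right for a fixed increment (call it c): to
reach size n you copy

        c + 2c + 3c + ... + (n - c)  =  (n/c - 1) * n / 2  ~  n^2 / (2c)

bytes in total, which is quadratic, just as claimed.  Doubling changes
the picture.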

The first estimate is size k; it's too small, so we double it, and
double again, and so on, finally reaching size k*2^n.  How many bytes
have we copied?  We've copied

        k + k * 2 + ... + k * 2^(n-1) = k * (2^n - 1).

This is linear in the final size.  We also did n allocations, but n is
proportional to the log of the size (and bounded by 32 on most
machines that I use), so we don't worry about the time spent
allocating.
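
To make that concrete, here is a minimal C sketch of the doubling
strategy (the Buf type, buf_push, and the starting capacity of 16 are
my own illustration, not anything from the original thread):

    #include <stdlib.h>

    typedef struct {
        char   *data;
        size_t  len;    /* bytes in use */
        size_t  cap;    /* bytes allocated */
    } Buf;

    /* Append one byte, doubling the allocation when full.  Growing
       from cap to 2*cap copies cap bytes, so reaching a final size
       of N bytes copies fewer than N bytes in total. */
    int buf_push(Buf *b, char c)
    {
        if (b->len == b->cap) {
            size_t ncap = b->cap ? 2 * b->cap : 16;
            char *p = realloc(b->data, ncap);
            if (p == NULL)
                return -1;
            b->data = p;
            b->cap = ncap;
        }
        b->data[b->len++] = c;
        return 0;
    }

Start from Buf b = {0} and call buf_push in a loop; the total copying
stays linear in the final size, exactly as the sum above shows.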

"Efficient coding" is indeed harmful if you can't figure out the
complexity of your algorithms.

David


