Why 'C' does not need BCD...

Geoff Kuenning geoff at desint.UUCP
Tue Nov 27 18:57:34 AEST 1984


In article <166 at harvard.ARPA> breuel at harvard.ARPA (Thomas M. Breuel) writes:

>You don't 'have to convert to binary, add,
>and convert back to BCD in approximately 10 microseconds', since you
>don't store your data in BCD on the disk in the first place if you are
>working with binary numbers.

You shot down this argument yourself at the end of your article; see below.

>[A lengthy claim that human conversion is the important part, and that this
>is inherently slow because of the slowness of I/O devices, is omitted here.]

You are assuming that the job is running standalone.  Real computers have
some compute-bound jobs (even in business installations).  If another job
is eating CPU unnecessarily, that is CPU that is not available to others.

>[Besides, you're probably running UN*X.  Vax calls take a lot more than 10
>us, and UN*X makes lots of procedure calls and copies a lot.]

C does not run only on UN*X.  There are operating systems in the world that
are faster than UN*X, and there are machines that are faster than Vaxen.

>you would still have to do seeks ('double buffering'
>doesn't really help you), either every now-and-then

A seek once out of every 500 transfers (which a well-designed filesystem
can achieve) does not seriously impact performance.  And who says double
buffering doesn't help you?  Properly implemented double buffering most
assuredly does, and examples are easy to come by.
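To make it concrete, here is a rough sketch (my own, not anybody's production
code): start_read() and wait_read() stand in for whatever asynchronous read
interface your system actually gives you, and process() stands in for the real
work.  The point is simply that block N+1 is already on its way in from the
disk while block N is being chewed on.

    #define BUFSZ   4096

    void
    copy_loop(int fd)
    {
        static char buf[2][BUFSZ];  /* two buffers, used alternately */
        int cur = 0;
        long n;

        /* start_read()/wait_read() are HYPOTHETICAL async calls,
         * not a real OS interface; process() is whatever you do
         * with each block. */
        start_read(fd, buf[cur], BUFSZ);            /* prime the pump */
        while ((n = wait_read(fd)) > 0) {
            start_read(fd, buf[1 - cur], BUFSZ);    /* next block in flight... */
            process(buf[cur], n);                   /* ...while we work on this one */
            cur = 1 - cur;
        }
    }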

>The slowness of disk I/O is not related to transfer rates (1.8 Mbyte/s
>is almost as fast as dynamic ram chips), but to seek times and
>scheduling problems (DMA, processor interrupt service, ...).

So?  Ever try to swap out 1 MB on a 5-1/4" Winchester?  For that matter, ever
try to swap out 2 MB on an Eagle?  Even at the 1.8 Mbyte/s you quote, 2 MB
takes over a second just to transfer, and a 5-1/4" Winchester is nowhere near
that fast.  We are talking transfer times in SECONDS here, compared to seek
times in the 20-100ms range.

Further, many business programs do their I/O from tape drives, which have
no seek times at all.

>|                                               But I have never seen an
>|algorithm for converting binary to decimal without doing divisions.
>
>If you keep on reading, you'll find an algorithm that converts binary
>to BCD without division.

The algorithm is indeed given at the end of the article;  I am hardly
impressed.  For the literalists out there, let me revise the above claim to
"I have never seen an EFFICIENT algorithm...".  The algorithm you gave is
obviously less efficient than using division, since it must do one BCD add
per input bit, compared to roughly one division per decimal digit (three to
four bits of input) for a division algorithm.
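For the record, here is roughly what I mean by the two approaches (my own
sketch, not the article's code; it assumes a 32-bit unsigned long and a value
of at most eight decimal digits):

    /* Division method: one divide by ten per DECIMAL DIGIT, i.e. per
     * roughly three to four bits of input.  Result is packed BCD,
     * one digit per nibble, low digit in the low nibble. */
    unsigned long
    bin_to_bcd_div(unsigned long bin)
    {
        unsigned long bcd = 0;
        int shift = 0;

        do {
            bcd |= (bin % 10) << shift;     /* peel off one decimal digit */
            bin /= 10;
            shift += 4;
        } while (bin != 0);
        return bcd;
    }

    /* Add-3-and-shift method: one adjust-and-shift pass per INPUT BIT.
     * Same packed-BCD result; only correct for inputs below 100,000,000. */
    unsigned long
    bin_to_bcd_shift(unsigned long bin)
    {
        unsigned long bcd = 0;
        int i, j;

        for (i = 31; i >= 0; i--) {
            for (j = 0; j < 28; j += 4)     /* adjust any digit >= 5 */
                if (((bcd >> j) & 0xF) >= 5)
                    bcd += 3UL << j;
            bcd = (bcd << 1) | ((bin >> i) & 1);    /* shift in next bit */
        }
        return bcd;
    }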

>You can do your arithmetic with 32 bits and store data as 24 bits
>on disk (that's what UN*X used to do on PDP-11's).

This is the "shooting down" of your argument that I mentioned above.  As
soon as you have to take 24-bit numbers, convert them to 32 bits, and
convert them back to 24 for output, you have lost any possible speed
advantage that binary might have given you.  6-digit BCD is much faster.
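To make the objection concrete: with 24-bit fields on disk and 32-bit
arithmetic in core, every field goes through something like the following on
the way in and again on the way out (my own sketch, assuming three bytes per
field stored low byte first; the names are mine):

    /* Widen a 3-byte field to a long for arithmetic. */
    long
    unpack24(unsigned char *p)
    {
        long v = (long)p[0] | ((long)p[1] << 8) | ((long)p[2] << 16);

        if (v & 0x800000L)      /* sign bit of the 24-bit field */
            v -= 0x1000000L;    /* sign-extend */
        return v;
    }

    /* Narrow it back to 3 bytes for output. */
    void
    pack24(long v, unsigned char *p)
    {
        p[0] = v & 0xFF;
        p[1] = (v >> 8) & 0xFF;
        p[2] = (v >> 16) & 0xFF;
    }

Multiply that by every field of every record and the "free" space saving is
not free at all.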

>[Summary:  binary is always faster, and the world is I/O limited in any case]

24-bit binary is not faster.  And if the world is I/O limited in any case,
why are you getting down on the business people for choosing a data
representation that is conceptually convenient?  How many UNIX programs
laboriously convert a number to ASCII only to pipe into a program that will
immediately crack it into binary?  (S package?  What S package?).  I am
getting awfully tired of this attitude that "my favorite data
representation is the only acceptable one".

There seems to be a common misconception among "serious" computer types
that 100% of business programmers are cretins.  Sorry, folks.  There are a
lot of extremely high-powered brains out there who have chosen to attack
the business world, and they are applying themselves to problems of a
magnitude (in terms of number of data items processed) that you and I
hardly ever have to deal with.  If they can come up with a better or faster
way to run the payroll, it produces a *big* savings for the company.  If
binary were really better, you can count on it that it would be in use.  (It
is in use, actually, because there are applications where it is better.)
-- 

	Geoff Kuenning
	...!ihnp4!trwrb!desint!geoff


