C needs BCD -- why BCD? Accountants!

Geoff Kuenning geoff@desint.UUCP
Sat Nov 24 16:24:42 AEST 1984


In article <161@harvard.ARPA> breuel@harvard.ARPA (Thomas M. Breuel) writes:

>As soon as I/O gets involved, it hardly matters which one is faster
>computationally. BTW, you can do conversions using shifts and (sorry
>to say that) BCD arithmetic... but you can write that in assembly
>language.

A Fujitsu Eagle transfers data at about 1.8 megabytes per second.  Many
mainframe disks are even faster.  If you are storing 6-byte (12-digit) BCD
values and adding two streams, each operation moves 18 bytes: two 6-byte
operands read plus one 6-byte sum written.  Assuming you double-buffer but
have only one controller, that is 100,000 operations per second.  You thus
have to convert to binary, add, and convert back to BCD in approximately
10 microseconds per operation.
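
Spelling the arithmetic out (every figure here is copied from above;
nothing is measured):

    #include <stdio.h>

    int main(void)
    {
        double rate = 1.8e6;        /* Eagle transfer rate, bytes/sec  */
        double per_op = 3 * 6;      /* 2 operands read + 1 sum written */
        double ops = rate / per_op; /* = 100,000 operations/sec        */

        printf("%.0f ops/sec, %.1f usec per op\n", ops, 1.0e6 / ops);
        return 0;
    }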

I am perfectly aware that multiplications can be done with shifts and adds.
Indeed, that's how hardware multipliers work.  But I have never seen an
algorithm for converting binary to decimal that does not do divisions.  It
is pretty hard to do 12 digits' worth of binary-to-decimal conversion in 10
microseconds on most machines (though possible on a modern
mainframe--right, Gordon?), while a 12-digit BCD add can be quite cheap;
both are sketched below.  And the time budget is actually tighter than
that, because a mainframe that powerful typically has several controllers,
so the full 1.8-megabyte-per-second output rate could be sustained if the
CPU could keep up with it.
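
To make the comparison concrete, here is a rough C sketch of the two
operations.  The representation (12 packed digits, low byte first) and
the names are mine, purely for illustration; a machine with decimal
hardware does the add in a single instruction (the 370's AP, for
example).  Note the divide per digit in the conversion loop; that is
what eats the 10-microsecond budget:

    typedef unsigned char bcd12[6]; /* 12 packed BCD digits, low byte first */

    /* Binary to decimal: a divide/modulo pair per digit, twelve of them.
     * A full 12-digit value does not fit in 32 bits, hence long long. */
    void bin_to_bcd(bcd12 out, unsigned long long n)
    {
        int i, lo, hi;

        for (i = 0; i < 6; i++) {
            lo = (int)(n % 10);  n /= 10;
            hi = (int)(n % 10);  n /= 10;
            out[i] = (unsigned char)((hi << 4) | lo);
        }
    }

    /* The cheap direction: digit-serial packed-BCD add, no divisions.
     * Carry out of the top digit (12-digit overflow) is dropped. */
    void bcd_add(bcd12 sum, const bcd12 a, const bcd12 b)
    {
        int i, lo, hi, carry = 0;

        for (i = 0; i < 6; i++) {
            lo = (a[i] & 0x0f) + (b[i] & 0x0f) + carry;
            carry = lo > 9;
            if (carry) lo -= 10;
            hi = (a[i] >> 4) + (b[i] >> 4) + carry;
            carry = hi > 9;
            if (carry) hi -= 10;
            sum[i] = (unsigned char)((hi << 4) | lo);
        }
    }

The add touches each byte once; the conversions cost a dozen divides in
each direction.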
-- 

	Geoff Kuenning
	...!ihnp4!trwrb!desint!geoff


