CYBER word length

Dick Dunn rcd at ico.UUCP
Sat Nov 15 11:10:21 AEST 1986


> >The Cyber word length was selected to be 60 bits because of the number of 
> >exact divisors it has : 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
> 
> That's a great myth.  Almost believable...

Come on...it's neither myth nor the real reason.  Don't you think that
there are more considerations to word size than the number of factors???
(I know of a machine that had a 29-bit word--so there!:-)  There are about
eleventeen things to consider (see below).  Maybe this whole discussion
should have been in comp.arch...
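
For what it's worth, the divisor list quoted above does at least check out
arithmetically.  A quick (and obviously anachronistic) Python sketch, with
`exact_divisors` being a name I made up for illustration:

```python
def exact_divisors(n):
    """Return the divisors of n strictly between 1 and n."""
    return [d for d in range(2, n) if n % d == 0]

# 60 divides evenly ten different ways...
print(exact_divisors(60))   # [2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
# ...while that 29-bit word divides evenly none at all.
print(exact_divisors(29))   # []
```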

>...But isn't it true that the Cyber
> word length was set at 60 bits to be compatible with the old CDC-6000 series?

The "Cyber" name was attached to the 6000 series later in its life.  The
move to the 64-bit word, 2's comp, and all that came along much later, but
the Cyber 7x machines were essentially the same hardware as the 6y00's
(y!=x, of course:-)

> Isn't the operative concern here to be a multiple of 6?

Sort of.  6-bit characters were a concern.  Another concern made it
desirable to have wordsize a multiple of 12.

>...See, when the 6000
> was born, it was a successor of the CDC-3300 series, which used 36 bit words

Flamers live for postings like this.  The CDC 3300 (and 3200...) - the
"lower 3000" series machines - were fairly slow machines and not really
intended for scientific/numerical work.  The 6600 (the first of the 6000
line) was in a completely different market...it might be considered some
sort of a successor to the 3600/3800 machines.  Oh yeah, and look it up
before you post it--the lower 3000 machines were =>24 bit<= word size and
the upper 3000 were =>48 bit<=.  Both of these were somewhat unusual for
the day, but the 48-bit word was much more useful for single-precision
floating-point work than the 36-bit words popular in those days.

Other considerations on word size:  The way the machine was actually built,
the 6600 used a memory module which was 4K x 12 bit.  One of these modules
was enough for the memory of a peripheral processor (PPU, a little I/O guy,
of which there were 10 in the standard configuration).  Stack them 5 wide
and you get central memory; the memory was interleaved in 4K chunks (since
the processor design almost required at least 16 banks of memory for the
context switch instruction, but I digress:-)  So there's an argument for a
multiple of 12.  I think the core modules may have been used in other CDC
machines like the 3000's or the 8090, but I'm not sure on that.
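
The geometry above is easy to sanity-check.  A minimal sketch, assuming
exactly the numbers in the paragraph (4K x 12 modules, five wide, 4K-chunk
interleave across 16 banks); the names here are mine, not CDC's:

```python
MODULE_WORDS = 4096               # one core module: 4K words...
MODULE_BITS = 12                  # ...of 12 bits each (one PPU's memory)
CM_WIDTH = 5                      # stack five modules side by side
CM_WORD_BITS = CM_WIDTH * MODULE_BITS   # 60-bit central-memory word

def bank_of(address, banks=16):
    """With 4K-word chunk interleaving, which bank serves a CM address?"""
    return (address // MODULE_WORDS) % banks
```

So five 12-bit modules in parallel give the 60-bit word directly, which is
the argument for a multiple of 12.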

Also, remember that this is a RISC-style machine.  It wants 2 forms of
instructions--those which contain address-size constants and those which
don't.  The ones that don't are 3-address reg-to-reg.  If you work out the
numbers, it comes up that 15 and 30 bit instructions are nice.  For a
floating-point value with a healthy exponent and lots of precision (to try
to stay away from double precision as much as you can) you want at least
the 48 bits of the upper 3000's, maybe more.  60 is ok; 72 probably would
have been a bit much.  Before the days of byte-addressable 32-bit designs,
word size was a balancing act among a lot of factors.
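
To see why 15 and 30 come up nicely: a 60-bit word holds exactly four 15-bit
parcels, and a 30-bit instruction just takes two adjacent parcels.  A hedged
sketch of the two forms (6-bit opcode plus three 3-bit register fields for
the short form; 6-bit opcode, two 3-bit registers, and an 18-bit constant
for the long form); `pack15`/`pack30` are illustrative names of mine:

```python
def pack15(op, i, j, k):
    """Short form: 6-bit opcode + three 3-bit register fields = 15 bits."""
    assert op < 64 and i < 8 and j < 8 and k < 8
    return (op << 9) | (i << 6) | (j << 3) | k

def pack30(op, i, j, K):
    """Long form: 6-bit opcode, two 3-bit registers, 18-bit constant K."""
    assert op < 64 and i < 8 and j < 8 and K < (1 << 18)
    return (op << 24) | (i << 21) | (j << 18) | K

# Four short parcels fill a 60-bit word exactly.
assert 4 * 15 == 60
```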
-- 
Dick Dunn    {hao,nbires,cbosgd}!ico!rcd	(303)449-2870
   ...Relax...don't worry...have a homebrew.


