dd error?

Edwin Kremer edwin@ruuinf.UUCP
Mon Feb 13 18:52:41 AEST 1989


   In article <6@holston.UUCP>, barton@holston.UUCP (barton) writes:
   > When attempting to pipe tar output to dd for a
   > larger blocking factor with the line:
   > tar cf - . | dd obs=512k > /dev/rct0
   > I get an error from dd saying: argument 512k out of range.

I checked the dd source code on our Harris HCX-9; part of the relevant
comment is reproduced below:

   /* The BIG parameter is machine dependent.  It should be a long integer  */
   /* constant that can be used by the number parser to check the validity  */
   /* of numeric parameters.  On 16-bit machines, it should probably be     */
   /* the maximum unsigned integer, 0177777L.  On 32-bit machines where     */
   /* longs are the same size as ints, the maximum signed integer is more   */
   /* appropriate.  This value is 017777777777L.                            */
   #define BIG	017777777777L

The value you give for obs= is checked against BIG to see whether you're
asking for a reasonable output buffer size. On a 16-bit machine BIG is
0177777L (65535), and 512k works out to 524288 bytes, so dd concludes
you're not :-)
I guess you're on a 16-bit machine, sorry...
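
For the curious, here is a rough sketch in plain C of the kind of check
dd's number parser performs. This is NOT the actual dd source; the
function name number(), the suffixes handled (k, b, w), and the exact
error message wording are my own assumptions based on common dd
implementations:

   /* Sketch of a dd-style size-argument parser with a 16-bit BIG.      */
   /* NOT the real dd source; names and suffix handling are assumed.    */
   #include <stdio.h>
   #include <stdlib.h>

   #define BIG 0177777L              /* 16-bit machine: max unsigned int */

   static long number(const char *arg)
   {
       const char *p = arg;
       long val = 0;

       while (*p >= '0' && *p <= '9')          /* leading digits        */
           val = val * 10 + (*p++ - '0');

       switch (*p) {                           /* common dd suffixes    */
       case 'k': val *= 1024; break;           /* kilobytes             */
       case 'b': val *= 512;  break;           /* 512-byte blocks       */
       case 'w': val *= 2;    break;           /* words                 */
       }

       if (val <= 0 || val > BIG) {            /* the validity check    */
           fprintf(stderr, "argument %s out of range\n", arg);
           exit(1);
       }
       return val;
   }

   int main(void)
   {
       /* 512k = 524288, which exceeds BIG (65535) here, so this        */
       /* prints "argument 512k out of range" and exits -- exactly      */
       /* the behaviour reported above.                                 */
       printf("%ld\n", number("512k"));
       return 0;
   }

So on such a machine anything up to 0177777 (65535) bytes should get past
the check, while on a 32-bit machine with the BIG value shown in the
comment above, obs=512k would be accepted.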

						--[ Edwin ]--

-- 
Edwin Kremer, Department of Computer Science, University of Utrecht
Padualaan 14,  P.O. Box 80.089,  3508 TB  Utrecht,  The Netherlands
Phone: +31 - 30 - 534104        |  UUCP    : ...!hp4nl!ruuinf!edwin
    "I speak for myself ..."    |  INTERNET: edwin@cs.ruu.nl


