Arg list too long error? Here's a work-around.

Bernd Felsche bernie at DIALix.oz.au
Sat Nov 17 17:12:43 AEST 1990


In <1990Nov16.001140.11923 at druid.uucp> darcy at druid.uucp (D'Arcy J.M. Cain) writes:

>In article <1990Nov14.192707.1099 at millipore.com> Jeff Lampert writes:
>> [...]
>>The 'find' command doesn't seem to have the 'Arg list' limitation.  It also

'find' does have it, if you pass it too many arguments!  The command
quoted below only avoids the limit because quoting "SRW*" hides the
pattern from the shell's filename expansion.  Ask yourself what would
happen without the quotes (see the sketch after the quoted command).

>>find . -name "SRW*" -exec rm {} \;
>> [...]
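Without the quotes, the shell expands SRW* before find ever runs; a
sketch of the failure mode (the exact error text varies by system):

	find . -name SRW* -exec rm {} \;
	# with thousands of matches: the shell's exec of find fails
	# with "Arg list too long" -- the very error being avoided
	# with only a few matches: find is handed several filenames
	# after -name and rejects the malformed expression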

An unfortunate side-effect is that files with matching names in
subdirectories will also be removed.
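If only the current directory should be cleaned, -prune can stop the
descent; a sketch using the depth-one idiom (untested here, but -prune
is in most finds of this vintage):

	find . ! -name . -prune -name 'SRW*' -exec rm {} \;
	# every entry directly under . is pruned (not descended into),
	# so files in subdirectories are never examined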

The following method is slow and forks one rm per file, making you
very popular on a busy system.

>ls | grep '^SRW' | while read X
>do rm "$X"
>done
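The per-file forks can be batched away by feeding the same pipeline to
xargs; a sketch (like anything built on ls output, it misbehaves on
filenames containing whitespace):

	ls | grep '^SRW' | xargs rm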

We all know that the "best" way is to allow the shell to expand
filenames and echo them into xargs... right?  On most shells echo is
a built-in, so the expanded list never passes through exec() and never
hits the kernel's argument-space limit; xargs then carves it into rm
commands of a safe size.

viz:
	echo SRW* | xargs -n10 rm

The -n option defines the maximum number of arguments to pass to each
rm invocation.  Ten may still be too many on some older systems, or
where pathnames are very long... the -s option to xargs limits the
total size of each constructed command line instead, but then, most
users don't know what size to use...
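For example (470 is only an illustrative guess at a conservative size,
not a documented limit; check your system's NCARGS/ARG_MAX):

	echo SRW* | xargs -s470 rm
	# each constructed rm command line is kept under 470 bytes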

I also thought that the 'xargs' debate had finished... but then,
maybe nobody read the second-last entry in the reference manual.
-- 
 ________Bernd_Felsche__________bernie at DIALix.oz.au_____________
[ Phone: +61 9 419 2297		19 Coleman Road			]
[ TZ:	 UTC-8			Calista, Western Australia 6167	]


