Argument list too long. Workaround? 

Date:          Wed, Mar 1 2006 6:58 am
Groups:        comp.unix.shell
$ for i in `ls /media/cdrecorder5/subdir/*.zip`; do unzip $i; done

generated the following error message:

bash: /bin/ls: Argument list too long

According to ls | wc -l, this directory contains 3307 (!) files. I tried narrowing the selection by piping it through grep '^[a-f]', to no avail. I appear to be coming up against an internal limitation of bash and/or ls. Any suggestions on how to get around this?

Argument list too long. Workaround? 

> avail. I appear to be coming up against an internal limitation of bash
> and/or ls. Any suggestions on how to get around this?

It's not a limitation in bash OR ls, but rather in the kernel. The kernel imposes a maximum size (ARG_MAX) on what can be handed to a new process at exec time, and that covers not just the environment variables but also the command line arguments.
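
On most systems you can query that limit directly; a quick check (assuming a POSIX getconf, and for the second command a newer GNU xargs):

    # Print the kernel's limit, in bytes, on the combined size of the
    # argument list and environment passed to a new process.
    getconf ARG_MAX

    # Newer GNU xargs can also report how it will budget that space.
    xargs --show-limits </dev/null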

You can get around this by avoiding the wildcard; remember that in UNIX, wildcards are expanded by the shell first, and the results of the expansion are then placed on the command line of the subcommand (hence your very long command line).
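
You can see how big that expanded list actually is by handing the same wildcard to a shell builtin, which never goes through exec and so never hits the limit; for example:

    # echo is a bash builtin, so this expansion is never passed to a new
    # process; wc -c shows roughly how many bytes the expanded argument
    # list takes up.
    echo /media/cdrecorder5/subdir/*.zip | wc -c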

Try something like:

ls -1 /media/cdrecorder5/subdir |
  while read file; do
    case $file in
      *.zip) unzip "/media/cdrecorder5/subdir/$file" ;;
    esac
  done

Paul D. Smith <psm…@nortel.com>

Argument list too long. Workaround? 

>     ls -1 /media/cdrecorder5/subdir |
>       while read file; do
>         case $file in
>           *.zip) unzip "/media/cdrecorder5/subdir/$file" ;;
>         esac
>       done

That assumes that the filenames in subdir don't have any newline characters.

Also note that, by default, read splits its input to fill the variables whose names it is given. In that splitting, the blank characters contained in $IFS are stripped from the beginning and end of each line, and the line is then split into as many words as there are variable names provided (the backslash acts as an escape character for the separators and the newline).

To disable this, you need to turn off the backslash escape processing with read's "-r" option. To disable the stripping of leading blank characters, you need to remove those blank characters from IFS, so it's more like "while IFS= read -r".
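
Applied to the earlier loop, that gives something like this (still assuming the filenames contain no newlines):

    ls -1 /media/cdrecorder5/subdir |
      while IFS= read -r file; do
        # IFS= keeps leading/trailing blanks, -r keeps backslashes literal
        case $file in
          *.zip) unzip "/media/cdrecorder5/subdir/$file" ;;
        esac
      done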

Funnily enough, I had to do exactly the same thing just yesterday; I wrote it as:

for f (/media/sda1/**/*.(#i)zip(.D)) unzip $f

but that's because I use zsh as my interactive shell.

That's zsh's short form of the for loop. **/* recurses into subdirectories, (.) selects only regular files (omitting symlinks, directories…), (D) also includes dot files (.foo.zip), and (#i) toggles case-insensitive matching, because I also wanted to extract files named like "FOO.ZIP".
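
Spelled out in zsh's long form (roughly equivalent; note that the (#i) globbing flag needs the extendedglob option), that's:

    setopt extendedglob          # required for the (#i) case-insensitivity flag
    for f in /media/sda1/**/*.(#i)zip(.D); do
      unzip $f                   # no quotes needed: zsh doesn't word-split parameters
    done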

Stephane

Argument list too long. Workaround? 

> blank characters from IFS, so that's more "while IFS= read -r"

This is all very true, but on the other hand read -r is not as portable (Solaris /bin/sh doesn't support it, as one example).

It's a trade-off between correctness, simplicity, and portability. For a one-time situation like this it didn't seem worthwhile to make it ironclad. But, you're definitely correct that I should have made the shortcomings of that solution clear in my post.
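
If portability is the main concern, one way to sidestep read (and the wildcard) entirely is to let find run unzip itself; a rough sketch using the original path:

    # find generates the names itself, so there is no shell expansion and
    # no arg-list limit; -exec runs unzip once per matching file.
    find /media/cdrecorder5/subdir -name '*.zip' -exec unzip {} \;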

Paul D. Smith <psm…@nortel.com>

Argument list too long. Workaround? 

Try This:

cd /media/cdrecorder5/subdir/
ls | grep -i '\.zip$' | xargs -I {} unzip {}
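
With GNU find and xargs, a more robust variant that copes with blanks (and most other odd characters) in the filenames would be something like:

    # -print0 / -0 pass the names NUL-delimited, so whitespace in filenames
    # can't break the pipeline; -iname matches .zip, .ZIP, etc.
    find /media/cdrecorder5/subdir -maxdepth 1 -iname '*.zip' -print0 \
      | xargs -0 -n 1 unzip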

K