bug-fileutils

Re: mv argument error


From: Bob Proulx
Subject: Re: mv argument error
Date: Sat, 14 Oct 2000 15:12:33 -0600 (MDT)

> I tried to mv about 5000 files with mv, but it said 
> bash: /bin/mv: Argument list too long
> 
> Don't know whether it is the fault of bash or mv,
> but just wanted to tell you. I solved this problem
> with a for loop, but I think mv should be able to mv
> so many files.

The UNIX operating system traditionally has a fixed limit on the
combined amount of memory that can be used for a program's environment
and argument list.  You can use getconf to query that limit.  On my
Linux system (2.2.12) it is 128k.  On my HP-UX 11.0 system it is 2M.
It varies per operating system.  POSIX requires only 20k, which was
the traditional value for probably 20 years.  Newer operating system
releases usually increase it somewhat.

  getconf ARG_MAX
  131072
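
If GNU findutils is installed, xargs itself can also report the
argument-length limits it will honor.  This is a sketch using the
--show-limits option, a GNU extension; the exact output varies by
system:

```shell
# GNU extension: print the argument-length limits xargs will respect,
# including the environment's share of the ARG_MAX budget.
# Reading from /dev/null keeps xargs from running any command.
xargs --show-limits < /dev/null
```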

Note that your error message came from "bash", your shell command line
interpreter.  Its job is to expand command line wildcard characters
that match filenames.  It expands them before any program can see
them.  This behavior is therefore common to all programs on most
UNIX-like operating systems.  The expanded list cannot exceed the OS
limit of ARG_MAX, and if it tries to do so the error "Argument list
too long" is returned to the shell and the shell reports it to you.
This is not a bug in 'mv' or other utilities, nor is it a bug in
'bash' or any other shell.  It is an architectural limitation of
UNIX-like operating systems.  However, it is one that is easily worked
around using the supplied utilities.  Please review the documentation
on 'find' and 'xargs' for one possible combination of programs that
works well.  You might think about increasing that value, but I advise
against it.  Any limit, even a large one, is still a limit.  As long
as it exists it should be worked around for robust script operation.
On the command line most of us ignore it until we exceed it, at which
time we fall back to more robust methods.

Since I cannot deduce your exact move command, I will propose a
different example, using chmod, in which exceeding the ARG_MAX
argument length is avoided.

  find htdocs -name '*.html' -print0 | xargs -0 chmod a+r
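
The same pattern adapts to the original mv problem.  Here is a sketch;
the file pattern and 'archive' directory are made-up names, and
'mv -t' is a GNU coreutils extension that names the destination first
so that xargs can append each batch of source files after it:

```shell
# Move every *.log file in the current directory into archive/,
# batching the arguments so no single mv invocation exceeds ARG_MAX.
# -print0 and -0 keep filenames containing whitespace intact;
# -r (GNU) skips running mv entirely when find matches nothing.
mkdir -p archive
find . -maxdepth 1 -name '*.log' -print0 | xargs -0 -r mv -t archive/
```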

Bob


