bug-fileutils

Re: ls command


From: Richard Dawe
Subject: Re: ls command
Date: Mon, 07 Jun 2004 21:03:58 +0100
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.5) Gecko/20031031

Hello.

Rich wrote:
I have come across what seems to be a bug with the ls command.  It only
happens in directories with more than a few thousand entries.

For example, the directory I am testing has roughly 7,000 files.  We
tried using:

ls *_*
This produces a failure: "Too many arguments".

There are limits on how long a command line can be. The shell expands *_* into thousands of separate arguments before ls is even started, and the kernel rejects the exec() call when the argument list grows too large. 7,000 files is a fair number of files, so you are hitting those limits.
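On Linux you can query the limit yourself (a sketch; the exact value varies by system and kernel version):

```shell
# ARG_MAX is the maximum combined size, in bytes, of the argument
# list and environment that exec() will accept.
getconf ARG_MAX

# GNU xargs can also report the limits it will actually work within:
xargs --show-limits < /dev/null
```

With ~4,300 matching filenames, the expanded *_* easily exceeds that byte limit, which is why the shell (not ls itself) reports the failure.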

These all work fine:
ls
ls -altr
ls -1

I tried other commands like find, tail, and grep. All work fine.
        find . -name "*_*" -exec ls -l {} \;
        tail *_*
        grep "DATA ERROR" *_*

BTW, the *_* produces ~4,300 files out of 7,000.

Not sure if others have seen this type of failure.

Yes, I've seen it quite a few times.

You may find this command quicker than "find -exec":

  find . -name "*_*" -print0 | xargs --null ls -l

This also copes with spaces in the filenames, because it uses the NUL character as the separator.
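A quick way to see this in action (the directory and filenames here are hypothetical, just for illustration):

```shell
# Scratch directory with two matching files, one containing a space.
demo=$(mktemp -d)
cd "$demo"
touch "a_b" "c_d with space"

# Whitespace-splitting would mangle the second name, but with NUL
# as the separator each filename reaches ls intact:
find . -name "*_*" -print0 | xargs --null ls -l
```

Because xargs batches as many filenames as fit under the exec() limit per ls invocation, this stays fast even with thousands of matches, unlike "-exec ls -l {} \;" which runs ls once per file.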

Hope that helps, bye, Rich =]

--
Richard Dawe [ http://homepages.nildram.co.uk/~phekda/richdawe/ ]

"You can't evaluate a man by logic alone."
  -- McCoy, "I, Mudd", Star Trek

