Re: Memory limitation of 'find'
From: P Fudd
Subject: Re: Memory limitation of 'find'
Date: Thu, 30 Dec 2010 15:16:29 -0800
Hello again,
The latest version of find works.
Thanks!
On Thu, 16 Sep 2010 00:19:01 +0100
James Youngman <address@hidden> wrote:
> On Wed, Sep 15, 2010 at 1:56 AM, P Fudd <address@hidden> wrote:
> > Hello!
> >
> > Thanks for writing find, I use it daily.
> >
> > Recently, I've had to use find on unusual filesystems.
> > Specifically, filesystems containing directories with 35,000 files
> > in them, with each filename being 11 characters long.
> >
> > Example:
> > $ ls -f chromat_dir | wc -l
> > 35234
> > $ mkdir /tmp/foo; cp chromat_dir/* /tmp/foo
> > /bin/cp: Argument list too long.
> > $ find chromat_dir > /dev/null
> > find: chromat_dir: Cannot allocate memory
> > $
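[Editor's note: the "Argument list too long" error (E2BIG) above comes from the shell expanding `chromat_dir/*` into a single argument list that exceeds the kernel's ARG_MAX limit. A common workaround, sketched here with GNU findutils and coreutils and a small hypothetical demo directory in place of the 35,000-file original, is to have find emit the names and xargs batch them:]

```shell
# The shell's expansion of chromat_dir/* exceeded ARG_MAX; find + xargs
# avoids that by passing filenames to cp in batches that fit the limit.
# Demo uses a small throwaway directory; the real one held ~35,000 files.
mkdir -p /tmp/chromat_demo /tmp/foo
touch /tmp/chromat_demo/file1 /tmp/chromat_demo/file2
find /tmp/chromat_demo -maxdepth 1 -type f -print0 | xargs -0 cp -t /tmp/foo
ls /tmp/foo
```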
> >
> > Is there a way to remove the memory limitation? The computer has 16
> > gigs of ram; allocating 387k shouldn't cause it to choke like this.
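[Editor's note: the 387k figure above is simple arithmetic on the numbers in the report; a quick sanity check, not part of the original thread:]

```python
# 35,234 filenames of 11 characters each: the raw name bytes alone.
n_files = 35234
name_len = 11
total_bytes = n_files * name_len
print(total_bytes)  # 387574 bytes, i.e. roughly 387k of filename data
```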
>
> You are right. find should have no problem descending directory
> hierarchies hundreds of thousands of levels deep and containing tens
> or hundreds of millions of files.
>
> If this problem still exists, it needs to be fixed. First, please
> check with a recent version of findutils; the version you are using
> was released over five years ago. You can find up-to-date versions
> of findutils at ftp.gnu.org.
>
> If you try this with a recent version of find and discover there is
> still a problem, please investigate with some kind of system call
> tracer; I'd be interested in the operations leading up to and (less
> so) immediately following the ENOMEM error.
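[Editor's note: one way to capture that trace on Linux is strace, filtering to the directory-reading and memory-allocation calls. A sketch; `chromat_dir` is the directory from the report and will not exist on other machines, hence the tolerant error handling:]

```shell
# Trace the syscalls find makes: getdents64 reads directory entries,
# brk/mmap are where a failing allocation (ENOMEM) would surface.
# "|| true" keeps the sketch going even if find itself fails.
strace -o /tmp/find.trace -e trace=openat,getdents64,brk,mmap \
    find chromat_dir > /dev/null 2>&1 || true
grep ENOMEM /tmp/find.trace || echo "no ENOMEM in trace"
```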
>
> Thanks,
> James.