Re: The quickest way to not to descend into sub-directories once a file is found?
From: Stephane Chazelas
Subject: Re: The quickest way to not to descend into sub-directories once a file is found?
Date: Fri, 27 Sep 2019 08:49:49 +0100
User-agent: NeoMutt/20171215
2019-09-26 21:57:09 -0500, Peng Yu:
[...]
> I'd like to find files named `file.txt` recursively. But the directory
> structure has a property that if any directory has a file `file.txt`,
> any of its subdirectories will not have file.txt anymore.
>
> Therefore, the quickest way to `find` is to stop descending into
> subdirectories if their ancestor directories already have `file.txt`.
>
> But I don't see how to encode this rule with `find` (-prune doesn't
> seem to apply to this case).
[...]
-prune and -maxdepth are the only things that can stop find from
descending into a directory.
Here, you could do:
find . -type d -exec test -e '{}/file.txt' \; -prune -printf '%p/file.txt\n'
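For instance, on a small throw-away tree (all names made up for illustration) where a deeper file.txt sits below a directory that already has one, only the top-most ones are reported. This assumes GNU find for -printf:

```shell
# Build a sample tree (hypothetical names):
mkdir -p demo/a/sub demo/b
touch demo/a/file.txt demo/a/sub/file.txt demo/b/file.txt

# demo/a is pruned as soon as demo/a/file.txt is found, so
# demo/a/sub/file.txt is never reported.
find demo -type d -exec test -e '{}/file.txt' \; -prune -printf '%p/file.txt\n'
```

That prints demo/a/file.txt and demo/b/file.txt (in directory-read order), but not demo/a/sub/file.txt.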
But note that it involves forking a process and executing a test
command in it for each non-pruned directory, which may end up
being less efficient than simply traversing the directories that
have a file.txt in them.
Also note that it will fail to find file.txt files that are
broken symlinks or that are in non-searchable directories.
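The broken-symlink case at least can be covered by falling back to test -h, which detects a symlink without following it (the non-searchable-directory limitation remains). A sketch, still assuming GNU find:

```shell
# Also count a file.txt that is a symlink, even a dangling one:
# test -e follows symlinks, so check -h as a fallback.
find . -type d -exec sh -c '
  [ -e "$1/file.txt" ] || [ -h "$1/file.txt" ]' sh {} \; \
  -prune -printf '%p/file.txt\n'
```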
For this kind of advanced directory traversal rule, you may want
to use the tree-walking features of a scripting language, like
perl's File::Find module.
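A minimal File::Find sketch of the same pruning rule, wrapped in a shell function (the find_topmost name is made up; File::Find chdirs into each directory's parent, so the relative -e test works):

```shell
# Hypothetical helper: print the top-most file.txt under each
# given directory, pruning below the first match.
find_topmost() {
  perl -MFile::Find -e '
    find(
      sub {
        return unless -d;                    # only look at directories
        if (-e "$_/file.txt") {
          print "$File::Find::name/file.txt\n";
          $File::Find::prune = 1;            # do not descend any further
        }
      },
      @ARGV)' "$@"
}

find_topmost .
```

It does its own stat()s instead of forking a test per directory, and you can extend the wanted sub to handle symlinks or permission errors as you see fit.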
--
Stephane