Comment by MontyCarloHall 17 hours ago
Depends on how big the directory is. If it only contains a few files, I'd just enumerate them all with `find`, filter the results with `grep`, and perform the actual `grep` for "bar" using `xargs`:
find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar
(This one I could do from muscle memory.)

If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me swap `xargs` for `find`'s own `-exec` functionality:
find . -path '*/\.*' -prune -o -type f -name "*.foo" -exec grep bar {} +
(I had to look that one up.)
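Both variants can be sanity-checked in a throwaway tree (the directory and file names below are invented for the demo); each should report only the non-hidden match:

```shell
# Scratch tree to compare the two approaches (paths are made up for the demo).
tmp=$(mktemp -d)
mkdir -p "$tmp/src" "$tmp/.hidden"
printf 'bar\n' > "$tmp/src/a.foo"
printf 'bar\n' > "$tmp/.hidden/b.foo"   # under a hidden dir; should be skipped
printf 'baz\n' > "$tmp/src/c.foo"       # wrong content; no match

cd "$tmp" || exit 1
# Variant 1: filter hidden paths after enumeration.
v1=$(find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar)
# Variant 2: prune hidden paths during traversal.
v2=$(find . -path '*/\.*' -prune -o -type f -name "*.foo" -exec grep bar {} +)
echo "$v1"   # -> ./src/a.foo:bar
echo "$v2"   # -> ./src/a.foo:bar
cd / && rm -rf "$tmp"
```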
Thanks, yeah, this is a good example of why I prefer the simpler interfaces of `rg` and `fd`. Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.
For the record, I think `git grep` is probably the best built-in solution to the problem I gave, but personally I don't know off-hand how to restrict it to files matching a glob, or how to make it search from the current directory rather than the repository root (both of which are must-haves for me). I'd also need to learn the equivalent commands for source control systems besides git (I use one other VCS regularly).
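For reference, `git grep` does appear to cover both requirements, though the pathspec behavior is worth verifying against the git-grep manual: pathspecs are interpreted relative to the current directory, and the search is limited to the current subtree by default. A sketch in a throwaway repo (all names below are invented for the demo):

```shell
if command -v git >/dev/null 2>&1; then
    # Throwaway repo with one tracked *.foo file inside and one outside src/.
    tmp=$(mktemp -d)
    cd "$tmp" || exit 1
    git init -q .
    mkdir -p src
    printf 'bar\n' > src/a.foo
    printf 'bar\n' > top.foo
    git add .

    # From a subdirectory, git grep searches only that subtree by default,
    # and the pathspec after -- restricts matches to *.foo files.
    cd src
    git grep bar -- '*.foo'   # expect only a.foo, not ../top.foo
    cd / && rm -rf "$tmp"
fi
```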