
Subject: best practices for grep -f?
Date: Thu, 4 Dec 2008 06:55:01 -0800 (PST)

Recently, I needed to grep multiple files in a directory, using 28,212
matching patterns. I did this two ways. The first way, I put all 28,212
patterns in one file (say, filename) and used

grep -f filename files > results1

The second way, I split the one large file into 28 or so smaller files
(say, filename1, filename2, and so on) and used

grep -f filename1 files > results2 
grep -f filename2 files >> results2

...and so on, iterating 28 or so times over the same set of files.

By the time each process finished, the first approach (all 28,212 patterns
in one file) gave me a results file of 1.6MB. But the second approach
(28 files, 1,000 or so patterns per file, running grep once per pattern
file) gave me a 5.9MB file.

I wonder if this is enough to assume that grep has trouble with too many
patterns read in from a file? If that is so, then a safe limit would be
somewhere between 1,000 and 28,211. Is there a consensus out there as to
whether or not there's a point at which reading so many patterns from a
file shouldn't be done?
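For what it's worth, the size difference can be reproduced in miniature. The sketch below uses made-up filenames and patterns (patterns.txt, data.txt, chunk_*); the point is that a single grep invocation prints each matching input line at most once, while separate invocations over pattern chunks print a line once per chunk that matches it, so the iterated run can only be larger or equal:

```shell
#!/bin/sh
# Hypothetical stand-ins: patterns.txt holds one pattern per line,
# data.txt stands in for the files being searched.
printf 'alpha\nbeta\ngamma\n' > patterns.txt
printf 'alpha line\nbeta line\nalpha and beta on one line\n' > data.txt

# First way: all patterns in a single -f file, one pass.
grep -f patterns.txt data.txt > results1

# Second way: split the pattern file and run grep once per chunk
# (chunk_aa, chunk_ab, ... are split's default output names).
split -l 1 patterns.txt chunk_
: > results2                      # start from an empty results file
for f in chunk_*; do
    grep -f "$f" data.txt >> results2
done

# results1 has 3 lines; results2 has 4, because the line matching
# both alpha and beta is printed by two separate grep runs.
wc -l results1 results2
```

So at least part of a larger iterated-run output can be duplicate lines rather than extra matches; comparing `sort results1 | uniq` against `sort results2 | uniq` would show whether that accounts for the whole gap.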

Tony Zanella
Sent from the Gnu - Grep mailing list archive at Nabble.com.
